How do you price a legal chatbot?

Cover Image by <a href="https://pixabay.com/users/mohamed_hassan-5229782/?utm_source=link-attribution&utm_medium=referral&utm_campaign=image&utm_content=7325332">Mohamed Hassan</a> from <a href="https://pixabay.com//?utm_source=link-attribution&utm_medium=referral&utm_campaign=image&utm_content=7325332">Pixabay</a>

I used to think #chatbots were lame. You type stuff in and pray the “AI” understood what you said. If you already knew what you wanted, you wouldn’t ask nicely. Just give me a button or a dropdown!

With #GenAI like #ChatGPT, the odds of hitting the mark with your program have improved. I explored making a legal chatbot early on, but I hit a dead end when I couldn’t figure out what the use case was.

Even if I did figure out a use case, the next problem in taking it to the next level would be pricing. How do you charge someone for using your legal chatbot?

Luckily for me, somebody else appeared to have walked the same path as me.

A few weeks ago, I talked about “Scott”, a legal chatbot which performs retrieval-augmented generation over cases in Singapore. I tried it at an early stage. Suffice it to say, I am not its target audience, and I wondered aloud whether it was worth it.

I am still glad it graduated from “Beta”, and it now has a pricing model too.

A pricing chart for Scott, a litigation assistant that works like a Chatbot.

If you read past the marketing shtick, there is only one price for this chatbot: $720 for 6 months, or $120 per month. For comparison, if you don’t want fancy-schmancy AI, a single user would pay SAL $85 a month for a boolean search engine over roughly the same materials.

I found the idea of limiting a chatbot’s output very interesting. A simple illustration: tell the user to cough up more dough for the GPT-4 answer instead of the GPT-3.5 answer. The catch is getting a customer to understand how a GPT-4 answer differs from a GPT-3.5 answer. I haven’t seen anyone try this yet, although it seems technically doable.
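If anyone wanted to try that tiered-answer idea, the routing logic itself is trivial; the hard part is the pricing, not the code. A minimal sketch (the tier names, model names, and the stubbed-out API call are my own assumptions for illustration, not how Scott or any real product works):

```python
# Hypothetical sketch: route a query to a cheaper or better model
# depending on the user's subscription tier. Tier and model names
# here are illustrative assumptions, not from any real product.

TIER_MODELS = {
    "free": "gpt-3.5-turbo",  # assumed cheaper tier, lower-quality answers
    "premium": "gpt-4",       # assumed paid tier, higher-quality answers
}

def pick_model(tier: str) -> str:
    """Return the model a given subscription tier is entitled to."""
    try:
        return TIER_MODELS[tier]
    except KeyError:
        raise ValueError(f"unknown tier: {tier!r}")

def answer_query(query: str, tier: str) -> dict:
    """Stub: a real chatbot would call the chosen model's API here."""
    model = pick_model(tier)
    return {"model": model, "query": query}

print(answer_query("Is this clause enforceable?", "free")["model"])
```

The business problem remains unchanged: the code can route to either model in one line, but the user still has to believe the premium answer is worth the difference.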

As a lawyer, though, I am kinda hesitant about giving a “half” or “flawed” product to my users. If the answer does suck, would I be liable for giving it? This smells like litigation waiting around the corner.

Another issue on my mind is how to entice others to try it. Most people hardly use chatbots for legal information, let alone effectively enough to demonstrate their value. A “freemium” model is probably imperative in this regard. A 7-day “try it yourself” window is probably not enough time for most people to evaluate the product, though. Furthermore, if I need something specific from a chatbot, I am going to go in, get what I want and get out before paying.

For now, my thinking goes like this:

Given that skills are an important differentiating factor, they have to be built into the fundamental design of the chatbot.

Disclaimer: Obviously, I haven’t tried this out in the wild, and the idea of a legal chatbot is still mostly on the drawing board. So I want to keep working on this in my spare time.

Author Portrait Love.Law.Robots. – A blog by Ang Hou Fu