ChatGPT’s release sparked an explosion of interest in AI.
It has companies asking how to take advantage of AI – and, more specifically, of the Large Language Models that power tools like ChatGPT.
Assuming a company answers the “how” question, the next question is: can we?
More precisely: can we afford this?
It’s a valid question. (See this CNBC article from the weekend.)
Training AI and Large Language Models relies on data – massive amounts of it – and the more data used, the better the resulting insights. Why?
Because:
- More Data = Better Models
- Better Models = Better Insights
Here’s the catch.
The tools typically used to train AI and Large Language Models charge based on data volume.
So if the goal is to fine-tune a Large Language Model to your specific organization, or to retrain it as you gather newer, better data – and your costs are directly proportional to the data volume used – you may quickly reach a point where “good enough” AI insights are all you can afford.
Your priorities – training quality AI and controlling costs – are pulling in opposite directions.
Need a refresher on AI terminology? We’ve got you.
So what should you do?
How to Build a Cost-Effective AI Strategy
If your goal is to use AI for a Natural Language Processing task – such as analyzing thousands of customer interaction transcripts – you have a few choices to make.
One of your most basic requirements is a Large Language Model – the (artificial) intelligence powering any downstream tasks.
Then, estimate the cost associated with using the Large Language Model you’re considering.
The clearest example of this is Microsoft’s recent partnership with OpenAI, which has Microsoft offering companies access to GPT (the Large Language Model behind ChatGPT) for a fee.
And whether we’re talking about Microsoft/OpenAI or some other Large Language Model, that fee is based on consumption – measured in documents processed or compute hours.
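To see how consumption-based fees scale, here is a minimal back-of-the-envelope sketch. The per-document rate and corpus size are hypothetical placeholders, not any vendor’s actual pricing:

```python
# Back-of-the-envelope estimate of consumption-based LLM fees.
# The rates below are illustrative placeholders, not real vendor pricing.

def estimate_api_cost(num_documents: int, price_per_document: float) -> float:
    """Cost of a single pass over a corpus at a flat per-document rate."""
    return num_documents * price_per_document

def estimate_retraining_cost(num_documents: int, price_per_document: float,
                             retrain_runs: int) -> float:
    """Total cost when the same corpus is reprocessed on every retraining run."""
    return estimate_api_cost(num_documents, price_per_document) * retrain_runs

# Example: 100,000 transcripts at a hypothetical $0.002 per document,
# retrained monthly for a year.
single_pass = estimate_api_cost(100_000, 0.002)        # $200 per pass
annual = estimate_retraining_cost(100_000, 0.002, 12)  # $2,400 per year
```

The key point: because the fee is tied to volume, every additional document – and every retraining run over the same documents – adds linearly to the bill.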
“When I talk to my AI friends at the startup conferences, this is what I tell them: Do not solely depend on OpenAI, ChatGPT or any other large language models,” said Suman Kanuganti, founder of personal.ai, a chatbot currently in beta mode. “Because businesses shift, they are all owned by big tech companies, right? If they cut access, you’re gone.”
Your second consideration is the mechanism you use to fine-tune your model. The Large Language Model you’ve chosen as a starting point knows the vast content of the internet. It does not, however, know the content unique to your organization (think: emails, surveys, conversations, forms). You need to fine-tune it yourself.
To fine-tune, you’ll probably consider a solution like Amazon SageMaker or Google AutoML.
These solutions let your developers take on AI tasks without necessarily being AI specialists.
But they charge similarly to Large Language Models – based on documents processed or compute hours.
You can see how these costs balloon: eventually, the investment required to create the best model outpaces the value you hope to realize (saved time, reduced churn, increased revenue, or improved productivity).
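The dynamic above can be sketched numerically. This toy model assumes each retraining pass costs a fixed amount while delivering diminishing returns (every figure here is hypothetical, chosen only to illustrate the shape of the curve):

```python
# Hedged sketch: find the pass where cumulative training cost overtakes
# cumulative value, assuming diminishing returns on each retrain.
# All dollar figures and the decay rate are hypothetical.

def breakeven_pass(cost_per_pass: float, value_first_pass: float,
                   value_decay: float = 0.5, max_passes: int = 1000):
    """Return the first training pass at which cumulative cost
    exceeds cumulative value, or None if it never does."""
    cost = value = 0.0
    gain = value_first_pass
    for n in range(1, max_passes + 1):
        cost += cost_per_pass
        value += gain
        gain *= value_decay  # each extra pass adds less value than the last
        if cost > value:
            return n
    return None

# Example: each pass costs $500; the first pass is worth $2,000,
# and each subsequent pass adds half the value of the one before.
# Under these assumptions, cost overtakes value on pass 8.
```

The exact numbers don’t matter; the shape does. Fixed per-pass fees grow linearly forever, while the marginal value of each retrain shrinks – so a crossover point is guaranteed.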
AI That Pays for Itself Requires a Different Plan
Recognizing that companies need the best insights to make the investment worthwhile, and that those insights can’t cost more than the value they deliver, Pienso thinks differently.
Pienso is architected differently.
Pienso is priced differently.
Stop Paying for Large Language Models
Rather than tapping into a pay-as-you-go Large Language Model via API and having to pass along those fees to our customers, Pienso hosts the Large Language Model in our platform.
Each instance of Pienso ships with Large Language Models embedded: open source versions of the leading models, such as GPT and BERT, built natively into the platform itself – with no API costs to incur or pass along.
UI for AI – That Costs Less and Does More
Rather than providing a User Interface (UI) that disguises a third-party pay-as-you-go AI tool and charging you for its usage, Pienso is the AI and the UI in one.
Every component of the platform is purpose-built for Natural Language Processing – by Pienso – for Pienso customers. That gives us the flexibility to offer pricing that is not tied to the way traditional AI tools charge users.
Pricing Should Incentivize Experimentation
With the freedom afforded by Pienso’s architecture and IP, how do we price our AI capabilities?
We price based on outcomes – what you care about – not the process required to get there.
Interested in having a conversation? Get in touch at dan@pienso.com.
The post AI Should Pay for Itself. Until It Can, What’s the Point? appeared first on Pienso.