Just a few graphics that confirm what you probably already believe: direct human use of artificial intelligence (as opposed to indirect use, where AI aids some other process behind the scenes) keeps growing.
source: Seeking Alpha, edge-ai-vision
For most consumers, the most common direct use is the AI chatbot.
source: Business Insider, Seeking Alpha
As is always the case, there will be many more startups than are sustainable long term, and bigger apps will tend to acquire smaller ones, leading to some consolidation.
And, compared to internet apps, which in many cases were relatively affordable to create, AI models are prodigiously expensive to build. That will ultimately favor deep-pocketed firms with access to lots of capital.
And, generally speaking, the more capable language models become, the more money it takes to train them. As we move toward agentic AI, though, more of the cost should shift to inference and "action on inference" operations.
On the other hand, the cost of running AI inference keeps dropping, which makes each use cheaper and further fuels adoption.