Friday, July 7, 2017

How Do You Measure the Size of the Artificial Intelligence Market?

It is very hard at the moment to define with much precision the “size of the artificial intelligence market,” for some of the same reasons it is hard to quantify the size of “e-commerce,” “mobile payments” or “banking.”

Are we measuring the value of transactions, the incremental revenue generated by new marketplaces, the value of new software, hardware or services to create, install or support such transactions?

Depending on how we approach quantifying “the market,” it is possible to come up with big, but almost useless, numbers.

With artificial intelligence, the cleanest way to measure might be the sales value of the hardware and software that create the AI capabilities. But that is a problem as well. Most AI value is derived not by “selling” the capability, but by using the capability to support something else that actually represents a current line of business activity.

So AI adds value by boosting the productivity of some other process or product. AI is not so much a “big new product” as a feature of a cloud computing service, analytics capability or an enabler of a product’s features (voice recognition for home computing appliances).

In other words, if AI, or machine learning, is a feature of a computing operation or process, most of the actual value will be embedded in some other product, and will remain difficult to measure. The direct, incremental and genuinely “new” markets are mostly software products that provide AI functions, and that is a difficult figure to construct.

Indeed, the AI “outcomes” are mostly of an intangible nature: answers to questions such as “What is happening?” and “What should I do?”

At the moment, most of the actual AI activity consists of acquisitions of small software firms able to supply the AI functions.
