It is not easy to figure out what the next generation of computing will look like, or who will lead it.
What is clear is that computing moves through eras defined by the archetypal machines of each era: mainframes, followed by minicomputers, then personal computers (first standalone, then connected to local area networks, then to the internet), then mobile devices. Now we are nearing an era in which machine-to-machine applications or connected consumer devices might become the defining machines.
It also is fair to note that a focus on the archetypal “devices” might miss the shift as seen through the applications, business models and purposes computing supports. In the mainframe era, computing supported enterprise business or large-organization purposes. In the minicomputer era, computing tools spread to smaller organizations and entities. In the PC era, small businesses and then consumers began using computing devices.
In what we might call the internet era, computing shifted away from enterprises and has largely been driven by consumer apps, new business models and roles, growing more pervasive, going mobile or untethered (ambient) and becoming increasingly embedded in consumer applications and everyday life.
Facebook and Google, for example, have become computing leaders whose revenue models are based on advertising. Amazon is a computing leader whose revenue model is based, in part, on retailing.
Also, computing has increasingly become remote, distributed and connected, as cloud computing shows: more “core” computing is handled on remote devices rather than being locally resident, as was the original pattern.
So far, no clear and universally accepted term defines the recent evolution of computing. In fact, it is becoming very hard to delineate clearly where computing ends and communications begins. Once upon a time, “computing” tallied money spent on “computers and software,” as well as services supporting users of computers.
These days, one has to talk about internet applications and activities, smartphones and “connected life” to understand how, why and where core computing happens.
But a next generation will come, and the way it arrives might make the terms we use to understand and track “computing” irrelevant. We certainly do not track an “electricity-using appliances” industry, but we do track electricity generation and delivery.
That might someday happen to “computing” as well. We will still track some firms as part of the computing business (data centers, semiconductors, enterprise and consumer app suppliers, support services). But large parts of the “computing” industry might be tracked in other categories, such as media or commerce.
It seems clear that a change in the devices that use computing offers hints, as does the increasingly distributed nature of computing, which implies that “communications” will underpin future computing in a pervasive and fundamental way. That is why “cloud computing” (now conducted in large and mega-scale data centers, and perhaps in the future also at edge locations) is of such interest to “communications” professionals.
It also seems clear that it is getting harder to define “computing” or “information technology” as precisely as once seemed possible.