Monday, November 18, 2019

Digital Realty Platform Aims for Zero Gravity

Digital Realty’s new PlatformDigital architecture essentially tries to ride two opposing trends: data gravity, which centralizes processing and storage, and new requirements for local processing at the edge that counteract data gravity.

The objective is a computing infrastructure matched to business needs irrespective of data center size, scale, location, configuration or ecosystem interconnections, Digital Realty says. 

In essence, the new platform attempts to manage data gravity by creating a compute fabric that uses centralized and decentralized computing as required. 

Gravity and dark energy might now be analogies for information technology forces: concentrating and attracting on one hand, repelling and distributing on the other.

Data gravity is the notion that data becomes harder to move as its scale and volume increase. The implication is that processing and apps move to where the data resides.

As the law of gravitation states that the attraction between objects is directly proportional to the product of their masses, so big data is said also to tend to attract applications and services.

But we might be overstating the data gravity argument: perhaps it is the availability of affordable processing or storage at scale that attracts the data, not vice versa. And just as gravity concentrates matter in the known universe, dark energy is seen as pushing the universe to expand.

At the same time, scale and performance requirements seem also to be migrating closer to the places where apps are used, at the edge, whether for reasons of end user experience (performance), the cost of moving data, security and governance, the cost of processing or scope effects (the ability to wring more value from the same data).

Some might call this a form of “anti-gravity,” “dark energy” or “zero gravity” at work, where processing happens not only at remote big data centers but also, importantly, locally: on the device, on the premises, in the metro area, distributing data stores and processing.

"Gartner predicts that by 2022, 60 percent of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers,” Digital Reality says. 

It remains to be seen how computing architecture evolves. In principle, either data gravity or zero gravity could dominate. In fact, some of us might argue data gravity is counterbalanced by the likely emergence of edge computing as a key trend.

Zero gravity might be said to be a scenario where processing happens so efficiently wherever it is needed that gravitational pull, whatever its cause, is nearly zero. In other words, processing and storage grow everywhere at once, at the edge and at the center.

A better way of imagining the architecture might be local versus remote, as, in principle, even a hyperscale data center sits at the network edge. We seem to be heading back toward a balance of remote and local, centralized and decentralized. Big data arguably pushes toward centralization, while micro or localized functions tend to create demand for edge and local processing.
