Friday, March 8, 2019

Bandwidth Charges Might Drive Edge Computing

Eventually, we might find that avoiding local access bottlenecks and high bandwidth bills is as big a driver of edge computing as the need for ultra-low-latency processing, even though low latency is almost always cited as the main reason for edge computing.

“If you need to analyze a large amount of data, your internet connection might not be able to cope with the data flow, and it would result in your inability to extract real value from data,” says Riccardo Di Blasio.

In other cases, WAN bandwidth charges might be the issue. “If you are an oil and gas company which is drilling in Angola and requires computing, today the alternatives are to either build your own data centers like in the 90s (with all the cost and scale limitations associated with that) or to use a cloud provider (where the nearest data center will probably be in the UAE or South Africa, at least 5,000 miles away) with enormous costs and pretty lousy SLAs,” says Di Blasio.

By 2025, almost 20 percent of data created will be real-time in nature, and so will need to be processed where it is created rather than sent to the core of the network, says B.S. Teh, Seagate SVP. That is the sort of use case that benefits from local processing at the edge, close to where the data actually is generated.

“We see edge as a big driver of growth,” he said. Some growing use cases, such as video surveillance, will benefit from edge processing, not always because latency is so important, but simply to avoid wide area network transport costs.

Ultra-low latency is the most-cited reason for using edge computing. But there are other drivers, including use cases where processing at the edge eliminates the need to move bulk data generated by bandwidth-intensive apps across the wide area network.

Wikibon compared the three-year management and processing costs of a cloud-only solution using AWS IoT services with those of an edge-plus-cloud solution, supporting cameras, security sensors, sensors on the wind turbines and access sensors for all employee physical access points at a remote wind farm.

At a distance of 200 miles between the wind farm and the cloud, and with an assumed 95 percent reduction in traffic from using edge computing capabilities, the total cost falls from about $81,000 to about $29,000 over three years. Edge-plus-cloud computing is roughly a third the cost of a cloud-only approach, Wikibon estimated.
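
To make that arithmetic concrete, here is a minimal sketch of the kind of comparison Wikibon describes. The per-gigabyte transfer price, monthly data volume, cloud processing rate and edge site cost below are illustrative assumptions, not Wikibon's actual inputs; only the 95 percent traffic reduction and the three-year horizon come from the study as summarized above.

import sys

# Rough sketch of a cloud-only vs. edge-plus-cloud cost comparison.
# All prices and data volumes are illustrative assumptions, not Wikibon's inputs.

MONTHS = 36                      # three-year horizon, as in the Wikibon comparison
TRAFFIC_REDUCTION = 0.95         # assumed share of raw data filtered out at the edge

# Hypothetical inputs for a remote wind farm.
RAW_DATA_GB_PER_MONTH = 2_000    # cameras + sensors, assumed aggregate volume
WAN_COST_PER_GB = 0.50           # assumed effective cost to move one GB over the WAN
CLOUD_PROCESSING_PER_GB = 0.25   # assumed cloud-side ingest/processing cost per GB
EDGE_SITE_COST_PER_MONTH = 400   # assumed on-site edge hardware/management cost


def cloud_only_cost() -> float:
    """Every gigabyte crosses the WAN and is processed in the cloud."""
    per_month = RAW_DATA_GB_PER_MONTH * (WAN_COST_PER_GB + CLOUD_PROCESSING_PER_GB)
    return per_month * MONTHS


def edge_plus_cloud_cost() -> float:
    """Edge nodes filter and aggregate locally; only a residue crosses the WAN."""
    residual_gb = RAW_DATA_GB_PER_MONTH * (1 - TRAFFIC_REDUCTION)
    per_month = (residual_gb * (WAN_COST_PER_GB + CLOUD_PROCESSING_PER_GB)
                 + EDGE_SITE_COST_PER_MONTH)
    return per_month * MONTHS


if __name__ == "__main__":
    cloud = cloud_only_cost()
    edge = edge_plus_cloud_cost()
    print(f"Cloud-only, 3 years:      ${cloud:,.0f}")
    print(f"Edge-plus-cloud, 3 years: ${edge:,.0f}")
    print(f"Edge cost as share of cloud-only: {edge / cloud:.0%}")
    sys.exit(0)

Under these made-up inputs the edge-plus-cloud path comes out at roughly a third of the cloud-only total, the same shape of result Wikibon reports, though the dollar figures depend entirely on the assumed prices and data volumes.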

