Thursday, June 26, 2008
Broadband "Over-supply" Problem
Cox Communications is doubling the download speed of its most popular residential Internet service (Preferred) in Northern Virginia and increasing the speed of its Premier service by thirty-three percent.
PowerBoost, which supplies temporary "burst" bandwidth for uploads, also has been added to the Preferred and Premier packages. PowerBoost for downloads has been available since 2007. This is the fifth consecutive year that Cox has enhanced the speed of its Internet services in Northern Virginia (credit Verizon's FiOS service for that).
Verizon has boosted FiOS downstream speeds to 50 Mbps, with 20 Mbps upstream, for its top package, available everywhere FiOS is sold.
Cox customers will get the speed increases automatically in July, without need for a call or technician visit.
The PowerBoost feature means users of the Preferred package will experience speeds up to 12.5 Mbps down/2.5 Mbps up. Premier customers can achieve 25 Mbps down/3.5 Mbps up.
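For a rough sense of what those rates mean in practice, here is a minimal back-of-the-envelope sketch; it assumes the advertised burst speeds are sustained for the whole transfer and ignores protocol overhead, so real-world times would be somewhat longer.

```python
# Back-of-the-envelope download times at the advertised PowerBoost burst rates.
# Assumes the burst rate holds for the whole transfer and ignores protocol
# overhead, so actual times would be somewhat longer.

def download_minutes(file_megabytes: float, speed_mbps: float) -> float:
    """Minutes to move a file of the given size at a given megabit-per-second rate."""
    file_megabits = file_megabytes * 8   # 1 byte = 8 bits
    return file_megabits / speed_mbps / 60

for plan, mbps in [("Preferred burst", 12.5), ("Premier burst", 25.0)]:
    # Example: a 700 MB video file
    print(f"{plan}: 700 MB in roughly {download_minutes(700, mbps):.1f} minutes")
```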
Policy advocates often complain about the U.S. "broadband problem." Sometimes they mean it isn't available, isn't fast enough or costs too much. The evidence suggests availability isn't a problem. Whether a service is "fast enough" is a matter of interpretation, but I don't see evidence of anything but increasing speeds, often for the same cost. "Price" likewise is a matter of interpretation.
With the exception of Japan and South Korea, though, cost per Mbps in the United States is quite comparable to nearly all other leading nations.
Complaining about broadband is a bit like similar observations we could easily have made about wireless penetration or use of text messaging, where U.S. users lagged way behind European users for quite some time. That "problem" doesn't exist anymore.
Neither will the "broadband" problem. Have there been issues with availability and speed? Yes. Are those problems in the process of resolution? Yes. Pointing to the existence of problems is fine. Ignoring clear evidence that problems rapidly are being fixed is either misinformed, intellectually dishonest or sloppy.
Some people like to say the definition of broadband is a problem, pointing to data collection that defines "broadband"--at minimum--as 200 kbps. That also is out of date: the FCC recently raised its minimum definition to 768 kbps. A couple of points are in order.
The only definition the global telecom industry ever has formally set was way back when ISDN was created. Broadband still formally is defined as any bit rate over "voice" rates of 64 kbps. So 128 kbps "traditionally" has been considered "broadband."
Markets have moving definitions. But you can hardly fault the FCC for initially setting a minimum standard that is in fact above the recognized global nomenclature. In recent practice, industry executives might have considered broadband to be 1.544 Mbps or above, while anything between 64 kbps and 1.544 Mbps is "wideband."
All that is meaningless. It will be even more meaningless when cable operators start branding some broadband speeds as "wideband," to suggest it is more bandwidth than "broadband." Markets may like that. But it doesn't change the only formal definition the global engineering community ever has embraced.
Also, "minimum" is one thing. "Maximum" or "mean" are other things. Megabit access now is the norm. Targets will continue to shift higher over time. Call it the broadband version of grade inflation. The minimum "passing" grade might be a "D." That doesn't mean people expect that to be the norm.
The United States once had a major "broadband" availability problem. It no longer does. There are places where "access" by wire remains a problem. Most of those places have satellite alternatives, though. And many places have fixed wireless access as well.
Honestly, most potential users have one or two wired networks to choose from, two satellite providers and two or three mobile providers. Many consumers soon will be able to choose from as many as five mobile broadband providers.
Under-supply won't be an issue for most people much longer. Over-supply is the looming problem.
Compute Remotely, Assemble Locally
There's an obvious relationship among cloud computing, "over the top" applications, open networks, open devices and the importance of application program interfaces.
The ability to compute, store data and execute code remotely means it is more affordable than ever for small developers and individuals to create applications that are immediately available to users anywhere. The existence of those applications "on the Web" makes the Web a more-powerful platform for bringing applications of any sort to market. That puts business pressure on walled garden business models of all sorts.
The existence of cloud computing also means software is becoming unbundled from hardware to a large extent. Not completely unbundled; not unbundled for every application or service. In fact, some apps require tight integration to execute with the greatest elegance. But the trend is more in the direction of how people use PCs than how they consume cable television.
The application explosion, built on open platforms and APIs, also means new applications can be built on the shoulders of existing apps and applets. Assembling apps begins to be a process akin to what one does with Legos, to oversimplify.
That also means apps more often are created globally, assembled locally. That has implications for browsers, networks and protocols. To assemble apps locally means a premium for rapid response. If assembled apps are to mimic the feel of locally-stored apps, response time is a crucial requirement. This requires more than big, fast pipes. It means browsers that are much faster than we have used in the past. It means a computing architecture that does not require so much traversing of wide area networks to grab app elements.
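As a rough illustration of that premium on response time, here is a minimal sketch comparing serial and parallel assembly of remotely hosted app elements; the component names and latencies below are invented stand-ins for wide-area round trips.

```python
# Minimal sketch: why local assembly of remotely hosted app elements puts a
# premium on parallelism and response time. Each "component" stands in for a
# remote API call; latencies are made-up illustrative values.
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_component(name: str, latency_s: float) -> str:
    time.sleep(latency_s)          # stand-in for a wide-area round trip
    return f"<{name} rendered>"

components = [("maps", 0.20), ("weather", 0.15), ("ads", 0.25), ("social", 0.10)]

# Serial assembly: total time is the sum of every round trip.
start = time.time()
page = [fetch_component(n, l) for n, l in components]
print(f"serial assembly:   {time.time() - start:.2f} s")

# Parallel assembly: total time approaches the slowest single round trip.
start = time.time()
with ThreadPoolExecutor() as pool:
    page = list(pool.map(lambda c: fetch_component(*c), components))
print(f"parallel assembly: {time.time() - start:.2f} s")
```

Parallel fetching only masks latency down to the slowest single round trip, which is why the faster browsers and shorter network paths mentioned above still matter.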
The issue is to answer a question: “How do I pair together one customer that’s CPU-intensive and another that’s IO-intensive and have the sum appear just like a single, well-performing application?”
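Purely as a toy illustration of that pairing question, a scheduler might co-locate CPU-heavy and I/O-heavy workloads so each host's two resources stay roughly balanced; the tenants and load figures below are invented.

```python
# Toy illustration of pairing CPU-heavy and I/O-heavy tenants on shared hosts
# so neither resource becomes the lone bottleneck. All figures are invented.

# (cpu_load, io_load) on an arbitrary 0-100 scale
tenants = {
    "video-transcode": (90, 10),   # CPU-intensive
    "log-archiver":    (15, 85),   # I/O-intensive
    "analytics":       (80, 20),
    "file-sync":       (20, 80),
}

# Greedy pairing: match the most CPU-hungry tenant with the most I/O-hungry one.
by_cpu = sorted(tenants, key=lambda t: tenants[t][0], reverse=True)
by_io  = sorted(tenants, key=lambda t: tenants[t][1], reverse=True)

paired, used = [], set()
for cpu_t in by_cpu:
    if cpu_t in used:
        continue
    partner = next(t for t in by_io if t not in used and t != cpu_t)
    used.update({cpu_t, partner})
    cpu_total = tenants[cpu_t][0] + tenants[partner][0]
    io_total = tenants[cpu_t][1] + tenants[partner][1]
    paired.append((cpu_t, partner, cpu_total, io_total))

for a, b, cpu, io in paired:
    print(f"host: {a} + {b} -> cpu {cpu}, io {io}")
```

Real schedulers weigh far more dimensions than this, but the basic idea is to complement, rather than stack, the resource each tenant stresses.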
There is lots of room for innovation here. And lots of distance to cover. But it's coming, even if most users only gradually are being exposed to use of remote and locally-assembled apps.
Wednesday, June 25, 2008
Where is the Network API?
At the recent "Rethinking Communications" conference, several panelists commented about the difficulty of creating anything like an application program interface to "the network." APIs are a common way to hide the details of any application or function from software developers. The idea is to compartmentalize functions enough that a developer doesn't have to know how everything works; only what is necessary to invoke some function or operation, or add some function.
Right now the problem is that the "network" is full of subsystems that aren't actually unified enough to present a single API to any third party developer. IP Multimedia Subsystem will help, and right now Session Initiation Protocol comes as close as anything to being an API, though the analogy is rough.
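As a purely hypothetical sketch of what hiding those details might look like (none of the class or method names below correspond to any real carrier interface), a "click-to-call" wrapper could expose a single verb while keeping the SIP signaling out of sight:

```python
# Hypothetical sketch of a network-facing API that hides signaling details.
# Nothing here is a real carrier interface; it only illustrates exposing a
# narrow surface (a click-to-call verb) instead of the underlying SIP machinery.

class NetworkAPI:
    def __init__(self, gateway: str):
        self.gateway = gateway      # the carrier-side demarcation point

    def click_to_call(self, caller: str, callee: str) -> str:
        """Ask the network to bridge two parties; returns a session id."""
        invite = self._build_sip_invite(caller, callee)
        # A real implementation would send this to the gateway and track state.
        return f"session-{abs(hash(invite)) % 100000}"

    def _build_sip_invite(self, caller: str, callee: str) -> str:
        # The third-party developer never sees or touches this detail.
        return (f"INVITE sip:{callee}@{self.gateway} SIP/2.0\r\n"
                f"From: <sip:{caller}@{self.gateway}>\r\n"
                f"To: <sip:{callee}@{self.gateway}>\r\n")

api = NetworkAPI("carrier.example.net")
print(api.click_to_call("alice", "bob"))
```

The narrower the exposed surface, the easier it is to manage the "give them an inch" problem described below.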
The other issue: programmers, almost by nature, will stress test the limits of any network demarcation a network wishes to expose. "Give them an inch; they'll take a mile," Trevor Baca, Jaduka VP, says.
That isn't likely to raise comfort levels on the carrier side. But some middle ground has to be reached if carriers are to benefit from skills third party developers can put to work.
Cox Ups Speed in Phoenix
Cox Communications is bumping up the speeds of its cable modem service in Phoenix. For customers with Cox's Preferred service, downloads will move from 7 Mbps to 9 Mbps, with upload speeds increasing from 512 kbps to 768 kbps.
For the Premier service, customers will get 15 Mbps downloads, with bursts of up to 20 Mbps, while uploads start at 1.5 Mbps and can burst up to 2 Mbps.
The additional speed comes at no additional cost. Qwest Communications is upping its digital subscriber line service to 12 Mbps for its lower-cost service and 20 Mbps for its higher-cost service.
Still, there are some who argue the United States is "falling behind" other nations, suffering from inadequate supply, high prices, slow speeds, or all of the above. One can argue about that.
One cannot argue the problem is not being addressed. Speeds keep climbing, for the same amount of money, everyplace telcos and cable compete with each other.
iPod Still Top Seller, Store Personnel Report
In a recent Tickermine survey of stores selling MP3 players in June 2008, including Best Buy, Radio Shack and Circuit City, the iPod was named the best-selling MP3 player by 82 percent of those polled.
Microsoft's Zune 80 GB was said to be the best seller by 12 percent of respondents. The SanDisk Sansa Clip 2GB was said to be the best seller by six percent of respondents.
Some 62 percent of respondents say a dedicated music player is a better choice than a music-capable phone, but 38 percent reported they preferred music-capable mobile phones because that means one less device in the pocket to contend with.
Dell to Become a Managed Service Provider
Dell plans to launch a managed services initiative aimed at allowing channel partners and Dell itself to remotely maintain and troubleshoot small business networks, servers and desktops. Two recent Dell acquisitions -- Everdream and Silverback Technologies -- will provide the foundation for Dell's SaaS push, says Joe Panettieri, editorial director of Nine Lives Media.
The managed services push is supposed to happen late in 2008 or early in 2009. Dell intends to become a Master Managed Service Provider (Master MSP), which means IT consulting firms will be able to leverage Dell's own network operation centers (NOCs) to manage customer networks.
When even hardware manufacturers become service providers, how long can service providers wait to become data specialists, to a greater or lesser degree?
Android Phone from T-Mobile: 4th Quarter
When you finally can buy an Android phone, it first will be available from T-Mobile, in all likelihood. T-Mobile is said to want a device to sell in the fourth quarter.