Cox Communications is doubling the download speed of its most popular residential Internet service (Preferred) in Northern Virginia and increasing the speed of its Premier service by thirty-three percent.
PowerBoost, which supplies temporary "burst" bandwidth for uploads, also has been added to the Preferred and Premier packages. PowerBoost for downloads has been available since 2007. This is the fifth consecutive year that Cox has enhanced the speed of its Internet services in Northern Virginia (credit Verizon's FiOS service for that).
Verizon has boosted FiOS downstream speeds to 50 Mbps, with 20 Mbps upstream, for its top package, available everywhere FiOS is sold.
Cox customers will get the speed increases automatically in July, without need for a call or technician visit.
The PowerBoost feature means users of the Preferred package will experience speeds up to 12.5 Mbps down/2.5 Mbps up. Premier customers can achieve 25 Mbps down/3.5 Mbps up.
Policy advocates often complain about the U.S. "broadband problem." Sometimes they mean broadband isn't available, isn't fast enough or costs too much. The evidence suggests availability isn't a problem. Whether a service is "fast enough" is a matter of interpretation, but I don't see evidence of anything but increasing speeds, often for the same cost. "Price" likewise is a matter of interpretation.
With the exception of Japan and South Korea, though, cost per Mbps in the United States is quite comparable to nearly all other leading nations.
Complaining about broadband is a bit like similar observations we could easily have made about wireless penetration or use of text messaging, where U.S. users lagged way behind European users for quite some time. That "problem" doesn't exist anymore.
Neither will the "broadband" problem. Have there been issues with availability and speed? Yes. Are those problems in the process of resolution? Yes. Pointing to the existence of problems is fine. Ignoring clear evidence that problems rapidly are being fixed is either misinformed, intellectually dishonest or sloppy.
Some people like to say the definition of broadband is a problem, pointing to data collection that defines "broadband"--at minimum--as 200 kbps. That, too, is wrong. The FCC recently changed its minimum definition to 768 kbps. A couple of points are worth making.
The only definition the global telecom industry ever has formally set was way back when ISDN was created. Broadband still formally is defined as any bit rate over "voice" rates of 64 kbps. So 128 kbps "traditionally" has been considered "broadband."
Markets have moving definitions. But you can hardly fault the FCC for initially setting a minimum standard that is in fact above the recognized global nomenclature. In recent practice, industry executives might have considered broadband to be 1.544 Mbps or above, while anything between 64 kbps and 1.544 Mbps is "wideband."
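That informal industry nomenclature can be sketched as a simple set of thresholds. This is an illustration of the scheme just described, not a standards-body definition (the formal ISDN-era usage would label anything above 64 kbps "broadband"); the function name is my own.

```python
def classify_access(kbps: float) -> str:
    """Label an access bit rate (in kbps) per the informal industry scheme:
    voice rates and below are "voice," anything up to the T1 rate of
    1.544 Mbps is "wideband," and anything faster is "broadband."
    """
    if kbps <= 64:
        return "voice"
    if kbps <= 1544:  # 1.544 Mbps = T1 rate
        return "wideband"
    return "broadband"
```

Under this scheme, a 128 kbps connection counts only as "wideband," even though the formal ISDN-era definition would call it "broadband," which is exactly why the labels conflict.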
All that is meaningless. It will be even more meaningless when cable operators start branding some broadband tiers as "wideband," to suggest they offer more bandwidth than "broadband." Markets may like that. But it doesn't change the only formal definition the global engineering community ever has embraced.
Also, "minimum" is one thing. "Maximum" or "mean" are other things. Megabit access now is the norm. Targets will continue to shift higher over time. Call it the broadband version of grade inflation. The minimum "passing" grade might be a "D." That doesn't mean people expect that to be the norm.
The United States once had a major "broadband" availability problem. It no longer does. There are places where "access" by wire remains a problem. Most of those places have satellite alternatives, though. And many places have fixed wireless access as well.
Honestly, most potential users have one or two wired networks to choose from, two satellite providers and two or three mobile providers. Many consumers soon will be able to choose from as many as five mobile broadband providers.
Under-supply won't be an issue for most, much longer. Over-supply is the looming problem.
Thursday, June 26, 2008
Broadband "Over-supply" Problem
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.