Wednesday, November 28, 2007
Verizon Wireless Takes Reasonable Gamble
One might argue that Verizon Wireless is gambling with its whole business model by allowing use of technically compliant devices and software on its network next year. But one can point to the experience of wireless operators in Europe, who have used this "open" model for years, to see it is not so dangerous.
In fact, Verizon gains more than it might potentially lose, just about any way you want to spin the matter. First off, it gets great press for breaking the "closed" mobile model on a voluntary basis. Also, it is betting, likely reasonably, that the overwhelming mass of buyers still will prefer the old model of "discounted phone, two-year contract."
Verizon also uses the CDMA platform, which already means less handset choice than is possible on a GSM network, since the global GSM market is so much larger. Verizon just might stimulate a bit more handset and software choice by going open.
Also, open is inevitable. The upcoming 700 MHz spectrum auction rules require such device and software openness, so it is coming to the market in any case. Verizon might as well "look good" rather than resist the inevitable.
Open also means Verizon has a shot at creating a more robust developer community, a helpful asset indeed as more innovation moves to the software realm.
There's very little, if any, downside and lots of upside. Not since AT&T launched its "Digital One Rate" plan has any leading mobile provider taken a step that will reverberate throughout the whole industry. Innovation is not just something small companies pull off; sometimes very large companies do it as well. On occasion, a major change may require the push only a dominant firm can supply. This appears to be such a case.
Labels: open networks, unlocked phones, Verizon Wireless
Tuesday, November 27, 2007
$2.4 Billion CLEC Decision Near
Sometime between now and Dec. 5, the Federal Communications Commission is slated to make decisions that could significantly raise wholesale access and transport tariffs in six markets: Boston, New York, Philadelphia, Pittsburgh, Providence and Virginia Beach.
If the rules are relaxed, customers can anticipate an additional $2.4 billion in charges for communications services, according to a study by QSI Consulting.
Basically, Verizon argues that market competition in each of the six markets is equivalent to that found in the Omaha, Neb. market, the benchmark used by the Federal Communications Commission to deregulate wholesale access rules and rates that have been favorable to competitors.
Up to this point, competitors in the six markets have been able to buy wholesale access and transport at rates below “retail” special access rates. Should Verizon prevail, it would be free to raise prices as it sees fit, with the likely result that wholesale rates would rise to just about what the retail special access rates are.
QSI estimates the increased telecommunications expenses incurred by customers for retail mass market, enterprise, and broadband access services at $1.054 billion, $747 million, and $565 million, respectively. This amounts to a rate increase of $114 annually for an average household, QSI says.
Users in New York would wind up paying as much as $1.4 billion extra; costs could rise $345 million in Philadelphia, $380 million in Boston, $104 million in Virginia Beach and $177 million in Pittsburgh.
In round figures, consumers would wind up paying as much as $1 billion more for services, enterprises $751 million and broadband access users $565 million.
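As a sanity check, the per-market figures (no Providence estimate is cited) and the per-category figures each sum to roughly the $2.4 billion headline number. A minimal sketch, using only the QSI figures quoted above:

```python
# Back-of-the-envelope check of the QSI figures cited above, in millions of dollars.
per_market = {
    "New York": 1400,
    "Boston": 380,
    "Philadelphia": 345,
    "Pittsburgh": 177,
    "Virginia Beach": 104,
}
per_category = {
    "retail mass market": 1054,
    "enterprise": 747,
    "broadband access": 565,
}

print(sum(per_market.values()))    # 2406 -- about $2.4 billion
print(sum(per_category.values()))  # 2366 -- also about $2.4 billion
```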
Opponents of the plan tend to think they have done what is needed to make the FCC commissioners aware of how woefully undeveloped access competition is in the six markets. But one never knows.
“The concern is that though the numbers are clear, there are media issues also on commissioner minds,” says Covad VP Angela Simpson. The danger is that the forbearance issue might wind up being a bargaining chip as commissioners grapple with the broader media deregulation issues.
Labels: Angela Simpson, CLEC, Covad, FCC, UNE
Metro Ethernet, Optical Access: Still Far to Go
In the enterprise high-capacity access markets, one has to distinguish between the financial and operating markets. Of late there has been renewed interest in the financial value of scarce optical assets, particularly in smaller markets.
But the allocation of new capital to the access business, while welcome, is not the same thing as the deployment of capital to support alternative optical access facilities in the places where most businesses are located, which is, simply, the larger markets.
There is no “silver bullet” in the optical access market; just determined, steady, slow progress in lighting new buildings with at least one fiber cable. To be sure, global carriers very much want to connect large enterprise locations with 1 Gbps to 10 Gbps optical connections.
Sometimes the problem is that such connections don’t exist; sometimes it is simply that sourcing such facilities is laborious because there are so many small providers in local markets. For a global carrier, the problem is the need to source very high-bandwidth access all over the world, easily.
In part, it’s a Layer One issue. In the U.S. market, for example, only 12 percent of business sites have fiber connectivity. Only 20 percent of North American cell sites have fiber connectivity.
That explains the continuing attraction of wireless and Ethernet-over-copper alternatives. To be sure, programs such as Verizon’s FiOS will solve those problems for consumers, and almost incidentally for many branch offices, small offices and smaller businesses.
In the second quarter, for example, Cogent Communications added 1,208 on-net connections, up 53.5 percent from the 787 added in the first quarter. In the third quarter Cogent added 30 buildings and expects to have added 100 on-net buildings by the end of the year.
The company expects to do so again in 2008, adding 100 new buildings to its network.
“As of September 30, 2007, we had 1,189 buildings directly connected to the network, representing over 520 million square feet of rentable office space, out of an addressable inventory in North America of about 6.2 billion square feet,” says Dave Schaeffer, Cogent CEO.
“We are currently utilizing a little bit less than 22 percent of the lit capacity in our network,” says Schaeffer, illustrating the issue nicely: fiber capacity isn’t the problem; fiber access to customers is.
At the end of June Time Warner Telecom had 7,884 buildings connected on its own facilities. At the end of September the company had 8,109 buildings on network, an increase of about 225 buildings, or about three percent. On an annual basis, on-network buildings increased about 19 percent.
RCN has something in excess of 800 buildings on network. Optimum Lightpath has about 2,500 buildings on network with fiber connections.
Nationwide, there are some 95,000 fiber-fed buildings, says GeoResults. Compounding the problem, much of the fiber access to lit buildings runs in a common cable sheath, no matter who the retailer of record is: for many desirable buildings, most of the suppliers actually use fiber in the same sheath.
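Those disclosures make the gradualism easy to quantify. A back-of-the-envelope sketch, using only the figures cited above:

```python
# Growth and penetration math implied by the carrier disclosures above.

# Cogent: quarterly on-net connection adds.
q1_adds, q2_adds = 787, 1208
print(round((q2_adds - q1_adds) / q1_adds * 100, 1))  # 53.5 percent growth

# Cogent: share of addressable North American office space.
print(round(520e6 / 6.2e9 * 100, 1))  # about 8.4 percent of rentable square feet

# Time Warner Telecom: quarterly on-net building growth.
june, september = 7884, 8109
print(september - june)                           # 225 buildings added
print(round((september - june) / june * 100, 1))  # about 2.9 percent in a quarter

# Shares of the roughly 95,000 fiber-fed U.S. buildings (GeoResults).
for name, buildings in [("Time Warner Telecom", 8109), ("Optimum Lightpath", 2500),
                        ("Cogent", 1189), ("RCN", 800)]:
    print(name, round(buildings / 95_000 * 100, 1), "percent")
```

Even the largest of these footprints is a single-digit share of the lit-building universe, which is the gradualism in a nutshell.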
There is progress. It simply is progress of the persistent, gradual sort.
The point is to separate the legitimate financial plays (rolling up and aggregating optical access assets in tertiary markets, as Zayo Bandwidth is doing) from the operating situation, which continues to be that optical connections to more buildings are the gating factor.
One would think optical connections to wireless towers are an obvious, slam dunk sort of opportunity. With broadband demands growing rapidly, and locations so easy to identify, replacement of copper-fed T1 or microwave connections, the typical solution these days, would seem to be a fairly easy business proposition.
There are perhaps 2.2 million wireless base station sites globally, including 250,000 in North America alone. Assume half those base stations use wireless backhaul, while the other half use leased T1s or optical connections.
The Chinese market is unusual in the sense that most of China Mobile’s base stations already are fiber connected. Observers tend to note that in Europe, the Middle East or Africa, it wouldn’t be unusual to find that 60 percent of connections use microwave technology, while 25 percent use optical connections and just 15 percent or so are based on copper E1 connections.
In the U.S. market, perhaps 10 percent to 20 percent of towers and other transmitting locations use fiber connections, accounting for 25,000 to 50,000 optical backhaul locations. And though microwave backhaul is popular in other markets, it rarely is used in the U.S. market.
That suggests as many as 225,000 wireless tower sites, or as few as 200,000, are fed by T1 connections over copper media. Depending on which carrier is involved, backhaul can represent 20 to 40 percent of recurring operating cost.
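The tower arithmetic behind those ranges is simple. A minimal sketch, using the estimates above:

```python
# Rough U.S. backhaul arithmetic from the estimates cited above.
na_sites = 250_000  # North American base station sites

fiber_low, fiber_high = 0.10, 0.20
fiber_sites_low = int(na_sites * fiber_low)    # 25,000 optical backhaul locations
fiber_sites_high = int(na_sites * fiber_high)  # 50,000 optical backhaul locations

# With microwave rare in the U.S., the remainder is essentially copper-fed T1.
t1_sites_high = na_sites - fiber_sites_low     # as many as 225,000 sites
t1_sites_low = na_sites - fiber_sites_high     # as few as 200,000 sites
print(fiber_sites_low, fiber_sites_high, t1_sites_low, t1_sites_high)
```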
Verizon and AT&T obviously are in a position to use their other assets to slice this cost of doing business, while Sprint Nextel and T-Mobile obviously face higher costs. But the fiber access opportunity isn’t necessarily contingent on replacing copper-fed T1s with optical facilities.
Indeed, voice works pretty well when the backhaul is based on T1 technology, so carriers might well not want to complicate their operations by moving all that traffic over to optical access. It might in fact make just as much sense, or more sense, to use the optical facilities for the rapidly-growing IP traffic demands, leaving T1 facilities in place for voice.
In other words, use the Time Division Multiplex network for voice traffic that is highly sensitive to latency, and use optical Ethernet for bursty data traffic. Of course, thinking is bound to change once any appreciable amount of usage and revenue is generated by video.
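Expressed as the sort of placement rule the discussion implies (a toy illustration, not any carrier's actual policy):

```python
# Illustrative backhaul-placement rule: keep latency-sensitive TDM voice
# on existing T1s, put bursty IP data on optical Ethernet. Video is the
# open question that is likely to change this calculus.
def backhaul_for(traffic: str) -> str:
    latency_sensitive = {"voice"}
    return "TDM/T1" if traffic in latency_sensitive else "optical Ethernet"

for t in ("voice", "web data", "video"):
    print(t, "->", backhaul_for(t))
```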
At some point, optical will be the best choice. The issue is when that will happen, and what the optimal choices are in the meantime. The point is that optical Ethernet, though the long-term answer, doesn’t cleanly address all the operational issues carriers think they face.
Encapsulating TDM traffic for Ethernet transmission, for example, worries carrier technologists for any number of reasons.
The bottom line is that optical Ethernet, and business optical access, continues to grow every quarter. It just isn’t the sort of transformation that can happen much faster, given the need to balance revenue from the first customer account with the cost to construct an optical lateral connecting that customer.
In the old days, when carriers were the primary customers, matters were simpler. One simply built out to carrier hotels, data centers and key central offices, knowing that most of the high-bandwidth termination demand would be at such locations. That isn’t so easy when the customer base primarily is enterprise customers.
Verizon Wireless Goes Open
In a historic move, Verizon Wireless says it will provide customers the option to use wireless devices, software and applications not offered by the company. Verizon Wireless plans to have this new choice available to customers throughout the country by the end of 2008.
In early 2008, the company will publish the technical standards the development community will need to design products to interface with the Verizon Wireless network. Any device that meets the minimum technical standard will be activated on the network. Devices will be tested and approved in a $20 million state-of-the-art testing lab which received an additional investment this year to gear up for the anticipated new demand. Any application the customer chooses will be allowed on these devices.
“This is a transformation point in the 20-year history of mass market wireless devices, one which we believe will set the table for the next level of innovation and growth,” says Lowell McAdam, Verizon Wireless president and CEO.
That isn't to say Verizon will stop bundling devices, plans and features, as it believes most consumers prefer to buy that way. Still, Verizon is bowing to the inevitable. Open wireless networks are coming.
One has to say that Google already is winning much of what it seeks: an open mobile Internet.
Labels: open networks, Verizon Wireless
New BlackBerry Consumer Phone
MultiMedia Intelligence projects worldwide unit shipments of multimedia feature-rich mobile phones will exceed 300 million units in 2008, outnumbering shipments of TV sets.
Multimedia phones have at least 1-megapixel image capture, MP3 audio, video playback, Java, USB, Bluetooth, 16-bit screen color, QVGA resolution, WAP and MMS. Revenue from these handsets will be over $76 billion.
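Since the definition is essentially a feature checklist, it is easy to express directly. A minimal sketch; the criteria are paraphrased from the MultiMedia Intelligence definition, and the sample handset spec is hypothetical:

```python
# The MultiMedia Intelligence "multimedia phone" definition as a checklist.
MULTIMEDIA_CRITERIA = {
    "camera_megapixels": lambda v: v >= 1.0,
    "audio_formats": lambda v: "MP3" in v,
    "video_playback": lambda v: v is True,
    "java": lambda v: v is True,
    "usb": lambda v: v is True,
    "bluetooth": lambda v: v is True,
    "color_depth_bits": lambda v: v >= 16,
    "resolution": lambda v: v == "QVGA",
    "wap": lambda v: v is True,
    "mms": lambda v: v is True,
}

def is_multimedia_phone(spec: dict) -> bool:
    """Return True if a handset spec meets every criterion."""
    return all(check(spec.get(key)) for key, check in MULTIMEDIA_CRITERIA.items())

# A hypothetical 2008-era handset, for illustration only.
handset = {
    "camera_megapixels": 2.0, "audio_formats": ["MP3", "AAC"],
    "video_playback": True, "java": True, "usb": True, "bluetooth": True,
    "color_depth_bits": 16, "resolution": "QVGA", "wap": True, "mms": True,
}
print(is_multimedia_phone(handset))  # True
```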
Numbers that large are a reason why Research in Motion will launch new consumer-focused devices in the first quarter of next year. The 9000 series is described by Carmi Levy, an analyst at AR Communications Inc., as "the future of the BlackBerry franchise," a complete break from the device's business roots. Instead, the new series targets the consumer space served by the Pearl and Curve models.
"The 9000 is supposed to be a touch-screen device, very similar in form factor to the iPhone," Levy says. "Which means that it is not an enterprise-friendly device."
The 9000 series will break from the traditional half-screen, half-keyboard look of the BlackBerry. The handsets will also incorporate an upgraded multimedia system, along with the standard push email capabilities.
Levy speculates that RIM will introduce the 9000 series in the first quarter of next year.
Among the updates will be "a Curve with WiFi," according to Levy. These devices may have other updates like GPS location tracking and higher resolution on-board cameras as well.
Labels: 9000 series, BlackBerry, feature phone, Multimedia Intelligence, Research in Motion, smart phone
GDrive: Cloud Computing
Google appears to be prepping a storage service that would let users store online essentially all of the files they might keep on their local hard drives, according to reporting by the Wall Street Journal. Users would gain mobility, remote backup and simple Web access to their information from virtually any broadband-connected device.
For Google, getting people to store data online makes it easier to get them to use productivity and other applications online. The possibly unanticipated impact is that enterprise computing architectures might change in this direction as well, as improbable as that may seem.
Cloud-based computing arguably is easier to manage and better adapted for supporting remote, traveling and dispersed workers, which is more the case every day.
Google is trying to let users upload and access files directly from their PC desktops, and to have the file storage behave for consumers more like another hard drive that is handy at all times, say people familiar with the matter.
Of course, one limitation of such an Internet-based storage service is offline access.
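To make the "behaves like another hard drive" idea concrete, here is a minimal one-way folder-sync sketch. The folder name and the upload call are hypothetical placeholders, not any published Google API:

```python
import hashlib
import time
from pathlib import Path

SYNC_DIR = Path.home() / "GDrive"  # hypothetical local folder mirrored online
SYNC_DIR.mkdir(exist_ok=True)
_seen = {}  # path -> content hash at last upload

def upload(path: Path) -> None:
    """Placeholder for the (unannounced, hypothetical) remote-storage call."""
    print(f"uploading {path} ...")

def sync_once() -> None:
    """Upload any file whose contents changed since the last pass."""
    for path in SYNC_DIR.rglob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if _seen.get(path) != digest:
            upload(path)
            _seen[path] = digest

# Poll the folder; while offline, changes simply queue up locally,
# which is exactly the offline-access limitation noted above.
while True:
    sync_once()
    time.sleep(30)
```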
Google is hoping the new storage service will help tie together some of its other services through a single search box, allowing a single search by keywords to find privately stored files, regardless of whether they're accessed through Picasa, Docs or a software program running on the user's computer.
Google appears to be moving toward being able to "store 100% of user data."
Labels: cloud computing, enterprise SaaS, GDrive, Google, Web computing
Monday, November 26, 2007
Test Confirms: Vista is a Slug Compared to XP
Windows XP Service Pack 3, the update scheduled for release next year, runs Microsoft Corp.'s Office suite 10 percent faster than XP SP2, says Devil Mountain Software, a performance-testing software company. That's not the biggest news.
According to Devil Mountain, Windows XP SP3 is also considerably faster than Vista SP1. "None of this bodes well for Vista, which is now more than two times slower than the most current builds of its older sibling," company executives say.
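In relative terms, and reading "10 percent faster" loosely as completing the benchmark in about 10 percent less time (an assumption on my part), the claims stack up as follows:

```python
# Relative Office-suite completion times implied by Devil Mountain's claims,
# normalized so XP SP2 takes 1.0 unit of time. Approximate by construction.
xp_sp2 = 1.0
xp_sp3 = xp_sp2 * 0.9      # about 10 percent faster than SP2
vista_sp1 = xp_sp3 * 2.0   # "more than two times slower" than XP SP3

print(round(xp_sp3, 2), round(vista_sp1, 2))  # 0.9 and at least 1.8
```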
Labels: Vista, XP