Sunday, January 10, 2010
Is Google Crazy, or Simply Unusual?
Cable and satellite providers of video entertainment have different financial interests from content providers, even though both are essential parts of the multi-channel video entertainment ecosystem.
Likewise, handset manufacturers, mobile application providers and access providers have distinct financial interests, though all are part of a single mobile ecosystem.
That being the case, conflicts between ecosystem partners are an ever-present reality. The issue is how much cooperation is possible, and whether enough mutual benefit occurs despite some conflict.
Google's release of the Nexus One, and its apparent plans to release a Nexus Two and other devices, are prime examples. Some observers, including Google's competitors, note that it is risky for a company to compete with its own partners within a single ecosystem.
Microsoft, of course, questions the wisdom of Google's mobile strategy, insisting Google will have trouble attracting and keeping handset partners for its Android operating system now that the company is selling its own branded devices.
That certainly is the conventional wisdom. But even valid conventional wisdom can have exceptions. That "most" partners cannot envision, attempt or succeed at something does not mean "all" partners are so limited. Nor are relationships immutable; they can change over time.
Google might be one of the salient exceptions, as is Apple. Several years ago, most telecom executives were more afraid of Google than of cable operators. These days, executives are looking for ways to leverage and work with Google.
Apple has significantly reinvented business frameworks in the music and phone businesses, for example.
The other issue is that Google's relationship with some ecosystem partners can be quite distinct. At least initially, HTC and Motorola have had a different relationship than other manufacturers, and T-Mobile as a service provider likewise was early to support Android.
Google's other partnerships are a bit more complicated, and one has to think Verizon and Motorola are less than thrilled, even though both are key Android partners.
Still, the point is that ecosystem relationships periodically get tested. Content providers and cable and satellite operators are used to the possibility of significant conflict over carriage agreements. At the margin, some distributors also are content owners, while some content owners have been distributors.
Some distributors play in the equipment supplier segment as well, and some equipment suppliers are becoming application providers.
Yes, Google risks some ire by distributing its own branded handset. But ecosystem "messiness" is growing throughout the communications and entertainment ecosystems. And some players can attempt strategies that would be considered suicidal if attempted by less powerful contestants.
There are rules, and exceptions to those rules. Apple and Google might prove to be right or wrong. What is indisputable is that they are different; they can attempt things most other players cannot think about.
Labels: Apple, business model, cable, Google, satellite, telco strategy
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
FCC has No Current Authority to Impose Network Neutrality Rules?
The U.S. Federal Appeals Court reviewing whether the Federal Communications Commission currently has authority to create or enforce "network neutrality" rules has not yet ruled.
But initial questioning suggests the court doubts whether the Federal Communications Commission has current jurisdiction to write, much less enforce, net-neutrality rules for the Internet. So some legal observers now suggest the appeals court will in fact rule that the FCC had no authority to sanction Comcast for the way it managed peer-to-peer services.
A 2008 FCC order forced Comcast to stop throttling BitTorrent applications as a means of managing network congestion.
U.S. Court of Appeals for the District of Columbia Circuit Judge Raymond Randolph pointed out to an FCC attorney that "you have yet to identify a specific statute."
Since the Congress has passed no laws relating to network neutrality, the FCC had, and has, no authority to take action on the matter, the judge seems to suggest.
A ruling of that sort would at least temporarily delay any new efforts by the FCC to codify new network neutrality rules, and shift the battle over such rules to the Congress.
FCC Chairman Julius Genachowski has argued the agency has authority to set net neutrality rules because of the "Internet Freedoms Principles" set in 2005, which say that users have the right to use lawful applications. P2P is a lawful application, though its use sometimes includes transfers of copyrighted content without permission.
But Comcast argues it has the right to manage its network, which it interprets as permitting rate limiting of P2P services, when necessary to preserve user experience and relieve congestion.
To be sure, the specific issue at hand seems primarily about whether the FCC’s decision was improper for statutory reasons, as Congress has not given the FCC legislative permission to create such rules, observers say.
On a wider legislative front, some observers think the White House is dialing back its efforts to get "strong" network neutrality rules adopted. The evidence is indirect, but some point to the late-October resignation of Susan Crawford, University of Michigan law professor, previously a key adviser to the president on technology and communications, and a proponent of "strong" network neutrality rules.
According to the American Spectator, Crawford's version of Net neutrality was too radical for White House economic adviser Lawrence Summers, contributing to her early departure. If that observation is correct, it would be a sign that any new rules would not strictly ban "every" form of packet prioritization.
Many observers note that quality of service measures typically are needed for latency-sensitive video or voice services, especially as video already has become the primary driver of bandwidth consumption on a global level.
Those observers also would note that strict versions of net neutrality, which would absolutely ban any packet prioritization, would prevent Internet access providers from applying prioritization on behalf of their users, even when those users might specifically ask for, and desire, such prioritization.
"Packet discrimination" sounds bad, and is bad when used as a business weapon enabling unfair competition. But packet discrimination is a good thing when it helps maintain quality of experience for the emerging applications users say are important, especially video and voice.
Also, at the recent Consumer Electronics Show, White House deputy CTO Andrew McLaughlin said the FCC had yet to determine whether Net neutrality is needed to preserve the "open Internet."
If that seems unremarkable, consider that in 2009 McLaughlin had said network management practices of cable companies that limited the speeds of large file downloads were essentially the same thing as Chinese-style Internet censorship.
Managing the bandwidth-heavy applications of some users at times of network congestion is not application "blocking" or censorship. It is an effort to maintain quality of service for most users. Some methods will be more palatable than others.
The analogy is access to the old voice network. Telcos do not "censor" speech when, at times of peak load, a user might encounter a "fast busy" signal indicating that no circuits are available. The point is that every network gets congested at least some of the time.
And it always has been recognized that some method of regulating access at such times is a legitimate network management matter. In fact, a fast busy tone does mean a user has temporarily been "blocked" from the network. Sometimes a mobile voice call experiences the same sort of temporary blocking.
That sort of access blocking is not any suppression of freedom of communication or expression. It is not an infringement of Internet freedom. It is a simple way of managing a congested resource at times of peak load.
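The old voice network even reduced this to arithmetic. The classic Erlang B formula gives the probability that a new call hears "fast busy" for a given offered load and circuit count; the sketch below uses illustrative numbers to show that some blocking is unavoidable even on a generously provisioned network.

```python
def erlang_b(offered_load: float, circuits: int) -> float:
    """Probability a new call is blocked (hears 'fast busy'), per the
    standard iterative form of the Erlang B formula."""
    blocking = 1.0  # with zero circuits, every call is blocked
    for k in range(1, circuits + 1):
        blocking = (offered_load * blocking) / (k + offered_load * blocking)
    return blocking

# Illustrative numbers: 100 erlangs of offered traffic on 100 circuits
# still blocks roughly 7.6 percent of call attempts.
print(f"{erlang_b(100, 100):.3f}")  # ~0.076
```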
The immediate issue, though, is the simple matter of legislatively-granted authority. The appeals court seems to be signaling its belief that Congress has granted the FCC no authority to impose rules about network congestion management or methods of doing so.
Labels: comcast, FCC, network neutrality, P2P
Saturday, January 9, 2010
Android: "Excessive Choice" Danger?
Android developments continue fast and furious. AT&T says it will launch five separate Android devices by June. Only days after announcing its Nexus One, Google says it will introduce a "Nexus Two" better optimized for business users.
The Verizon Droid, launched in November, seems to have abruptly changed Android's position in the mobile browsing market in under two months, and dramatically increased the general level of interest in Android devices overall.
In just two months, Android has emerged as the second most popular platform used to access InformationWeek’s mobile web site, pushing aside BlackBerry and taking a meaningful bite out of Apple’s iPhone share of traffic, says Tom Smith, TechWeb VP.
In November 2009, Android accounted for eight percent of mobile page views at TechWeb, compared to 59 percent for Apple and 17 percent for Blackberry, says Smith.
In December, though, Android did far better. Apple had 51 percent share; Android 24 percent; Blackberry eight percent, he says.
Google, which just released the Nexus One phone, now says the Nexus Two will have a physical keyboard and will be more suitable for enterprise users (obligatory boilerplate: the Nexus One is aimed at the Apple iPhone; the Nexus Two will challenge the BlackBerry).
Android enthusiasts will be pleased by the explosion of activity, right? Well, yes and no. The whirlwind of activity could have the opposite effect: freezing potential buyers into inaction as they wait for the next device, the next offer, the next set of business arrangements and carriers.
Social psychologists Sheena Iyengar, PhD, a management professor at Columbia University Business School, and Mark Lepper, PhD, a psychology professor at Stanford University, have demonstrated the downside of "excessive" choice.
In a 2000 paper the researchers showed that when shoppers are given the option of choosing among smaller and larger assortments of jam, they show more interest in the larger assortment.
But when it comes time to pick just one, they're 10 times more likely to make a purchase if they choose among six rather than among 24 flavors of jam.
In a separate study, Iyengar and Wei Jiang, PhD, a finance professor at Columbia Business School, analyzed retirement-fund choices, ranging from packages of two to 59 choices, among some 800,000 employees at 647 companies.
Instead of leading to more thoughtful choosing, however, more options led people to act like the jam buyers: When given two choices, 75 percent participated, but when given 59 choices, only 60 percent did. In addition, the greater the number of options, the more cautious people were with their investment strategies, the team found.
Relatedly, too much choice also can lead people to make simple, snap judgments just to avoid the hassle of wading through confusing options, which ironically can sabotage a company's marketing plan, finds social psychologist Alexander Chernev, PhD, of Northwestern University's Kellogg School of Management.
Chernev found that when people were offered variants of the same brand of toothpaste (cavity-prevention, tartar-control and teeth-whitening types, for instance), they tended to switch to another brand that offered a single option.
So what's the problem? "The customer has no idea how to decide and may therefore switch to another brand that doesn't require making tradeoffs," Chernev says.
In that case, they often choose what Herbert Simon, PhD, first referred to as a "satisficing" option: people make the first reasonable choice that fits their preferences, but not the "absolute best" solution.
In other words, instead of exhaustively scanning all options until finding the perfect, or "maximizing" choice, people simply make the "it's okay" choice, not working through all the possible angles.
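For the algorithmically inclined, the two decision rules differ in a single line; the sketch below is a toy contrast, with the scoring function and "good enough" threshold invented purely for illustration.

```python
# "Maximizing" scans every option for the single best; "satisficing"
# (Herbert Simon's term) takes the first acceptable one.
def maximize(options, score):
    return max(options, key=score)

def satisfice(options, score, good_enough):
    for option in options:
        if score(option) >= good_enough:
            return option  # first "okay" choice; stop searching
    return None

phones = ["Droid", "Nexus One", "iPhone", "BlackBerry"]
score = len  # stand-in for a real preference function
print(maximize(phones, score))      # "BlackBerry" -- exhaustive search
print(satisfice(phones, score, 5))  # "Droid" -- good enough, chosen first
```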
The implication for Android buyers? Study the options, then settle on something you feel good, if not perfectly, about. Trying to buy the "absolute best" device will create anxiety and buyer's remorse at some point as the next device option is made available, the price of older options plummets or terms of service and carrier choices evolve.
Friday, January 8, 2010
Android Bumps BlackBerry Traffic in December, TechWeb Says
In just two months, Android has emerged as the second most popular platform used to access InformationWeek’s mobile web site, pushing aside BlackBerry and taking a meaningful bite out of Apple’s iPhone share of traffic, says Tom Smith, TechWeb VP.
In November 2009, Android accounted for eight percent of mobile page views at TechWeb, compared to 59 percent for Apple and 17 percent for Blackberry, says Smith.
In December, though, Android did far better. Apple had 51 percent share; Android 24 percent; Blackberry eight percent, he says.
"To varying degrees, the trends are holding up across other sites in our network as well, but those sites don’t have the same level of visitor activity as mobile.informationweek.com so the numbers above are the strongest indicator we have of Droid’s impact," says Smith.
Smith thinks it was the Verizon launch of the Droid that caused the surge in mobile activity. "We saw a spike in usage of our mobile sites in December, when Droid activity truly took off," he says.
Android appears to be making what had been a two-horse race in smartphones into a three-horse contest, with the previous number two, Research in Motion, being pushed back to third place.
Though impressionistic, the data is in line with what other recent studies suggest, namely that the Android operating system hit some sort of inflection point in December 2009.
Labels: Android, Apple, BlackBerry, iPhone
Mobile Browsing Still Just 1.3% of All Browsing, But Growing Fast
The Net Applications statistics confirm that most users continue to do most of their Web browsing on PCs, but also that mobile's share has steadily increased during 2009.
Both Windows and Mac devices lost a small amount of share in December, as Android began to make its presence felt, but all major mobile operating systems posted large percentage gains. Android grew 54.8 percent, while BlackBerry grew 22.2 percent. The Apple iPhone posted a 19-percent gain while Java ME grew 15.4 percent.
While the iPhone continues to account for the largest share of mobile Web browsing, Google's Android mobile operating system was by far the largest percentage gainer in December 2009, accounting for 0.05 percent of all Web browsing, up from 0.01 percent in February.
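It is worth keeping those two Android numbers straight: one is share of all Web browsing, the other is month-over-month growth. A quick back-of-envelope calculation using the figures above (the November share here is a derived estimate, not a reported number):

```python
# Android's share of all web browsing, per the Net Applications figures
# cited above (percent).
feb_share = 0.01
dec_share = 0.05
print(f"Year-long growth: {dec_share / feb_share:.0f}x")  # 5x

# The 54.8 percent figure is month-over-month growth, December versus
# November; working backward gives an implied (derived, not reported)
# November share.
nov_share = dec_share / (1 + 0.548)
print(f"Implied November share: {nov_share:.3f} percent")  # ~0.032
```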
It does appear an inflection point has been reached: the adoption curve is steepening.
Labels: mobile browser, mobile Internet, mobile Web
Thursday, January 7, 2010
Sprint and Skiff to Sell E-Reader
Though firm pricing and availability are not yet announced, Sprint will be providing the connectivity services for the new Skiff e-reader, to be sold sometime this year.
The Skiff Reader uses a display built on metal foil, not glass.
Sprint and Skiff will also launch a Skiff Store, where users will be able to find more digital content.
Touted as the “first e-reader optimized for newspaper and magazine content”, as well as the first to use LG Display’s “metal foil” e-paper technology, the Skiff Reader will use Sprint’s 3G network and also can use a Wi-Fi connection.
In addition to Wi-Fi, the Skiff Reader features an 11.5-inch, 1,200 x 1,600 pixel touchscreen display, a built-in speaker, a 3.5mm headset jack and USB 2.0 connectivity.
Books, magazines, newspapers, personal and work documents, and other types of digital content can be stored on the Skiff Reader thanks to its 4GB internal memory (expandable with a MicroSD card).
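As a quick sanity check on the published specifications, the implied pixel density of that display:

```python
import math

# Back-of-envelope: 1,200 x 1,600 pixels on an 11.5-inch diagonal.
width_px, height_px, diagonal_in = 1200, 1600, 11.5

diagonal_px = math.hypot(width_px, height_px)  # 2,000 px (a 3-4-5 triangle)
print(f"{diagonal_px / diagonal_in:.0f} ppi")  # ~174 pixels per inch
```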
The Skiff Reader, the initial dedicated device to integrate the upcoming Skiff e-reading service, is about a quarter-inch thick, making it the thinnest e-reader yet produced by any supplier.
The device uses a full touch-screen and weighs just over one pound.
The Skiff Reader's flexibility is based on its construction from a thin, flexible sheet of stainless-steel foil, not glass.
Wednesday, January 6, 2010
More Regulation Needed to Spur Broadband Competition? Really?
The U.S. Federal Communications Commission should consider regulations for broadband providers in an effort to increase competition, says Lawrence Strickling, National Telecommunications and Information Administration assistant secretary, as reported by IDG News Service.
"We urge the Commission to examine what in many areas of the country is at best a duopoly market and to consider what, if any, level of regulation may be appropriate to govern the behavior of duopolists," Strickling says.
With all due respect for Strickling, who is a smart, experienced regulator who knows the terrain, and without disagreeing with the full content of his filing on behalf of NTIA, the notion that competition is so stunted that new regulations are required seems mistaken, and acting on it likely would cause greater harm, despite good intentions.
Here's the argument. Consider, if you will, any large industry with critical implications for the entire U.S. economy. Now consider the following mandate: "you will be forced to replace 50 percent of your entire revenue in 10 years."
"During that time, for a variety of reasons, incumbents will be forced to surrender significant market share to competitors, so that in addition to replacing half of the industry's revenue, it also will have to do so with dramatically fewer customers."
"After that, in another decade, the industry will be required to replace, again, another 50 percent of its revenue. All together, the industry will required to relinquish at least 30 percent of its market share, in some cases as much as half, and also will be required to replace nearly 100 percent of its revenue, including the main drivers of its profitability."
Does that sound like the sort of industry that desperately needs additional competition? Really?
Nor is the argument theoretical. Over the 10-year period between 1997 and 2007, the U.S. telephone industry was so beset with new technology and competition that almost precisely half of its revenue (long distance), the revenue driver that provided nearly all its actual profit, was lost.
The good news is that the revenue was replaced by wireless voice. Then, because of the Internet, cable company entry into voice and the Telecommunications Act of 1996, market share began to wither. That, after all, is the point of deregulation: incumbents are supposed to lose market share to competitors.
Now we have the second decade's project, when mobile voice revenues similarly will have to be replaced, in turn, as IP-based voice undermines the high-margin voice services that have been the mainstay of the mobile business.
If you follow the telecom industry as a financial matter, you know that service providers have maintained their profitability only partly by growing topline revenues. They also have been downsizing workforces and slashing operating costs.
If you talk to ex-employees of the telecom industry, they will tell you the industry seems no longer to be a "growth" industry. That's why millions of people who used to work in telecom no longer do so.
So what about the other big incumbent industry, cable TV? As you can see, and can read about nearly every day, there are huge questions about the future business model for what used to be known as "cable TV." Many observers already predict that such services will move to Internet delivery, weakening or destroying the profitability of the U.S. cable industry.
Industry executives, no dummies they, already have moved into consumer voice and data communications, and now are ramping up their assault on business communications. Why? Because the core video business is going in reverse.
Imposing regulatory burdens on incumbents--either telco or cable--that are losing their core revenue drivers on such a scale might not be wise. Few industries would survive back-to-back decades where the core revenue drivers must be replaced by "something else."
Imagine the U.S. Treasury being asked to replace virtually 100 percent of its revenue with "something else" in about 20 years. Imagine virtually any other industry being asked to do the same.
The point is that industries asked to confront such challenges and surmount them are not typically the sort of industries that need to have additional serious obstacles placed in their way.
Granted, they are niche suppliers, but Strickling also is well aware there are two satellite broadband providers battling for customers, plus five mobile broadband providers, and then hundreds of independent providers providing terrestrial fixed wireless access or packaging wholesale capacity to provide retail services.
Granted, only cable, satellite, telcos and several mobile providers have anything like ubiquitous footprints, but that is a function of the capital intensity of the business. Most markets will not support more than several suppliers in either fixed or wireless segments of the business.
One can argue there is not more facilities-based competition because regulation is inadequate, or one can argue investment capital no longer can be raised to build a third ubiquitous wired network.
The point is that wired network scarcity might be a function of rational assessments of likely payback. Cable TV franchises are not a monopoly in any U.S. community. But only rarely have third providers other than the cable TV or incumbent phone companies attempted to build city-wide third networks. Regulatory barriers are not the issue: capital and business potential are the problems.
Also I would grant that mobile broadband is not a full product substitute for fixed broadband. But where we might be in five to 10 years cannot yet be ascertained. And we certainly do not want to make the same mistake we made last time.
The Telecommunications Act of 1996, the first major revamping of U.S. telecom regulation since 1934, was supposed to shake up the sleepy phone business. But the Telecom Act of 1996 occurred just as landline voice was fading, and the Internet was rising.
Virtually anyone with a long enough memory would say access to applications, services, features and reasonable prices is much better now than before the Telecom Act of 1996, even if the act failed on its own terms. The reason is that technology and the market moved too fast for regulators to keep up.
The Telecom Act tried to remedy a problem that is fast becoming irrelevant: namely, competition for voice services. In fact, voice services rapidly are becoming marginal as the key revenue drivers for most providers in the business.
Yes, there are only a few ubiquitous wired or wireless networks able to provide broadband. But that might be a function of the capital required to build such networks, the nature of payback in a fiercely-competitive market and a shift of potential revenue away from "network access" suppliers and towards application providers.
It always sounds good to call for more competition. Sometimes it even is the right thing to do. But there are other times when markets actually cannot support much more competition than already exists. Two to three fixed broadband networks in a market, plus two satellite broadband providers, plus four to five mobile providers, plus many smaller fixed wireless or reseller providers does not sound much like a "market" that needs to stimulate more competition.
There's another line of reasoning one might take, but it would make for a very long post. That argument would be that, judged simply on its own merits, the availability and quality of broadband services, in a continent-sized country such as the United States, with its variegated population density, is about what one would expect.
Even proponents of better broadband service in the United States are beginning to recognize that "availability" is not the problem: "demand" for the product is the key issue.
"We urge the Commission to examine what in many areas of the country is at best a duopoly market and to consider what, if any, level of regulation may be appropriate to govern the behavior of duopolists," Strickling says.
With all due respect for Strickling, who is a smart, experienced regulatory type who knows the terrain, and without disagreeing in full with the full content of his filing on behalf of NTIA, the notion that competition somehow is so stunted that new regulatiions are required likely would lead to greater harm, despite its good intentions.
Here's the argument. Consider, if you will, any large industry with critical implications for the entire U.S. economy. Now consider the following mandate: "you will be forced to replace 50 percent of your entire revenue in 10 years."
"During that time, for a variety of reasons, incumbents will be forced to surrender significant market share to competitors, so that in addition to replacing half of the industry's revenue, it also will have to do so with dramatically fewer customers."
"After that, in another decade, the industry will be required to replace, again, another 50 percent of its revenue. All together, the industry will required to relinquish at least 30 percent of its market share, in some cases as much as half, and also will be required to replace nearly 100 percent of its revenue, including the main drivers of its profitability."
Does that sound like the sort of industry that desperately needs additional competition? Really?
Nor is the argument theoretical. Over a 10-year period between 1997 and 2007, the U.S. telephone industry was so beset with new technology and competition that almost precisly half of its revenue (long distance), the revenue driver that provided nearly all its actual profit, was lost.
The good news is that the revenue was replaced by wireless voice. Then, because of the Internet, cable company entry into voice and the Telecommunications Act of 1996, market share began to wither. That, after all, is the point of deregulation: incumbents are supposed to lose market share to competitors.
Now we have the second decade's project, when mobile voice revenues similarly will have to be replaced, in turn, as IP-based voice undermines the high-margin voice services that have been the mainstay of the mobile business.
If you follow the telecom industry as a financial matter, you know that service providers have maintained their profitability only partly by growing topline revenues. They also have been downsizing workforces and slashing operating costs.
If you talk to ex-employees of the telecom industry, they will tell you the industry seems no longer to be a "growth" industry. That's why millions of people who used to work in telecom no longer do so.
So what about the other big incumbent industry, cable TV operators. As you clearly can see, and can read about nearly every day, there are huge questions about the future business model for what used to be known as "cable TV." Many observers already predict that such services will move to Internet delivery, weakening or destroying the profitability of the U.S. cable industry.
Industry executives, no dummies they, already have moved into consumer voice and data communications, and now are ramping up their assault on business communications. Why? They are going in reverse for the core video business.
Imposing regulatory burdens on incumbents--either telco or cable--that are losing their core revenue drivers on such a scale might not be wise. Few industries would survive back-to-back decades where the core revenue drivers must be replaced by "something else."
Imagine the U.S. Treasury being asked to replace virtually 100 percent of its revenue with "something else" in about 20 years. Imagine virtually any other industry being asked to do the same.
The point is that industries asked to confront such challenges and surmount them are not typically the sort of industries that need to have additional serious obstacles placed in their way.
Granted, they are niche suppliers, but Strickling also is well aware there are two satellite broadband providers battling for customers, plus five mobile broadband providers, and then hundreds of independent providers providing terrestrial fixed wireless access or packaging wholesale capacity to provide retail services.
Granted, only cable, satellite, telcos and several mobile providers have anything like ubiquitous footprints, but that is a function of the capital intensity of the business. Most markets will not support more than several suppliers in either fixed or wireless segments of the business.
One can argue there is not more facilities-based competition because regulation is inadequate, or one can argue investment capital no longer can be raised to build a third ubiquitous wired network.
The point is that wired network scarcity might be a functional of rational assessments of likely payback. Cable TV franchises are not a monopoly in any U.S. community. But only rarely have third providers other than the cable TV or incumbent phone companies attempted to build city-wide third networks. Regulatory barriers are not the issue: capital and business potential are the problems.
Also I would grant that mobile broadband is not a full product substitute for fixed broadband. But where we might be in five to 10 years cannot yet be ascertained. And we certainly do not want to make the same mistake we made last time.
The Telecommunications Act of 1996, the first major revamping of U.S. telecom regulation since 1934, was supposed to shake up the sleepy phone business. But the Telecom Act of 1996 occurred just as landline voice was fading, and the Internet was rising.
If you wonder why virtually every human being with a long enough memory would say their access to applications, services, features and reasonable prices is much better now than before the Telecom Act of 1996, even assuming it has completely failed, the answer is that the technology and the market moved too fast for regulators to keep up.
The Telecom Act tried to remedy a problem that fast is becoming irrelevant: namely competition for voice services. In fact, voice services rapidly are becoming largely irrelevant, or marginal, as the key revenue drivers for most providers in the business.
Yes, there are only a few ubiquitous wired or wireless networks able to provider broadband. But that might be a function of the capital required to build such networks, the nature of payback in a fiercely-competitive market and a shift of potential revenue away from "network access" suppliers and towards application providers.
It always sounds good to call for more competition. Sometimes it even is the right thing to do. But there are other times when markets actually cannot support much more competition than already exists. Two to three fixed broadband networks in a market, plus two satellite broadband providers, plus four to five mobile providers, plus many smaller fixed wireless or reseller providers does not sound much like a "market" that needs to stimulate more competition.
There's another line of reasoning one might take, but would make for a very-long post. That argument would be that, judged simply on its own merits, the availability and quality of broadband services, in a continent-sized country such as the United States, with its varigated population density, is about what one would expect.
Even proponents of better broadband service in the United States are beginning to recognize that "availability" is not the problem: "demand" for the product is the key issue.