Tuesday, October 1, 2013

U.S. Mobile Service Provider Market Share Shifts after AT&T, T-Mobile US Acquisitions

The recent spate of mobile service provider acquisitions, in particular by AT&T and T-Mobile US, has rearranged U.S. service provider market share, as measured by subscriber accounts.

Until T-Mobile US acquired MetroPCS and AT&T purchased Leap Wireless, Verizon held the lead in share of customer accounts.

Virtually all observers would have agreed in 2012 that Verizon Wireless had the largest number of U.S. mobile subscribers, with AT&T following. Sprint, by most estimates based on company reporting, also had significantly more share than T-Mobile US.

But after the acquisitions, market share has shifted, with AT&T leading Verizon, and Sprint still ahead of T-Mobile US, but by a smaller margin.

Many compilations of market share from 2012 had Verizon leading with about 34 percent share, followed by AT&T with about 32 percent, then Sprint with about 17 percent and T-Mobile US with about 10 percent. After the acquisitions, those shares have been rearranged.

AT&T now seems to have passed Verizon, and T-Mobile US has gained market share.
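For illustration only, here is a minimal sketch of the arithmetic. The big-four shares come from the 2012 compilations above; the roughly 3 percent shares assumed for MetroPCS and Leap Wireless are hypothetical, not figures from those compilations.

```python
# Hypothetical 2012 shares (percent of U.S. subscriber accounts).
# The big-four figures come from the 2012 compilations cited above;
# the MetroPCS and Leap shares are illustrative assumptions only.
shares_2012 = {
    "Verizon": 34.0,
    "AT&T": 32.0,
    "Sprint": 17.0,
    "T-Mobile US": 10.0,
    "MetroPCS": 3.0,       # assumption
    "Leap Wireless": 3.0,  # assumption
}

# Apply the acquisitions: T-Mobile US + MetroPCS, AT&T + Leap Wireless.
post = dict(shares_2012)
post["T-Mobile US"] += post.pop("MetroPCS")
post["AT&T"] += post.pop("Leap Wireless")

for carrier, share in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{carrier}: {share:.0f}%")
# With these assumed inputs, AT&T (35%) edges past Verizon (34%),
# and T-Mobile US (13%) narrows the gap with Sprint (17%).
```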

2013 mobile service provider market share



Monday, September 30, 2013

A New Era of Computing and Communications: A Decade in the Making

Sometimes big changes occur so gradually we aren't aware the changes are significant. Back in 2008, for example, IDC predicted the information technology industry was at the very beginning of what IDC called a "hyper-disruption" of the business, something that happens "once every 20 years to 25 years."

Simply, IDC predicted, rightly, a shift to computing built on mobile devices and apps, cloud services, mobile broadband networks, big data analytics, and social technologies. That would supersede the current era, which many of us would have a hard time naming, though it is clear we had eras led by mainframe, minicomputer and then PC devices.

More recently, the Internet has been a bigger factor, and mobile has clearly emerged as an important device form factor.

Collectively, IDC refers to this next era of computing as the "third platform." By about 2020, when the information technology industry generates $5 trillion in spending, over $1.3 trillion more than it does today, 40 percent of the industry's revenue and 98 percent of its growth will be driven by third platform technologies that today represent just 22 percent of spending.
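A quick consistency check on those IDC figures, using only the numbers quoted above (today's industry spending is implied as roughly $5 trillion minus $1.3 trillion):

```python
# All figures in trillions of dollars, taken from the IDC forecast above.
spend_2020 = 5.0
growth = 1.3                       # added spending between now and ~2020
spend_today = spend_2020 - growth  # implied: ~$3.7T

third_platform_today = 0.22 * spend_today  # 22% of today's spending
third_platform_2020 = 0.40 * spend_2020    # 40% of 2020 spending

tp_growth = third_platform_2020 - third_platform_today
print(f"Third platform today: ~${third_platform_today:.2f}T")
print(f"Third platform by 2020: ~${third_platform_2020:.2f}T")
print(f"Share of total growth: ~{tp_growth / growth:.0%}")
# ~$0.81T today, ~$2.00T by 2020; roughly 90% of all growth,
# in the neighborhood of IDC's "98 percent" once rounding is allowed for.
```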

It would be fair enough to note that such changes in computing eras had direct consequences for suppliers and buyers, notably the inability of any leader in one era to similarly dominate the next, as well as changes in sales channels (direct to indirect), end user devices, indoor networking, and access networks and protocols.

Mobile and cloud are perhaps the easiest ways to characterize the direction of the shift. By extension, such periods of disruption also are going to disrupt revenue models and magnitudes for current suppliers.

A rational observer might agree that “consumerization” of information technology (people using consumer devices and apps) is characteristic of such a platform, as cloud services can be acquired and used by consumers (and also in their work roles) without distribution layers between end user and supplier.

Those distribution layers are important because they include significant businesses--telcos, cable companies, satellite providers, mobile service providers, ISPs, channel partners, distributors, value-added resellers and device suppliers.




Video Makes "Pricing by Value" Difficult

Video entertainment is going to pose a huge challenge for ISPs using every type of business model, from simple “best effort” access to providers of managed services. Sheer volume is the problem for providers of best effort access; but revenue per bit is the issue for some managed service providers.

The revenue per bit problem is easy to describe. Assume an ISP sells a triple-play package at a retail price of $130 a month, where each component--voice, Internet access and entertainment video--is priced equally (an implied price of about $43 for each component).

Ignore other cost of service elements, such as marketing and content acquisition fees. In terms of network usage, that would make sense if each constituent service "consumed" roughly equivalent amounts of capacity, or if retail charging were based relatively directly on consumed bandwidth, and not "perceived value."

Of course, retail pricing is to some extent based on perceived value. Any service provider would have trouble pricing a service wildly out of line with prevailing customer expectations. A voice service costing about $40 to $50 a month is viewed as a market level.

Internet access priced roughly in the same range is viewed as a reasonable, market-set level, as is video service of roughly $70 a month. In other words, people have expectations about what a certain product or service "should" cost, just as Apple iTunes created an expectation that the "right price" for one song is 99 cents.

Use of network resources is unbalanced, though. Voice requires almost no bandwidth, while video consumes nearly two orders of magnitude more capacity for each minute of use. Internet traffic is in between, with some apps consuming little capacity (email), some consuming a moderate amount (web browsing) and others being heavy capacity consumers (video).
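A back-of-the-envelope sketch makes the imbalance concrete. The monthly usage figures below are assumptions for illustration (none appear above): roughly 300 minutes of voice at 64 kbps, 30 GB of general Internet use, and 60 hours of video at 4 Mbps.

```python
# Revenue per bit for each leg of a $130/month triple play,
# assuming equal pricing (~$43.33 per service). Usage figures
# are illustrative assumptions, not measured data.
revenue_per_service = 130.0 / 3

GB = 8e9  # bits per gigabyte (decimal)

voice_bits = 300 * 60 * 64_000          # 300 min at 64 kbps
internet_bits = 30 * GB                 # 30 GB of browsing, email, etc.
video_bits = 60 * 3600 * 4_000_000      # 60 hours at 4 Mbps

for name, bits in [("voice", voice_bits),
                   ("internet", internet_bits),
                   ("video", video_bits)]:
    print(f"{name}: {revenue_per_service / bits * 1e9:.2f} dollars per Gb")
# Voice earns hundreds of times more revenue per bit than video,
# even though all three services are priced identically.
```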

So video poses the big value-price issues for an ISP, exacerbated by the revenue and business model implications of third party, over the top video (Netflix, YouTube) and managed video (cable, satellite and telco TV).

The revenue per bit from a customer's use of Netflix or YouTube is very low. The revenue from a managed video subscription service is arguably reasonable, so long as the delivery network uses multicast technology.

If subscription video services ever shift to primary delivery over a standard Internet connection, the revenue issues will be compounded, since consumption of video bits will dominate total use of the network. And that will make "pricing by value," or "pricing by consumption," more difficult.

The reason is simply that consumers will "value" an hour or two hours of entertainment at levels that make "pricing by value" or "pricing by consumption" a difficult exercise.



Provo, Utah Residents Get Google Fiber 1-Gbps Service in October 2013



The point here is not so much that one U.S. community gets a symmetrical gigabit access service for $70 a month. The broader implication is that Google Fiber is resetting retail price expectations for gigabit access services.

That, in turn, is going to eventually reset consumer expectations for access services of lesser bandwidth as well. 

EPB, the Chattanooga, Tenn. supplier of 1-Gbps service, has dropped its gigabit service rate from $300 a month to $70 a month, a reaction to the price umbrella Google Fiber apparently is creating.

EPB also converted all existing customers with 100-Mbps and 250-Mbps services to the gigabit speed.

Separately, Utopia, which operates a wholesale gigabit network in about 10 Utah cities, also says its retail ISP partners have dropped prices for gigabit access from about $300 a month to between $65 and $85 a month.
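The reset is easiest to see as price per Mbps. A minimal sketch, where the $45-for-20-Mbps "typical 2013 tier" is an assumed comparison point, not a figure from the providers named above:

```python
# Dollars per Mbps before and after the gigabit price resets.
# The "typical tier" row is an illustrative assumption.
offers = [
    ("EPB gigabit, before", 300, 1000),
    ("EPB gigabit, after", 70, 1000),
    ("Google Fiber gigabit", 70, 1000),
    ("Typical 2013 tier (assumed)", 45, 20),
]
for name, price, mbps in offers:
    print(f"{name}: ${price / mbps:.2f} per Mbps")
# Gigabit at $70 works out to $0.07 per Mbps, versus $2.25 per Mbps
# for the assumed conventional tier: a gap of more than 30x.
```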

To the extent that Google Fiber aimed to reset expectations about access speeds and prices, Google Fiber seems to be succeeding.

Exposing Network Features to Create Revenue is Hard, AT&T Seems to Find

AT&T AdWorks is one example of how a service provider can monetize what it knows about its customers, providing insight to third party business partners. Virtually nobody thinks that is an easy new business to create.

And AT&T might be shifting attention elsewhere (connected car and other machine to machine initiatives, for example), as rumors of job cuts at the business unit are heard.

That doesn't mean mobile service providers will not achieve success at some point, and to some extent. But AT&T's present efforts to make available data allowing advertisers to target users on mobile, TV and other devices are not getting traction.

To be sure, few initiatives of this type--exposing network features to third party business partners--are getting much traction, though there was much hope five years ago.

Admittedly, we are early in the potential development of this hoped-for trend. But service providers have relatively little to show for the effort, so far.

On the other hand, app providers might be getting more traction with service providers, as shown either by co-marketing of over the top messaging apps or Virgin Media's cooperation with Netflix. 


Competing with "Free" Remains an Issue

How to "compete with free" is a major question in the Internet era, where many goods--especially of the non-tangible sort--can be replicated and produced at low marginal cost.


For communications service providers, the issue has arisen mostly in conjunction with low cost or “free” services such as Skype or WhatsApp that supply voice or messaging services “at no incremental cost,” once a user has suitable devices and Internet access.


That has led many to say the economics of abundance make new revenue models possible. Some would say "abundance," a relative term, makes new models essential, in at least some cases. But the implications are startling.


The basic idea is that transistors, storage, computation and bandwidth are so abundant that the cost of using them is very close to zero. The corollary is that businesses based on the use of such resources can be viewed differently from businesses where inputs are expensive.


In other words, businesses based on abundant inputs can "waste" those resources. In a pre-broadband, pre-Internet-Protocol era, unicasting (content on demand) would have been nearly impossible.


With fast broadband access and an abundance of terminals (smart phones, tablets, PCs, game consoles, specialized decoders and TVs), unicasting is feasible.


With that infrastructure in place, real-time cloud storage and computing is possible to a degree that would have been impossible just a decade ago. Likewise, streamed Netflix content, Pandora and Google Drive (cloud-based productivity apps and storage) would not have been a reasonable experience for most people.


The degree of “abundance” in various parts of the Internet ecosystem is open to question, but all theories of abundance require a “not scarce” approach to value and retail pricing. That doesn’t necessarily mean “no cost,” only costs so low they are not a barrier to use of a product.


In some ways, a shift to 1-Gbps fixed network Internet access services, priced at $70 a month, is one example of how "abundance" can drive even the deployment of capital-intensive Internet access services.


ISPs would not claim it is "nearly free" to build and operate such networks, nor would most consumers consider $70 a month "almost free." By historical measures, though, a symmetrical gigabit consumer connection costing just $70 a month is quite "abundant."


Such logic defies conventional sense, and certainly conventional economics, which is fundamentally based on assumptions of scarcity. In fact, such retail prices, for those capabilities, virtually require new thinking about additional revenue sources not contingent on charging for actual use of bandwidth.

In part, that explains the intense attention many service providers are paying to machine-to-machine services, the Internet of Things, mobile payments, connected car and other initiatives.


Those efforts directly reflect the reality that many goods in an Internet era actually seem to defy the logic of scarcity, which tends to underpin retail prices suppliers can command.


That doesn’t necessarily mean a product or experience has “no cost,” only that once created, the marginal cost of distributing one more unit is fairly low. Nor, in truth, do the economics of virtual goods imply there is no scarcity: in fact, “hit” experiences and products still remain scarce.


Still, some talk about the economics of “abundance,” characterizing the business context for some potential products--especially digital goods--as representing a new type of context, where incremental costs are so low that “free or nearly free” is a reasonable price point.


In fact, the "freemium" revenue model, where a base product is usable at no incremental charge while additional features are available for a fee, is precisely an expression of the notion of "abundance," or low marginal distribution costs.


It is true that marginal distribution costs for many digital goods are relatively low, even when initial production costs can be large. A movie or a broadband access network can represent huge amounts of capital investment, but the cost of using a marginal or additional unit can be quite low.


For a movie, marginal distribution cost arguably is lowest when a title is available to be streamed.


For a communications service provider, marginal distribution cost is lowest when enough customer adoption has occurred to recover all sunk costs, but before saturation of network capabilities.


In other words, when a service provider has extra capacity on the network, but already is recovering capital investment costs, the marginal cost of delivering one more phone call, one more text message, one more email or song is quite low.


When network utilization climbs to about 90 percent, the economics change, as additional investment costs loom. Until then, sunk costs dominate, and usage-related costs are relatively minimal.
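A stylized sketch of that cost curve, with all inputs (monthly sunk cost, capacity, expansion cost) chosen purely for illustration:

```python
# Stylized access-network economics: sunk costs dominate until
# utilization nears 90%, at which point expansion capex looms.
# All numbers are illustrative assumptions.
SUNK_COST_PER_MONTH = 1_000_000   # amortized capital, dollars
CAPACITY_UNITS = 10_000_000       # deliverable units (calls, GB, etc.)
EXPANSION_TRIGGER = 0.90          # utilization that forces new capex
EXPANSION_COST = 400_000          # added monthly cost after expansion

def marginal_cost(utilization: float) -> float:
    """Approximate cost of delivering one more unit at a given load."""
    if utilization < EXPANSION_TRIGGER:
        return 0.0  # spare capacity: the extra call/message is ~free
    # Past the trigger, each unit carries a share of the new capex.
    return EXPANSION_COST / (CAPACITY_UNITS * (1 - EXPANSION_TRIGGER))

for u in (0.50, 0.85, 0.95):
    print(f"utilization {u:.0%}: marginal cost ${marginal_cost(u):.4f}/unit")
# Below 90% the marginal unit costs essentially nothing;
# above it, the next unit must help fund the capacity upgrade.
```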


Since nothing is ever truly “free,” the issue is how to construct a revenue and business model around a product that end users can avail themselves of “without incremental charge.”


The answer continues to evolve, but up to this point indirect revenue models, such as advertising, donations, transaction fees and commerce, have provided the revenue model, even when use of a product requires no incremental charge or payment.


Those are difficult revenue models for most communications service providers, in large part because of more than a century of operating as utilities. That experience has shaped both buyer and seller expectations.


As with electrical, natural gas or water services, people are accustomed to buying an admittedly essential service that nevertheless sells a product (electrons, BTUs or gallons) viewed as a commodity.


So the key, many would argue, is changing the experience of the products sold in ways that add value and differentiate the products. As always for an intangible product such as a “service,” there are tangible elements, but the product itself (“the experience”) is not directly something a buyer can touch and feel.


To be sure, fixed network service providers, though always faced with high sunk costs, might arguably have an advantage in the “bandwidth supply” area that spectrum-using mobile and untethered service providers do not enjoy.


Though the sunk costs are substantial, once a fiber to home network is in place, the incremental cost of activating or enabling bandwidth is relatively low, up to a point.

So, in many ways, “abundant” computing, storage and bandwidth resources create new problems and opportunities for ISPs and communication service providers.

Does Wireless Charging Cause RF Interference?

USB-based device chargers can create noise that interferes with touchscreen operation, especially when the chargers omit noise suppression features. So with the advent of wireless charging, one wonders whether noise will be added to the communication channels used by Wi-Fi or Bluetooth devices.

It appears that such corded charging creates noise in the 100 kHz to 1 MHz range, and should therefore not cause problems with Wi-Fi or Bluetooth devices.

But what about wireless charging? Granted, charging systems work by creating localized magnetic fields, which should not, in principle, interfere with radio frequency signals. But some of the major approaches to wireless charging (radio charging, inductive charging and resonance charging) do use radio frequencies.

Radio charging, intended to reach low-power devices operating within a 10-meter (about 33 feet) radius of the transmitter, is seen as a way to recharge batteries in medical implants, hearing aids, watches and entertainment devices.

The transmitter sends a low-power radio wave at a frequency of 915 MHz (within an unlicensed ISM band also used for microwave heating) and the receiver converts the signal to energy. The radio charging method is the closest to a regular radio transmitter.

But more common are wireless chargers using inductive charging, featuring a transmit coil and a receive coil in close proximity. Electric toothbrushes were among the first devices to use this charging method, and mobile phones are the fastest-growing category of devices charging without wires.

For larger batteries, such as those in electric vehicles, resonance charging, or electrodynamic induction, is being developed, and at least some of those methods use the 915 MHz frequency.

When a new source is “radio” based, there is potential for signal interference. So far, nobody seems to think there will be a problem with magnetic fields and energy in the radio frequency bands.

The several systems being developed, including auto charging and the Qi (Wireless Power Consortium), Power Matters Alliance (PMA) and Alliance for Wireless Power (A4WP) systems, all use localized magnetic fields to transfer energy.

The Cota charging system uses the same spectrum as Wi-Fi and Bluetooth. That naturally raises the question of whether adding a new electromagnetic source can interfere with radio frequency communications, perhaps by adding more noise into a channel.
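A simple way to frame the question is band overlap. The sketch below uses commonly cited operating ranges (Qi inductive charging at roughly 100 kHz to 205 kHz, radio charging at 915 MHz, and the 2.4 GHz ISM band shared by Wi-Fi and Bluetooth); treat the exact figures as approximations.

```python
# Does a charger's operating frequency share spectrum with
# Wi-Fi/Bluetooth? Band edges are commonly cited approximations.
WIFI_BT_BAND = (2.4e9, 2.4835e9)   # 2.4 GHz ISM band, in Hz

def overlaps_wifi_bt(freq_hz: float) -> bool:
    low, high = WIFI_BT_BAND
    return low <= freq_hz <= high

chargers = {
    "Qi inductive (~140 kHz)": 140e3,
    "Radio charging (915 MHz ISM)": 915e6,
    "Cota (2.4 GHz, per its maker)": 2.44e9,
}
for name, freq in chargers.items():
    print(f"{name}: shares Wi-Fi/Bluetooth spectrum = {overlaps_wifi_bt(freq)}")
# Only the 2.4 GHz approach lands in the same band as Wi-Fi and
# Bluetooth; inductive and 915 MHz systems sit far below it.
```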

Wireless charging uses a magnetic field to transfer energy from an alternating current source, using a localized magnetic field, to a device able to convert the magnetic energy into direct current. Of course, the process is inefficient. The only issue is how inefficient.
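How much that inefficiency matters is simple arithmetic. Assuming, purely for illustration, a 12 Wh phone battery and end-to-end efficiencies of about 85 percent for a corded charger versus 60 percent for a wireless pad (both figures are assumptions):

```python
# Energy drawn from the wall to deliver one full charge,
# under assumed end-to-end efficiencies.
BATTERY_WH = 12.0  # assumed phone battery capacity

for method, efficiency in [("corded", 0.85), ("wireless pad", 0.60)]:
    drawn = BATTERY_WH / efficiency
    wasted = drawn - BATTERY_WH
    print(f"{method}: draws {drawn:.1f} Wh, wastes {wasted:.1f} Wh")
# Under these assumptions the wireless pad wastes nearly four
# times as much energy per charge as the cord.
```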

The point is that wireless charging is convenient, just not “green.”
