Monday, February 20, 2012

Why LightSquared Failed

Inevitably, despite the small remaining possibility of a positive resolution, we will now see a period of reflection in which observers try to explain "why LightSquared failed." That does not mean LightSquared has given up. But as some of us have been saying for a while, the big problem here is interference.

When the frequencies were originally awarded for mobile satellite use, what became the LightSquared spectrum was a "low-power" application, in terms of the transmitted downlink signals.

Mobile communications service is, by way of contrast, a "high-power" application. And since all radio communication (digital or analog) is fundamentally a matter of signal-to-noise ratio, there are some physical locations (close to proposed cell sites) where the signal strength of the cell towers simply overpowers the received GPS signal.

This is physics, not politics. As originally designed, the satellite-based GPS network and the satellite-based mobile communications network could have co-existed, without interference, because both were low-power systems.
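To see concretely what "overpowers" means, here is a rough link-budget sketch in Python. The tower power, distance and exact frequency are illustrative assumptions, not figures from the FCC proceeding; the GPS receive level is the commonly cited ballpark for the L1 signal at the earth's surface.

# Rough link-budget sketch: compare the power a GPS receiver sees from a
# satellite with the power it sees from a nearby high-power terrestrial
# transmitter in an adjacent band. All terrestrial figures are assumed.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

GPS_RX_DBM = -128.5      # ballpark received GPS L1 power at the surface
TOWER_EIRP_DBM = 62.0    # assumed EIRP of a terrestrial base station
DISTANCE_M = 100.0       # receiver 100 m from the cell site (assumed)
FREQ_HZ = 1.55e9         # roughly the L-band frequencies in question

tower_rx_dbm = TOWER_EIRP_DBM - fspl_db(DISTANCE_M, FREQ_HZ)
print(f"Terrestrial signal at receiver: {tower_rx_dbm:.1f} dBm")
print(f"GPS signal at receiver:         {GPS_RX_DBM:.1f} dBm")
print(f"Difference:                     {tower_rx_dbm - GPS_RX_DBM:.0f} dB")

Under those assumptions the terrestrial signal arrives more than 100 dB stronger than the GPS signal, which is the scale of difference a GPS receiver's front end would have to reject near a high-power site.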

LightSquared has tried to paint the objections as a matter of politics and vested business interests. Those interests do exist. So one explanation for LightSquared's almost-certain failure (unless one believes there still is a real possibility of fixing the interference issue) can already be sketched out.

 "Entrenched and vested interests," including the GPS industry and some mobile telecom providers,  were able to defeat LightSquared by political and financial assets brought to bear on the spectrum re-authorization process.

Others would note, though, that the aviation industry and the U.S. military also objected. No FCC commissioner is going to risk "an airliner falling out of the sky," or other threats to passenger safety.

LightSquared needed an FCC waiver because it was trying to use spectrum allocated for low-power space-to-ground transmissions for high-power, ground-only transmissions. Interference issues between adjacent low-power satellite applications are well understood, which is why the two adjacent satellite bands were originally authorized. Why LightSquared failed


Sunday, February 19, 2012

Big Change Coming for Mobile Payments in 2012, 2013


Enthusiasm about near field communications will be more muted in 2012 and most likely 2013, as mobile payments attention shifts to other ways of enabling payments, loyalty and credentials programs, and mobile commerce.

In fact, 2012 will see much more attention paid to a range of other ways of handling the communications, credentials storage and commerce applications. The reason is simple enough: NFC simply has not gotten enough marketplace traction, and ecosystem participants are eager to move ahead.

It was inevitable that hype around near field communications would begin to ebb. That happens with all important new technologies. And one might argue the hype around NFC reached a peak in 2011.

Instead, we will likely see growing interest in cloud-based wallet solutions that can be used by current point-of-sale systems, rather than requiring the use of a mobile phone.

PayPal, First Data and Visa are among the “big names” promoting retail solutions that do not require mobile phone involvement, and that further integrate with online channels and possibly other devices, such as connected game consoles or even video set-top boxes, at some point.

Beyond that, the focus has broadened beyond the payment function, in part because of the time and expense required to create scalable solutions, and in part because the value of mobile payments, in a narrow sense, has yet to prove itself in the U.S. market.

Also, in an attempt to find a winning value proposition that drives massive end user and retailer uptake, most ecosystem participants are looking at any number of broader value propositions with elements of marketing, advertising, location-based couponing and dynamic inventory management, not just “payments.”

Could Fewer Wireless Providers Mean Lower Consumer Prices?

Economic models are all about the assumptions, and that applies to analyses of what should happen as additional spectrum is made available to U.S. wireless providers. Specifically, policymakers looking after the "public welfare" must make choices that could affect the amount of consumer benefit. 


The problem, as with virtually everything in the global mobile business or the global fixed network business, is the business terrain between monopoly on one hand and multiplicity on the other. Most policymakers globally have concluded that monopoly is, in fact, a poor way to encourage innovation, efficiency and lower prices. 


On the other hand, a simple spreadsheet exercise will be enough to convince anyone that the mobile or fixed network communications business, when conducted in a facilities based way, simply cannot support lots of contestants. 


Whatever you might suppose total demand is, when multiple providers start to divide up that demand, markets can become ruinous, meaning no contestant gets enough market share and revenue to sustain itself. 
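Here is a minimal version of that spreadsheet exercise, in Python. The total market revenue and per-network cost figures are purely illustrative assumptions; only the shape of the result matters.

# Split an assumed total market evenly among N facilities-based providers and
# compare each provider's share against an assumed fixed cost of running a network.
TOTAL_MARKET_REVENUE = 10_000_000_000   # assumed annual retail revenue in the market
FIXED_NETWORK_COST   = 2_500_000_000    # assumed annual cost of operating one network

for n_providers in range(1, 7):
    revenue_per_provider = TOTAL_MARKET_REVENUE / n_providers
    margin = revenue_per_provider - FIXED_NETWORK_COST
    status = "sustainable" if margin > 0 else "ruinous"
    print(f"{n_providers} providers: {revenue_per_provider / 1e9:.2f}B each -> {status}")

With those assumed numbers the market supports three facilities-based networks comfortably; at four, margins vanish, and beyond that every contestant loses money.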


The Phoenix Center for Advanced Legal & Economic Public Policy Studies has long argued that the sustainable number of network-based contestants in either the wireless or fixed network business will be limited to just a few firms, for this reason. 


Phoenix Center Chief Economist George Ford now argues that consumers actually would be better off if any future wireless spectrum auctions allow all wireless providers to bid, rather than trying to ensure that spectrum assets are allocated more broadly.


This might seem counter-intuitive. If competition is better than a monopoly, shouldn't broader spectrum awards create more competition, and therefore lead to more innovation and lower retail prices?


That's the argument the Phoenix Center takes on in a new study. There are two key assumptions. 


"First, we assume that price falls as the number of competitors increases (e.g., the Hirschman Herfindahl Index or “HHI” falls)," says Ford. "More formally, we assume Cournot Competition in Quantities."


In other words, the Phoenix Center uses the same framework as the Federal Communications Commission and the Department of Justice when it comes to assessing market concentration and the impact of competition on retail prices.


A second key assumption is important, though. The Phoenix Center does not assume that the capacity a firm gets from spectrum is linearly related to the amount of spectrum it holds. 


That is, if we double the amount of spectrum, then the capacity provided to a firm from that additional spectrum more than doubles. That might be a head turner, at first. After all, are we not dealing here with laws of physics?


My apologies to Dr. Ford if I misapply the assumption, but here's how I'd explain it. 


Yes, laws of physics do apply. But wireless networks routinely "re-use" spectrum. A single allotment can be used repeatedly across a network, with a primary determinant being the coverage area of each cell. Many smaller cells can use a given amount of spectrum more efficiently than a few big cells. 


But cutting the cell radius by 50 percent quadruples the number of required cells. And since each cell represents more capital investment, you see the issue. Spectrum does not relate linearly to effective end-user bandwidth. The actual bandwidth a network can provide depends on the amount of spectrum re-use.
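A rough sketch of the arithmetic, using assumed values for the coverage area, per-cell capacity and per-site cost:

# Halving the cell radius roughly quadruples the number of cells needed to cover
# the same area; to a first approximation that quadruples the capacity delivered
# from the same spectrum, and also (roughly) quadruples the capital cost.
import math

AREA_KM2 = 1_000.0               # assumed service area
CAPACITY_PER_CELL_MBPS = 50.0    # assumed capacity of one cell on the licensed spectrum
COST_PER_CELL = 150_000          # assumed capital cost per cell site

for radius_km in (2.0, 1.0, 0.5):
    cells = math.ceil(AREA_KM2 / (math.pi * radius_km ** 2))
    capacity_gbps = cells * CAPACITY_PER_CELL_MBPS / 1_000
    capex_m = cells * COST_PER_CELL / 1e6
    print(f"radius {radius_km:.1f} km: {cells:5d} cells, "
          f"{capacity_gbps:6.1f} Gbps total, ${capex_m:.1f}M capex")

The same licensed spectrum delivers roughly four times the total capacity each time the radius is halved, but only for an operator able to fund roughly four times as many sites.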


"Richer" providers can better afford to create the denser smaller cell networks, so can provide more bandwidth from a fixed amount of spectrum. 

Wireless Competition Under Spectrum Exhaust provides the detailed model, but the point is that a smaller number of new spectrum recipients creates more effective end-user bandwidth than a larger number of new recipients. That seems counter to reason, and the analysis is important for suggesting that the "common sense" understanding is wrong. 


The important public policy implication is that rules to "spread the spectrum awards to more providers" have a negative impact on end-user pricing. In fact, a more concentrated distribution should lead to increases in supply that more effectively drive prices lower.


It is not what most might assume is the case. The policy implication is that it is not helpful to prevent any contestants, especially the dominant contestants, from acquiring more spectrum in new auctions. 


One might note that bidding rules in some countries, such as Germany, do in fact limit the amount of spectrum the dominant providers can acquire. Though the Phoenix arguments are about upcoming policy for U.S. spectrum auctions, the same analysis should apply in all markets. 

Saturday, February 18, 2012

EU to Clear Some 800 MHz Spectrum for LTE in 2012?

Every country in Europe will be required to clear TV transmissions out of the higher frequencies of the 800 MHz band by the end of 2012, the European Parliament has ruled.

That might not mean it actually happens that soon, but at least that's the goal. The expectation is that spectrum auctions then could follow, with networks being built after the completed auctions. All that means much of Europe will not see LTE in the next few years.

The 800 MHz band is being cleared as part of the switch to digital television, freeing up some spectrum at the top and bottom of the band. The EU proposal concerns the higher frequencies in the 800 MHz band. By some estimates, even that new spectrum will not be enough to meet mobile data demand by 2015. EU to clear 800 MHz band

Why LTE Kills Batteries

Devices running on Long Term Evolution and other 4G networks consume battery power faster, as most users have discovered. Nokia Siemens Networks did some preliminary studies of LTE phones' power drain versus their equivalent 3G models and found that LTE devices consume from five percent to 20 percent more power than previous-generation phones, depending on the application used.
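A quick bit of arithmetic shows what that extra draw means in practice. The 10-hour baseline below is an assumption for illustration, not a figure from the Nokia Siemens study.

# Runtime impact of 5 to 20 percent higher power draw, assuming a phone that
# lasts 10 hours of comparable use on 3G.
BASELINE_HOURS_3G = 10.0

for extra_draw in (0.05, 0.20):
    lte_hours = BASELINE_HOURS_3G / (1 + extra_draw)
    print(f"{extra_draw:.0%} more power -> about {lte_hours:.1f} hours on LTE "
          f"({BASELINE_HOURS_3G - lte_hours:.1f} hours lost)")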

Some of you will instinctively guess that battery drain is worse than that.

In its review of the Samsung Galaxy Nexus, Engadget found that the Google Navigation running over the LTE network ate battery power faster than the Nexus’ car charger could restore it, for example. Why LTE drains batteries

Some of us have started carrying extra batteries. Recently, some of us have been turning off both the 4G and 3G radios most of the time when out and about, using the devices only for voice and text.

And more of the time, the devices simply get turned off. That originally struck me as a complete waste of device capabilities. But we all learn to make trade-offs. Increasingly, the only way to stretch battery life is simply not to use the data network at all, much of the time, so your batteries are available when you really need the power.


1% of Mobile Users Consume 1/2 of Bandwidth

A new study sponsored by Arieso finds that extremely heavy users of mobile bandwidth are becoming even heavier users.

About one percent of subscribers now consume 50 percent of all downloaded data. Arieso reveals latest trends in smartphone data use

1/2 of U.S. Adults Will Use Mobile Banking by 2016

By 2016, about half of U.S. adults will be using mobile banking, Javelin Strategy and Research predicts. About 92 percent of the 25 largest banks already offer mobile banking, says Javelin. 


A study by Javelin Strategy and Research suggests that larger banks, armed with greater resources, have jumped into the mobile banking applications area at a level that small banks and credit unions have not generally been able to match, says Mary Monahan, Javelin Strategy and Research EVP and Research Director, Mobile.

Also, the complexity of mobile banking, with the many devices to support, as well as text messaging, mobile apps and web channels, smart phones, tablets and PCs, makes it harder for smaller institutions to respond, says Monahan.

And there are key challenges to be faced. For one thing, younger consumers “are migrating to the larger banks” that do offer the mobile banking features, says Monahan. “As a result, the small bank clientele is older.” If younger customers are the bulk of future customers, you can see the danger.

About 11 percent of users have switched from smaller institutions to larger institutions, the study found. To be sure, about 20 percent of switchers say they moved because of “fees.”

But mobile banking users also tend to be younger, disproportionately in the 18 to 34 age bracket, and also tend to be wealthier, says Monahan. “They are more likely to have incomes over $100,000 a year, for example.” And about half of tablet owners already are using mobile banking, suggesting that tablets will become an important new platform.

Of the top 25 banks, 30 percent already have developed tablet apps, the survey suggests. And Monahan notes that tablet apps have to be custom built for tablets, not ported over from existing PC apps.

As you might expect, users check balances, search for ATM locations and shift money between accounts. The coming new app, though, is peer-to-peer money transfer, and about 26 percent of banks already support that function in some way.

In many cases, users take advantage of that feature to do things such as splitting restaurant bills, for example. About 27 percent of survey respondents say they are interested in mobile P2P payments.

About 22 percent of institutions already support remote check deposit as well. But half the survey respondents say they will be adding remote check deposit within a year.

After a pause in 2010, mobile banking adoption surged by 63 percent in 2011, rising to 57 million U.S. adults from 35 million, an increase of 22 million consumers in one year, according to a new study by Javelin Strategy and Research.

Over the next five years, mobile banking is projected to increase at a steady compound annual growth rate (CAGR) of 10.3 percent as financial institutions roll out new offerings and the pent-up backlog of demand is eased, says Monahan.

Smart phones are the immediate platform to be accommodated. Over the next five years, it is estimated that 68 million consumers will become new smart phone users, taking smart phones to 72 percent of the mobile phone user base. Smart phone adoption from 2011 to 2016 is projected to rise at a CAGR of 11.9 percent.

Smart phones currently drive mobile banking: half of smart phone owners use mobile banking, versus 14 percent of non-smart phone owners, Javelin notes.

Tablets are the next frontier. The number of tablet users in the U.S. is expected to more than double over the coming year from its current base of 16 million (for an increase of 113 percent).

The number of adults using tablets is estimated to increase at a CAGR of 40.3 percent over the next five years.

By 2016, it is projected that 40 percent  of mobile consumers, or 87 million people, will have adopted a tablet.
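As a rough sanity check on those projections, compounding the cited 2011 bases at the cited rates reproduces the headline figures; the tablet calculation lands almost exactly on the 87 million cited above.

# Compound the cited 2011 bases at the cited CAGRs for five years.
def project(base, cagr, years):
    return base * (1 + cagr) ** years

mobile_banking_2016 = project(57e6, 0.103, 5)   # 57M mobile banking users at 10.3% CAGR
tablet_users_2016   = project(16e6, 0.403, 5)   # 16M tablet users at 40.3% CAGR

print(f"Mobile banking users by 2016: ~{mobile_banking_2016 / 1e6:.0f} million")
print(f"Tablet users by 2016:         ~{tablet_users_2016 / 1e6:.0f} million")

That works out to roughly 93 million mobile banking users and roughly 87 million tablet adopters by 2016.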

And it is the new applications that both smart phones and tablets enable that could emerge as important new mobile banking capabilities, Javelin argues.

Video messaging provided by Microsoft Skype, and video chat services such as Apple FaceTime and Google Talk, allow easy face-to-face messaging between devices and could provide much of the personal feeling of an in-person conversation.

Fears related to security and uncertainty about value are the main roadblocks to initial consumer adoption of mobile banking. Also, perception of the value of mobile banking is a function of age: younger consumers are more likely to understand its worth, Javelin says.

Mobile P2P, mobile offers, mobile remote deposit and other features that use the inherent nature of the phone will build the value proposition. And faster mobile networks will help: lack of speed was the most common reason for dissatisfaction among customers who adopted the technology.
