Barnes & Noble has unveiled its new Nook Tablet, a device with 8 GB of storage, a seven-inch screen and a $199 price. That positions it directly head to head with the Kindle Fire.
In addition, the company’s Nook Color e-reader has been repriced at $169. The new Nook tablet can be bought online or at Barnes & Noble retail locations.
Tuesday, February 21, 2012
New Nook Tablet
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Monday, February 20, 2012
So Far, LTE is About PC Access, More than Phones
According to the Global Mobile Suppliers Association (GSA), there have been 49 Long Term Evolution network launches so far, and most have launched with an emphasis on PC connectivity, not use of smart phones. There also has been a big emphasis on what might be called fixed line substitution (if there was any widespread fixed line broadband to displace).
In large part, that reflects the relative paucity of LTE handsets available to sell.
Some 285 service providers have committed to commercial LTE network deployments or are engaged in trials, technology testing or studies, the GSA reports.
The GSA report also confirms 226 firm commercial LTE network deployments.
Some 49 LTE networks, which is more than double the number 6 months ago, have launched commercial services in 29 countries: Armenia, Australia, Austria, Bahrain, Belarus, Brazil, Canada, Denmark, Estonia, Finland, Germany, Hong Kong, Hungary, Japan, Kuwait, Latvia, Lithuania, Norway, Philippines, Poland, Puerto Rico, Saudi Arabia, Singapore, South Korea, Sweden, UAE, Uruguay, USA, and Uzbekistan.
66% of Users 24 to 34 Own Smart Phones
While overall smart phone penetration stood at 48 percent in January, those in the 24 to 34 age group showed the greatest proportion of smart phone ownership, at 66 percent.
In the same age group, 80 percent of those that had gotten a new device in the last three months chose a smart phone.
Among those who chose a device in the last three months, more than half of those under 65 had chosen a smart phone, by way of comparison.
Income also plays a significant role. When age and income are both taken into account, older subscribers with higher incomes are more likely to have a smart phone.
For example, those 55 to 64 making over $100,000 a year are almost as likely to have a smart phone as those in the 35 to 44 age bracket making $35,000 to $75,000 per year.
120 MHz of 700 MHz Spectrum to Be Auctioned, Eventually
U.S. wireless service providers (and potentially others) soon will have the chance to bid on new wireless spectrum in the 700 MHz frequency range, expected to be used to support new Long Term Evolution mobile networks.
The allocation is important for a couple of reasons.
First, it might be the last big block of new wireless spectrum to be allocated for some time. “This is going to be the largest block of spectrum made available to the public for mobile broadband purposes in the next few decades,” said Harold Furchtgott-Roth, a former member of the Federal Communications Commission. “I don’t see what else is out there after this auction.”
Second, firms that do not win spectrum in the auction will have incentives to buy spectrum from other potential suppliers, especially Clearwire, as well as holders of satellite spectrum that could be re-purposed for such uses, notwithstanding the recent failure of LightSquared to win approval of its plan to re-use mobile satellite spectrum for a terrestrial Long Term Evolution network.
Consider that AT&T owns 114 MHz, Verizon about 172 MHz and Clearwire about 150 MHz in the top-10 U.S. cellular markets. An additional 120 MHz is significant.
The expected 120 MHz of spectrum has been authorized for release by the U.S. Congress, but the Federal Communications Commission still has to craft the bidding rules.
Nor is it immediately clear how soon auction rules could be approved, or how long it will take to clear broadcast television users out of the spectrum. Though broadcasters received use of that spectrum for free, they will be compensated to vacate the spectrum.
Why LightSquared Failed
Inevitably, despite the small possibility of some positive resolution, we now will see a period of reflection where observers try to explain "why LightSquared failed." That doesn't mean LightSquared has given up. But as some of us have been saying for a while, the big problem here is interference.
When the frequencies were originally awarded for mobile satellite use, what became the LightSquared spectrum was a "low-power" application, in terms of the transmitted downlink signals.
Mobile communications service is, by way of contrast, a "high-power application." And since all radio communications (digital or analog) is fundamentally a matter of signal-to-noise ratio, there are some physical locations (close to proposed cell sites) where the signal strength of the cell towers simply overpowers the received GPS signal.
This is physics, not politics. As originally designed, the satellite-based GPS network and the satellite-based mobile communications network could have co-existed, without interference, because both were low-power systems.
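The scale of that signal imbalance can be sketched with a simple free-space path-loss calculation. All the numbers below are illustrative assumptions (a typical macro-cell EIRP, a nominal GPS signal level at the earth's surface, a receiver 100 meters from a tower), not measurements from the actual proceeding:

```python
import math

def fspl_db(d_km, f_mhz):
    # Free-space path loss in dB, for distance in km and frequency in MHz
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

tower_eirp_dbm = 62   # assumed macro-cell EIRP
f_mhz = 1550          # LightSquared downlink band, adjacent to GPS L1 (1575.42 MHz)
d_km = 0.1            # receiver 100 m from the tower

rx_dbm = tower_eirp_dbm - fspl_db(d_km, f_mhz)
gps_dbm = -130        # nominal GPS signal strength at the surface

print(f"Terrestrial signal at 100 m: {rx_dbm:.1f} dBm")
print(f"GPS signal:                  {gps_dbm} dBm")
print(f"Difference:                  {rx_dbm - gps_dbm:.0f} dB")
```

Even with generous error bars on the assumptions, the terrestrial signal arrives more than 100 dB stronger than the GPS signal, which is why a nearby high-power transmitter can swamp a GPS receiver's front end.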
LightSquared has tried to paint the objections as a matter of politics and vested business interests. Those interests do exist. So one explanation for LightSquared's almost-certain failure (assuming one believes there still is a real possibility of fixing the interference issue) already can be sketched out.
"Entrenched and vested interests," including the GPS industry and some mobile telecom providers, were able to defeat LightSquared by political and financial assets brought to bear on the spectrum re-authorization process.
Others would note that the aviation industry and U.S. military also objected, though. No FCC commissioner is going to risk "an airliner falling out of the sky," or other risks to passenger safety.
LightSquared needed an FCC waiver because it was trying to use spectrum allocated for low-power space-to-ground transmissions for high-power ground-only transmissions. Interference issues with adjacent low-power satellite applications are well understood, which is why two adjacent satellite bands originally were authorized.
Sunday, February 19, 2012
Big Change Coming for Mobile Payments in 2012, 2013
Enthusiasm about near field communications will be more muted in 2012 and most likely 2013, as mobile payments attention shifts to other ways to enable payments, loyalty and credentials programs and mobile commerce.
In fact, 2012 will see much more attention paid to a range of other ways of handling the communications, credentials storage and commerce applications. The reason is simple enough: NFC simply has not gotten enough marketplace traction, and ecosystem participants are eager to move ahead.
It was inevitable that hype around near field communications would begin to ebb. That happens with all important new technologies. And one might argue the hype around NFC reached a peak in 2011.
Instead, we will likely see growing interest in cloud-based wallet solutions that can be used by current point-of-sale systems, rather than requiring the use of a mobile phone.
PayPal, First Data and Visa are among the “big names” promoting retail solutions that do not require mobile phone involvement, and further integrate with online and possibly other devices such as connected game playing units or even video set-top boxes at some point.
Beyond that, the focus has broadened beyond the payment function, in part because of the time and expense required to create scalable solutions, and in part because the value of mobile payments, in a narrow sense, has yet to prove itself in the U.S. market.
Also, in an attempt to find a winning value proposition that drives massive end user and retailer uptake, most ecosystem participants are looking at any number of broader value propositions with elements of marketing, advertising, location-based couponing and dynamic inventory management, not just “payments.”
Could Fewer Wireless Providers Mean Lower Consumer Prices?
Economic models are all about the assumptions, and that applies to analyses of what should happen as additional spectrum is made available to U.S. wireless providers. Specifically, policymakers looking after the "public welfare" must make choices that could affect the amount of consumer benefit.
The problem, as with virtually everything in the global mobile business or the global fixed network business, is the business terrain between monopoly on one hand and multiplicity on the other. Most policymakers globally have concluded that monopoly is, in fact, a poor way to encourage innovation, efficiency and lower prices.
On the other hand, a simple spreadsheet exercise will be enough to convince anyone that the mobile or fixed network communications business, when conducted in a facilities based way, simply cannot support lots of contestants.
Whatever you might suppose total demand is, when multiple providers start to divide up that demand, markets can become ruinous, meaning no contestant gets enough market share and revenue to sustain itself.
The Phoenix Center for Advanced Legal & Economic Public Policy Studies long has argued that the sustainable number of network-based contestants in either the wireless or fixed network business will be limited to just a few firms, for this reason.
Phoenix Center Chief Economist George Ford now argues that consumers actually would be better off if any future wireless spectrum auctions allow all wireless providers to bid, rather than trying to ensure that spectrum assets are allocated more broadly.
This might seem counter-intuitive. If competition is better than a monopoly, shouldn't broader spectrum awards create more competition, and therefore lead to more innovation and lower retail prices?
That's the argument the Phoenix Center takes on in a new study. There are two key assumptions.
"First, we assume that price falls as the number of competitors increases (e.g., the Hirschman Herfindahl Index or “HHI” falls)," says Ford. "More formally, we assume Cournot Competition in Quantities."
In other words, the Phoenix Center uses the same framework as the Federal Communications Commission and the Department of Justice when it comes to assessing market concentration and the impact of competition on retail prices.
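The HHI itself is a simple statistic: the sum of the squared market shares of every firm in the market. A minimal sketch, using hypothetical market shares (the thresholds cited in the comment are from the 2010 DOJ/FTC Horizontal Merger Guidelines):

```python
def hhi(shares_pct):
    # Herfindahl-Hirschman Index: sum of squared market shares, in percent.
    # Ranges from near 0 (atomistic market) to 10000 (pure monopoly).
    return sum(s ** 2 for s in shares_pct)

four_firms = [40, 30, 20, 10]          # hypothetical, more concentrated
six_firms = [25, 20, 15, 15, 15, 10]   # hypothetical, less concentrated

print(hhi(four_firms))  # 3000: above the 2500 "highly concentrated" threshold
print(hhi(six_firms))   # 1800: "moderately concentrated" under the 2010 guidelines
```

The first assumption in the study is that, all else equal, prices fall as this index falls, which is the standard Cournot result.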
A second key assumption is important, though. The Phoenix Center does not assume that the capacity a firm derives from spectrum is linearly related to the amount of spectrum the firm has.
That is, if we double the amount of spectrum, then the capacity provided to a firm from that additional spectrum more than doubles. That might be a head turner, at first. After all, are we not dealing here with laws of physics?
My apologies to Dr. Ford if I misapply the assumption, but here's how I'd explain it.
Yes, laws of physics do apply. But wireless networks routinely "re-use" spectrum. A single physical allotment can be used repeatedly across a network, with a primary determinant being the coverage size of each cell. Lots of smaller cells can use a single amount of frequency more efficiently than a few big cells.
But cutting the cell radius by 50 percent quadruples the number of required cells. And since each cell represents more capital investment, you see the issue. Spectrum does not linearly relate to effective end user bandwidth. The amount of actual bandwidth a network can provide is related to the amount of spectrum re-use.
"Richer" providers can better afford to create the denser smaller cell networks, so can provide more bandwidth from a fixed amount of spectrum.
Wireless Competition Under Spectrum Exhaust provides the detailed model, but the point is that a smaller number of new spectrum recipients creates more effective end user bandwidth than a larger number of new recipients. That seems counter to reason, and the analysis is important for suggesting the "common sense" understanding is wrong.
The important public policy implication is that rules to "spread the spectrum awards to more providers" have a negative impact on end user pricing. In fact, a more concentrated distribution should lead to increases in supply that more effectively lower prices.
It is not what most might assume is the case. The policy implication is that it is not helpful to restrict the ability of any contestants, especially the dominant contestants, from acquiring more spectrum in new auctions.
One might note that bidding rules in some countries, such as Germany, do in fact limit the amount of spectrum the dominant providers can acquire. Though the Phoenix arguments are about upcoming policy for U.S. spectrum auctions, the same analysis should apply in all markets.