Monday, August 15, 2016

When Will 4K "Better Image Quality" Actually Make a Difference?

If 4K displays actually are in use by as much as 10 percent of the U.S. population by 2021, it is reasonable to ask what the “driver” of adoption will be. “Better image quality” is the standard, but possibly facile, answer.

Consider a 4K display on a smartphone. As any TV engineer will tell you, unless you are very close to the screen, the human eye cannot discern the difference between a picture at HDTV and 4K coding.

But eyes are close to smartphone screens, so we ought to be able to “see” the difference, yes? Maybe not. Some would argue that the human eye cannot tell the difference, much of the time, between HDTV and 4K content, even when viewed up close on a smartphone.

The same problem exists for 4K when used on larger TV displays. One has to sit closer than perhaps seven to eight feet from a 65-inch display to have any chance of perceiving the difference between HDTV and 4K picture quality. Few of us will do so.
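The seven-to-eight-foot figure can be sanity-checked with the common one-arcminute visual acuity rule of thumb. A minimal sketch (my own arithmetic, using that assumed acuity figure, not data from any cited source):

```python
import math

def max_useful_distance_ft(diagonal_in, rows, aspect=(16, 9), acuity_arcmin=1.0):
    """Distance beyond which individual pixels can no longer be resolved,
    assuming an eye that resolves one arcminute (a rule-of-thumb model)."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)      # screen height
    pixel_pitch_in = height_in / rows                   # vertical pixel size
    theta_rad = math.radians(acuity_arcmin / 60.0)      # acuity angle
    return pixel_pitch_in / math.tan(theta_rad) / 12.0  # inches -> feet

# 65-inch display: beyond this distance HD (1080 rows) already looks
# "pixel perfect," so 4K (2160 rows) adds nothing the eye can see.
print(round(max_useful_distance_ft(65, 1080), 1))  # ~8.5 feet
print(round(max_useful_distance_ft(65, 2160), 1))  # ~4.2 feet
```

The result lands right on the seven-to-eight-foot range cited above, and shows that getting full value from 4K on a 65-inch screen means sitting closer than about four feet.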

So if consumers really cannot “see” the difference, where is the value driver? It really is not “picture quality,” since, on small devices and large TV displays alike, viewers will not be able to perceive those picture quality improvements.

There are, of course, other drivers of value. For some people, having a big 4K TV is a status symbol. There, the value is the perceived status, not the quality of the picture.

Historically, one can argue, there always has been a tension between image quality and content richness as drivers of consumer spending on entertainment video.

Consider streaming video services consumed on mobile and other small screen devices: image quality, per se, is not the driver. Content access “anywhere” is the adoption driver, since image quality on a mobile is limited, compared to what is available on a TV screen.

Even in a standard TV screen experience, much streaming content is consumed in standard, rather than high definition format. So it is not “image quality,” in and of itself, that is the adoption driver.

The same might be true for 4K and future 8K TV services. They will be touted as “better” because of image quality, even though, in many deployment scenarios, people cannot actually see the quality differences.

Juniper Research predicts that revenues from subscription video on demand services, such as Netflix and Amazon, are set to more than double from $14.6 billion in 2016, to $34.6 billion in 2021.  
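For context, those two figures imply growth of roughly 19 percent a year. A quick check of the arithmetic (mine, using only the quoted numbers):

```python
# Implied growth behind Juniper's forecast, from the two quoted figures.
start, end, years = 14.6, 34.6, 5  # $B in 2016 -> $B in 2021
cagr = (end / start) ** (1 / years) - 1
print(f"multiple: {end / start:.2f}x, CAGR: {cagr:.1%}")
```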

Netflix already has U.S. subscriber numbers level with leading network providers DirecTV & Comcast (47 million and 47.7 million respectively).

In the end, 4K image quality is not just hype. You can tell the difference, on some content, on some devices, some of the time, if you are close enough to the screen. But few large-screen apps will allow people to “see the difference,” because they will not be sitting close enough.

Occasionally, smartphone users with 4K displays might discern some quality improvement, but “better image quality” will not be consistent. Most people are not going to notice.

So consumer beware, if you are paying significantly more money for 4K.

"Fiber to the Light Pole" Might be the Required Backhaul Network for Millimeter Wave Access Networks

If, as expected, millimeter wave small cells have a transmission radius of about 50 meters (165 feet) to 200 meters (perhaps a tenth of a mile), it is easy to predict that an unusually dense backhaul network will have to be built (by mobile network standards).

In the past, mobile operators have only required backhaul to macrocell towers spaced many miles apart. All that changes with new small cell networks built using millimeter wave spectrum (either for 5G mobile or fixed use, or for ISP fixed access).


Keep in mind that street lights are spaced at distances from 100 feet (30.5 meters) to 400 feet (122 meters) on local roads.


As a rough approximation, think of small cells, in a dense deployment area, placed at roughly every other street light, ranging up to small cells at about every fourth light pole.


That suggests the sort of dense backhaul network that also will be required. You can argue that a new “fiber to the light pole” network must be built. You can argue that a new mesh backhaul network must be built. You can argue that some other leased backhaul (cable TV network) could be feasible.
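Putting the street light spacings and cells-per-pole figures above into numbers gives a feel for the backhaul density per mile of road. A back-of-envelope sketch (my arithmetic, using the spacings in the text):

```python
# How many backhaul-fed small cell sites a mile of road implies,
# given street light spacing and how many poles sit between cells.
FEET_PER_MILE = 5280

def sites_per_mile(pole_spacing_ft, poles_per_cell):
    cell_spacing_ft = pole_spacing_ft * poles_per_cell
    return FEET_PER_MILE / cell_spacing_ft

# Dense case: poles 100 ft apart, a cell at every other pole.
print(round(sites_per_mile(100, 2), 1))   # ~26.4 sites per mile
# Sparse case: poles 400 ft apart, a cell at every fourth pole.
print(round(sites_per_mile(400, 4), 1))   # ~3.3 sites per mile
```

Even the sparse case implies several backhaul drops per mile of road, versus one tower every few miles for a macrocell network.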


In all cases, there are potential business model costs in the backhaul and small cell transmission network that exceed anything engineers have had to design, yet. That is why lots of people now are asking very practical questions about millimeter wave spectrum and its potential impact on access network business models.


People want to know how far signals will reach, how much rain or snow will affect signal levels, how signals will bend or otherwise get around line of sight issues and how backhaul will be provided.


Impact on the business model for existing and new Internet service providers lies at the heart of those questions. And those are important questions.


With some 29 GHz of new spectrum for communications set for release by the Federal Communications Commission (including 7 GHz or more of unlicensed spectrum), and with spectrum sharing set to add additional spectrum in the 3.5-GHz band, there are potentially disruptive changes in network costs, revenues and competition in the works.


What remains unknown is how much propagation distances might change as 28 GHz is adapted for small cell network architectures, instead of point-to-point links. In an earlier period, a reach of 1.5 miles was routine for point-to-point links, and distances up to three to five miles sometimes were possible.


In a new small cell deployment, transmitting at lower power, distances of 1,000 meters (about 0.6 miles) might be possible. Others think reasonable distances will more likely be in the 50 meters to 200 meters range.


Potential bandwidth is among the key differences between the below-6-GHz frequency bands and the millimeter wave bands at 24 GHz and above.


Simply put, compared to 2-GHz (mobile) or 3.5-GHz signals, potential bandwidth is five times to an order of magnitude higher. The trade-off is propagation distance.
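The "five times to an order of magnitude" comparison is just the ratio of carrier frequencies. A quick check (my arithmetic, pairing the bands mentioned in the text):

```python
# Ratio of millimeter wave carrier frequencies to today's mobile bands.
pairs_ghz = [(24.0, 3.5), (28.0, 2.0)]
for high, low in pairs_ghz:
    print(f"{high:.0f} GHz is {high / low:.1f}x the frequency of {low:g} GHz")
# 24 GHz vs 3.5 GHz -> ~6.9x; 28 GHz vs 2 GHz -> 14x
```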


The bandwidth differences are based on frequency itself: Basically, the waves oscillate between their high and low states much more often as frequency increases. And, in principle, every oscillation is the foundation for representing a physical bit.


On the other hand, as frequency increases, the waves start to act more like particles, in a sense, and are affected by physical objects, which stop them, and by oxygen and moisture in the atmosphere, which absorb them.


Those trade-offs mean it is not easy to model the potential business impact of abundant millimeter wave spectrum on business models.




“The beauty of these frequencies is that the new bandwidth they make available is tremendously large,” said Alpaslan Demir, InterDigital principal engineer. “You are talking about 100 Mbps or multiples of 100 Mbps bandwidths, with up to 2 GHz bandwidths, or multiples of 2 GHz, especially at 70 GHz.”


“The definition of capacity should not be limited to bps/Hz but it should involve space as another dimension,” he also argues. “For example, if there are 50 links deployed over one sq km, each with 10 Gbps over 2 GHz channel bandwidth, then the total capacity can be defined as 500 Gbps/sq km.”
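Demir's area-capacity example reduces to simple arithmetic:

```python
# Capacity measured per unit area, rather than per Hz alone,
# using the example figures from the quote above.
links_per_sq_km = 50   # links deployed over one square kilometer
gbps_per_link = 10     # each delivering 10 Gbps over a 2 GHz channel
total_gbps_per_sq_km = links_per_sq_km * gbps_per_link
print(total_gbps_per_sq_km)  # 500, i.e. 500 Gbps per square kilometer
```

The point of the metric is that short-reach cells can reuse the same spectrum many times within a single square kilometer.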


The bottom line for millimeter wave access networks: lots of bandwidth but limited physical reach.

Also, in the U.S. market, 7 GHz of unlicensed spectrum will be released, with obvious impact on the cost of such an access network.

The key business model issue, aside from the new small cell radio network, is the requirement for some sort of "fiber to the light pole" backhaul network. It doesn't have to be optical fiber as the physical medium, but that's the right way to think about the density of a millimeter wave access network's backhaul requirement.


As always, cost and revenue will get a hard look. In fact, the ability of a mobile operator to create a new fixed Internet access business--using the same infrastructure required to support mobility services--is one major reason fixed access likely will be a key feature of 5G mobile networks.


New revenue is required to pay for the new dense small cell networks, and cannibalizing fixed Internet access might be one way of doing so.

Sunday, August 14, 2016

Prediction: Rural ISP Services Will "Never" Earn a Positive and Direct Financial Return...Really....Never

It always has been difficult to offer the same types and levels of communications and other services in rural and isolated locations, anywhere around the globe.

Mobile networks “solved” the problem of “people not being able to make phone calls” because we found a network platform that was less expensive than the fixed networks that were, for many decades, the only option.

Such platform economics explain why most observers believe mobile will be the platform that finally connects “most people” around the world to the Internet.

But we sometimes are reminded that even in developed countries, rural business cases remain very difficult. That, in fact, is why governments generally subsidize communications services in rural areas.

source: techneconomyblog.com

Less often do we think about the fact that even mobile networks have clear divergences in profitability from place to place. And even a mobile operator does not expect to actually recover costs--much less make an actual profit--from those cells that serve customers in very-rural and isolated areas.

As likely is true in most industries, a disproportionate share of firm profits is generated by a relatively small number of customers, with likely losses among some customer groups.

That is one reason why new developments in access technology and platforms are so important: to sustain high-quality services and build new networks, we constantly must grapple with the likelihood that perhaps as much as half of locations actually are “money losers.”

That poses a key sustainability challenge for any operation that actually has to recover its costs and make enough profits to stay in business. In the mobile business, that “half the network doesn’t make much--if any--money” is a reality.

So if you think about it, that is a key problem for service providers and policymakers who actually want everyone to have Internet access. It might literally be true that most such networks, operating in rural areas with low population density, “will never actually make money.”
source: Fitzgerald Analytics

If so, only subsidies of one sort or another are going to enable universal Internet access, no matter how much our platforms improve.

In other words, there might simply not be enough demand in many rural areas to support new high-quality Internet access networks. There might not be enough people, willing to pay enough, to actually earn a return on rural access networks.

That doesn’t mean the networks will not be built. It simply means they will be operated at a loss, with subsidies coming from somewhere else. Government support programs likely will play a role. But profits earned in other parts of the ISP’s business are going to be equally--if not more--important.

That is why the business model for Internet access always is a key focus of the Spectrum Futures conference. Here’s a  fact sheet and Spectrum Futures schedule.



Friday, August 12, 2016

RS Fiber Cooperative Bringing Fiber to the Farm

Cooperatives long have been a way rural communities organize themselves to supply electricity or communications to their members. Now more communities might look at cooperatives to build Internet access infrastructure.
With the caveat that for every good public purpose there are corresponding private interests, and with the further caveat that many are skeptical of situations where government entities compete directly with private entities, there are arguably fewer such sharp economic or political objections if the enterprise takes the form of a cooperative.
To be sure, such capital-intensive endeavors often require seed funding; in this case, local governments able to issue bonds covered half of the approximately $16 million required for the project's first phase. So even cooperatives are not free of the criticism that they represent a form of government-subsidized competition.
Still, in principle, non-governmental free associations of citizens should have the right to create cooperatives for electricity or communications. It is not easy, and many similar efforts have failed.
Even if one generally agrees that governments should not provide services that private entities can supply, if customers are not happy, and want to form cooperatives, it is hard to argue they should not have the right to do so. It is quite difficult, and perhaps rarely can happen in a “pure” sense, with no financial support of any sort from any local unit of government.
But it also is hard to argue that allowing multiple forms of competition is a “bad thing,” where it comes to providing high-quality Internet access in hard-to-serve areas where there actually is not a traditional business case.
In fact, that is the whole rationale behind subsidies for providers of telecom service in rural areas, for example. RS Fiber, notably, is an association of rural cities and towns, not the sort of independent cooperative of individual members that might be more palatable to some.

More competition in a competitive industry is rarely welcomed by incumbents. But one way or another, more competition is coming. We haven’t yet seen the full extent of what might be possible.  

Trend Still Holds: U.S. Telecoms Replace 1/2 Their Revenue Sources Every Decade

Some 12 years ago, U.S. mobile data revenues were less than five percent of overall mobile industry revenues. In the second quarter of 2016, mobile data revenue crossed the 75 percent threshold, according to analyst Chetan Sharma.
That is an important observation for several reasons beyond the obvious importance of mobile data as a revenue source.
As a fundamental analytical principle, I have argued for several decades that service providers must expect to replace half their current revenue about every decade, from new sources.
That has proven true in the U.S. telecom business for several decades, where lead revenue sources have, in fact, been replaced about every decade.
In 1997 about 16 percent of revenues came from mobility services. In 2007, more than 49 percent of end user revenue came from mobility services, according to Federal Communications Commission data.
Likewise, in 1997 more than 47 percent of revenue came from long distance services. In 2007 just 18 percent of end user revenues came from long distance.
So the latest estimate by Chetan Sharma suggests the process still is at work: in turn, long distance, then mobile, and now mobile data have become the key industry revenue drivers.

The question is which new major revenue source will drive the next displacement? One suggestion is that the intense interest in Internet of Things and machine-to-machine communications signals a widespread belief that this is where the big new industry revenue will come from.

Google Suspends Several Google Fiber Builds; Fixed Wireless the Reason?

It is too early for those of us outside Google to figure out what it means that Google Fiber has halted or suspended its plans to build Google Fiber networks in San Jose, Calif., Palo Alto, Calif. and Mountain View, Calif.
The stated reason for the pause is to allow Google Fiber to explore whether fixed wireless is a better option for building those networks. Construction speed and cost as well as lower overall capital investment are potentially major advantages.
Google apparently believes its gigabit-speed service can be delivered using fixed wireless, instead of optical fiber.
Some skeptics might argue that the whole purpose of Google Fiber was to goad other competitors into upgrading to gigabit speeds--something that is happening nationally--and never to become a major ISP in its own right.
Whatever the outcome, it now is becoming clearer that physical media choices are increasing, where it comes to gigabit networks. In addition to fiber to the home, cable operators are doing so with their hybrid fiber coax networks, and it now appears fixed wireless and mobile networks (5G) will be able to do so, as well.
To an astounding degree, that is a major change in platform capabilities and potential business models. Until recently, it was generally assumed that only fiber to the home could support commercial and widespread gigabit speeds.
That will be less the case in the coming years, as 5G networks are deployed--in fixed and mobile variants--and as much as 29 GHz of new communications spectrum, including at least 7 GHz of unlicensed spectrum is released for use in the U.S. market.
Separately, Google has asked the Federal Communications Commission for authorization to conduct radio experiments in the new Citizens Broadband Radio Service  (CBRS) band, at 24 U.S. locations.
That is important for several reasons. First, the CBRS is the first U.S. frequency band to feature shared spectrum access: commercial users and licensed government users will share access to bandwidth.
Second, CBRS will be a major new way for Google--and other ISPs--to provide Internet access services, beyond Google Fiber.
Third, the move suggests the coming important role of fixed wireless in the U.S. ISP business.
Google plans to deploy initially in Atwater, Calif., Mountain View, Calif., Palo Alto, Calif., San Bruno, Calif., San Francisco, San Jose, Calif., Boulder, Colo., Kansas City, Kan., Omaha, Neb., Raleigh, N.C., Provo, Utah, and Reston, Va.
Those locations skew heavily to major urban areas near Google’s headquarters and to sites where Google Fiber already operates, but also include some new smaller-market locations.
The initial test locations also indicate Google wants to test interference issues in areas where licensed users are active (coastal regions are an issue for some licensees).
Google apparently also is looking at locations where it already operates Google Fiber, potentially adding a new access technology option to the current fiber-to-home approach.
Google says “operations vary from 7 km to 40 km from the geographic center point of each test area.” That implies testing of signal propagation and interference at ranges from about four miles to nearly 25 miles.

The test locations are not commercial launch sites, Google says.

AT&T Demos 14 Gbps at 15 GHz

AT&T says it has achieved speeds up to 14 Gbps using millimeter wave radio in what appears to be a point-to-point application, and speeds up to 5 Gbps to two users, in what appears to be point-to-multipoint application.
That test appears to have used 15-GHz frequencies. AT&T says it now will test propagation at 28 GHz.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...