Friday, December 12, 2014

Can European Telcos Return to Growth in 2016?

Service provider revenue in Europe has fallen consistently since 2009. That is not completely unusual. In five of six years since 2009, service provider revenues also have fallen in Japan.

So will the European telecommunications business return to growth by 2016? Analysts at IDATE say it will, although the apparent threat of recession in Europe might make that a daunting prospect.

The impact of the Great Recession beginning in 2008 is easy enough to describe. According to TeleGeography Research, revenue growth slipped from about seven percent annually to one percent in 2009, returning to about three percent globally in 2011.

So a new recession in Europe could well wipe out the revenue gains, expected to be in the low single digits, projected for European markets between 2015 and 2016.

Earlier in 2014, economists expected European Union economic growth of between 1.5 percent and 1.75 percent. But growth rates have dipped lower for most of 2014, and might go negative in the fourth quarter.

Telecommunications service revenues were four percent lower in 2013 than in 2012, but the revenue decline should slow to perhaps -1.8 percent for 2014, with 2015 a transition year, IDATE expects. In 2016, European Union telecom revenue growth is projected to reach one percent for the year.

But a new recession could dash those hopes.   

In the EU28, mobile average revenue per user will have lost some 25 percent of its value between 2008 and the end of 2014. “Hence, in spite of the fact that mobile penetration stood at 129 percent of population in Europe in 2013 and will keep growing, operators earn less revenue year after year,” IDATE says.

Fixed high speed access average revenue per user was EUR 24.9 per month in 2008, declining steadily to EUR 22.3 per month in 2014. Fixed network voice revenue has been falling for about 14 years.

The IDATE study, sponsored by ETNO, the organization representing European telecom service providers, notes that 2014 revenue will fall about four percent, following a 2.9 percent decline in 2013.

Even mobile Internet access revenue, a growth category in most markets, dipped 1.8 percent in 2014, though that was a better performance than the 4.5 percent decline in 2013.

Mobile remains the dominant form of Internet access, with the number of subscriptions approaching the 800 million mark in Europe. Mobile Internet access subscribers grew 0.9 percent in 2013 and likely will grow slightly by the end of 2014 as well.

Fixed broadband subscriptions grew from 157.7 million in 2012 to 163.8 million in 2013 and are expected to stand at 170 million by the end of 2014.

According to IDATE, at the end of 2014, for the first time, fixed broadband subscriptions will outnumber traditional circuit-switched fixed lines.

The big takeaway, though, is that the IDATE forecast will bump up against a possible 2015 recession that could depress revenues. All that is required is for a recession to shave one percent to two percent off service provider revenue growth rates.
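
To see how little it takes, consider a simple compounding sketch. The growth rates below are loose illustrations drawn from the IDATE figures cited above (a 1.8 percent decline in 2014, a flat 2015, one percent growth in 2016), not the forecast itself, and the 1.5-point recession haircut is an assumption.

```python
# Sketch: EU telecom revenue trajectory under IDATE-style growth rates,
# with and without a recession shaving 1.5 points off each year's growth.
# All rates are illustrative assumptions, not IDATE's own numbers.

def project(revenue, growth_rates):
    """Compound a starting revenue index through a list of annual growth rates."""
    path = [revenue]
    for g in growth_rates:
        revenue *= (1 + g)
        path.append(round(revenue, 1))
    return path

base_2013 = 100.0                          # index 2013 revenue to 100
baseline = [-0.018, 0.0, 0.01]             # 2014, 2015 (transition), 2016 (+1%)
recession = [g - 0.015 for g in baseline]  # shave 1.5 points off every year

print(project(base_2013, baseline))    # growth returns in 2016
print(project(base_2013, recession))   # revenue keeps falling through 2016
```

Even that modest haircut turns the projected 2016 recovery into a third straight year of decline.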

Thursday, December 11, 2014

FCC Boosts Connect America Fund Minimum Internet Access Speed to 10 Mbps

The U.S. Federal Communications Commission has voted to change the definition of “broadband” speeds required for receipt of Connect America funding from 4 Mbps downstream and 1 Mbps upstream to a minimum of 10 Mbps downstream and 1 Mbps upstream.

The change boosts minimum downstream speeds that originally were set in 2011. The Commission’s thinking is that the new standard better reflects minimum speeds received by
99 percent of U.S. residents living in urban areas.

Most locations can buy service at far higher speeds, however. In fact, 86 percent of U.S. households could buy service at speeds of 25 Mbps to 50 Mbps at the end of 2013.

The new rules affect eligibility for CAF monies of about $1.8 billion a year, for the purpose of providing high speed access to five million rural residents.

Is Linear Video Business Finally at a "Crossroads?"

Is the U.S. multi-channel subscription video business finally “at a crossroads?” A new survey of video consumers suggests the rate of change might be accelerating.

On the other hand, the survey might also suggest the big immediate danger exists across the board, in every age category between 18 and 59.

In fact, respondents 18 to 34 appear to buy linear subscription video at higher rates than consumers 35 to 59, a finding that might suggest demand has weakened across the entire age range between 18 and 59.

Some might argue the greatest danger can be found among younger viewers 18 to 34, as the one-year drop in buying was highest for those respondents, while buying was consistent for age groups 35 to 59: either flat or slightly higher.

Cable TV (or telco TV or satellite TV) subscriptions among viewers 18 to 24, and those 25 to 34,  dropped six percent from 2013 to 2014, according to PwC.

But consider that subscription rates for consumers in older demographics were even lower than among those 18 to 34.

Respondents 18 to 24 had the highest purchase rate, at 71 percent of that age cohort. Among those 25 to 34, the take rate was just 67 percent.

Among respondents 35 to 49, an even lower 64 percent of respondents say they buy linear video subscription services. Among users 50 to 59, the take rate was 67 percent.

The cable subscription figures are steadier for older demographics, though.

Looking ahead, 91 percent of viewers said they see themselves subscribing to cable in one year.  
Some 61 percent of respondents said they would likely subscribe in five years, and only 42 percent thought they would be buying in a decade.

So far, the survey found, use of on-demand, over the top streaming services remains largely ancillary to purchasing of linear video services.

Some 65 percent of linear video subscribers 18 to 24 used Netflix in 2014. About 71 percent of linear TV subscribers 25 to 34 had Netflix in 2014, up from 51 percent in 2013.

Of respondents 35 to 49, 66 percent of linear video buyers also bought Netflix. Of those 50 to 59, 58 percent bought Netflix, the PwC study reports.

The PwC survey, if it accurately reflects the actual population of consumers as a whole, might be interpreted as suggesting demand for linear video is actually a bit stronger among consumers 18 to 34 than among consumers 35 and older.

What is not so clear is “what” is at risk. The common perception is that it is the “linear” format or “bundling” that is endangered. That might not be the case for all programming formats.

Sports is the best example: it is the programming most valuable in linear delivery, and the least susceptible to disruption by non-real-time alternatives. Movies and TV series clearly are the most at risk from on-demand delivery.

But bundling might not actually be endangered quite so much. Some light users will prefer full a la carte access. But most multi-user households and sports fans likely will find that a full a la carte approach actually costs more than what they currently pay for their video subscriptions.
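
A toy comparison makes the point. Every price here is a hypothetical assumption for illustration, not an actual channel or bundle price.

```python
# Illustrative comparison of a linear bundle versus full a la carte pricing.
# All prices below are hypothetical assumptions; sports carries a high price
# because sports networks carry high programming rights costs.

bundle_price = 80.0   # assumed monthly price of a linear video bundle

a_la_carte = {"sports": 30.0, "news": 8.0, "movies": 12.0,
              "kids": 8.0, "general_1": 10.0, "general_2": 10.0,
              "lifestyle": 8.0}

def monthly_cost(channels):
    """Total monthly a la carte cost for a set of channels."""
    return sum(a_la_carte[c] for c in channels)

light_user = monthly_cost(["news", "movies"])   # a light user picks two channels
family = monthly_cost(a_la_carte)               # a household wants everything

print(light_user < bundle_price)    # light users come out ahead unbundled
print(family > bundle_price)        # heavy households may pay more a la carte
```

Under these assumed prices, the two-channel household saves heavily while the household that wants everything, including sports, pays more than the bundle.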

As always is the case, price and value will matter. Much depends on how content owners decide to allow retail sales of their content, and what steps distributors might take in response.

Should content owners conclude they prefer to fully unbundle access, without requiring prior purchase of a linear video subscription, major instability could erupt in the video ecosystem.

On the other hand, if content owners stick with the current system, where a consumer generally must first buy a linear subscription to get on-demand access to some or all of the content available from any single programming channel, there would be less channel conflict, but likely also less development of a new distribution system and consumer choice.

Nor is it clear that retail prices would be lower, in all cases, or profit margins higher, in all cases, for content owners. Today, programming networks other than the TV broadcast channels rely on a wholesale role.

Few channels sell directly to end users, with the exception of local broadcast stations, as a class of suppliers.

A shift to a retail role would entail lots of new cost, especially for marketing and promotion. The new business models would be highly dependent on how those new retail selling costs changed gross revenue, profit margins and operating cost for the content owners.

Nor is it completely possible to predict the reaction of their current distributors (cable, satellite and telco TV providers). Faced with a major move to over the top access on the part of the programmers, the distributors would react.  

The point is that, even with the whole industry in a process of incremental movement, one might argue the crossroads actually remains in the future.

Not Even Apple Pay Will Quickly Solve "Mobile Payments" Problems

Some of us have argued that it could well take a decade or more before mobile payments become a routine part of the consumer experience when paying for merchandise at a retail location. And even then, “routine” use might be a reality for only about half of all consumers.


After 20 years, the percentage of U.S. households using automatic bill paying is still only about 50 percent. Likewise, after 20 years, use of debit cards by U.S. households is only about 50 percent.


It took about a decade for use of automated teller machines to reach usage by about half of U.S. households.


That history is the reason it is reasonable to predict that mobile payments will not be a mass market reality for some time.


Some might argue the problem is that big companies cannot innovate. That might indeed be a problem, but it is only a small part of the adoption process.


The bigger problem is that major changes in end user behavior have to happen, and before that can become a reality, it often is necessary to spend quite significant sums to create the infrastructure enabling the behavior change.


In the case of mobile payments, that involves creating a critical mass of devices, payment apps and processes, merchant terminals and retail brands. Beyond that, the developing market would have to come to a practical consensus about standards, interfaces and methods.


Also, with huge amounts of revenue at stake, it will take some time to sort through rival business interests and approaches that pit credit card issuers against retailers, for example.


All of that ensures a lengthy period of confusion before scale is possible. And until scale is possible, progress will be limited.

In consumer financial services, decades can pass before a significant percentage of consumers use an innovation. In fact, a decade to reach just 10 percent or 20 percent adoption is not unusual in that space.
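
The shape of that slow diffusion can be sketched with a logistic curve. The 50 percent ceiling echoes the debit card and automatic bill-pay plateaus noted above, while the midpoint and growth-rate parameters are purely illustrative assumptions, not values fitted to any payments data.

```python
import math

# Logistic diffusion sketch: adoption that saturates at 50 percent of
# households. The midpoint (year 12) and rate (0.35) are illustrative
# assumptions chosen only to show the shape of slow consumer adoption.

def adoption(year, ceiling=0.5, midpoint=12.0, rate=0.35):
    """Fraction of households that have adopted, `year` years after launch."""
    return ceiling / (1 + math.exp(-rate * (year - midpoint)))

for y in (5, 10, 15, 20):
    print(y, round(adoption(y), 3))   # slow start, decades to near-saturation
```

With these assumed parameters, adoption sits in the 10-percent-to-20-percent range a decade in, and only approaches the 50 percent ceiling after two decades, which is roughly the pattern ATMs, debit cards and automatic bill paying followed.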


Gigabit Access Improves User Experience, But Only So Much

The growing embrace of faster Internet access in the U.S. market, and the growing supply of faster connections, despite all criticisms, has many drivers--on both the demand and supply side. But it is changes on the supply side that arguably are most important.

In the past, Internet service providers rightly have suggested there was not much demand for faster access (50 Mbps, 100 Mbps, 300 Mbps or gigabit Internet access), but that was largely because of retail prices.

It would have been more accurate to say there was, in the past, not much demand for Internet access costing $100 to $300 a month, when “good enough” services could be purchased for $40 to $60 a month.

All that is changing. When Google launched Google Fiber, offering a symmetrical gigabit access service for $70 a month, the key innovation might arguably have been the price.

True, Google Fiber offers very fast access (two orders of magnitude faster than typical offers, and an order of magnitude faster than the 300-Mbps services that had been available in some communities).

But the key innovation arguably was the price. A gigabit for $70 a month is not such a leap from the $50 a month level of “standard” Internet access.

Also, $70 a month is a huge change from the $300-a-month price of prior 300 Mbps offers, or $350 monthly prices for a few gigabit services that had been available.

So major changes on the supply side--dramatically lower prices and dramatically faster speeds--are spurring demand.  

In part, that is because the underlying technology is getting better.

“I joined AT&T in 2008 and I remember around 2012 looking at some charts and the cost of speed hadn’t really had a breakthrough, because 80 percent of your deployment in broadband is labor based,” said John Donovan, AT&T senior executive vice president for architecture, technology and operations.

“And then all of a sudden you have vectoring in small form factor stuff and all of a sudden a little bit of an investment by our supply chain a few standard things and we start to take a 25 meg on a copper pair and then we move it to 45 and then 75 and then 100 which is on the drawing board,” said Donovan.

The point is that the underlying technology used by cable TV operators and telcos has been continually improved, providing better performance at prices useful for commercial deployment.

Operating practices also are becoming more efficient. Google Fiber has been able to work with local governments to streamline permitting processes and other make-ready work in ways that can lower costs to activate a new Internet access network using fixed media.

Google Fiber also pioneered a new way of building networks, getting users to indicate interest before construction starts, and building neighborhood by neighborhood, instead of everywhere in a local area.

That changes gigabit network economics. As has been true for nearly a couple of decades in the U.S. market, for example, competitive suppliers have been able to “cherry pick” operations, building only enough network to reach willing customers, without the need to invest capital in networks and elements that “reach everyone.”

That makes a big difference in business models. A network upgrade that might not have made sense if applied across a whole metro network might well make sense in some parts of a city, where there is demand.

Also, every new supplier of Internet access goes through a learning curve, generally operating inefficiently at first, but improving as experience is accumulated.

“And then we are getting better at the deployment side of the business as well,” said Donovan. “So our average technicians and our best technicians are converging.”

But there is an important related issue. Customers who get gigabit service often cannot perceive a difference in experience as great as they might have expected.

“When we do the installs, we often have to stay and show them on a speed test [that it] is getting a gig,” said Donovan. That illustrates a problem we are going to see more often: gigabit access can only improve end user experience so much.

Access speed only improves experience so much because it is only part of the combined ecosystem, and only can affect a part of user experience. Remove the local access bottleneck on one end and all the other elements become visible.


Wednesday, December 10, 2014

BT Acquisition of O2 Seen as Imminent

BT is expected to buy O2, getting back into the mobile business in a big way, “before the end of 2014,” U.K. financial site “This is Money” reports. That would give BT 24 million mobile customers, create the ability to provide a quadruple play bundle, and likely trigger a reshuffling of assets in the U.K. communications market as well.

“Virtually all operators today believe that their future lies in a quad play, a supposedly new buying behavior expected to consolidate in future: In the operators’ view, customers will buy their fixed telephony, broadband, mobile, and TV services all from the same company and stay with the same provider for longer,” says Dario Talmesio, Ovum analyst.

“Mobile businesses are looking to move into broadband because their revenues are in decline,” many would note. While revenues for the broadband industry have grown by four percent to five percent a year over the last two years, mobile revenue continues to decline by three percent a year, according to media consultancy Enders Analysis.

Still, a successful acquisition of O2 by BT would put BT into contention near the top of the U.K. mobile market.

EE, the joint venture between Orange and Deutsche Telekom, has 33 percent share of mobile revenues in the U.K. market.

O2 and Vodafone each have 26 percent market share, measured by reported service revenue, in the United Kingdom.

Hutchison Whampoa’s Three has 12 percent market share and Virgin Mobile has three percent market share.

Growing interest in quadruple play retail offers is driven, fundamentally, by the contraction of mobile revenue, the shrinking of fixed network voice revenue and the degree of competition in the U.K. market.

Simply stated, contestants look to sell more products to a perhaps-smaller number of customers, boosting gross revenue and slicing customer churn by selling four products instead of one or two.

It isn’t rocket science.

Tuesday, December 9, 2014

Despite Gigabit Hype, Internet Access Speed is Not the U.K. or U.S. Experience Bottleneck

Bragging rights and therefore marketing advantages are believed to accrue to Internet service providers with the highest perceived Internet access speeds.

Some--if not most--of that marketing hype apparently is misplaced, a new study by Ofcom, the U.K. communications regulator, might suggest.

The study found that “access speed” matters substantially at downstream speeds of 5 Mbps and lower. In other words, “speed matters” for user experience when overall access speed is low.

For downstream speeds of 5 Mbps to 10 Mbps, the downstream speed matters somewhat.

But at 10 Mbps or faster speeds, the actual downstream speed has negligible to no impact on
end user experience. Since the average downstream speed in the United Kingdom now is about 23 Mbps, higher speeds--whatever the perceived marketing advantages--have scant impact on end user application experience. Some 85 percent of U.K. fixed network Internet access customers have service at 10 Mbps or faster.
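
A simple load-time model shows why. Assume, purely for illustration, a two-megabyte web page and 0.8 seconds of fixed overhead for latency, server response and rendering; the transfer component then shrinks toward irrelevance as access speed climbs.

```python
# Why speed stops mattering: model page load time as fixed overhead
# (latency, server response, rendering) plus transfer time.
# The 2 MB payload and 0.8 s overhead are illustrative assumptions.

PAGE_MB = 2.0        # assumed page payload, megabytes
OVERHEAD_S = 0.8     # assumed fixed latency/server/render time, seconds

def load_time(mbps):
    """Seconds to load the page at a given downstream speed (megabits/sec)."""
    transfer = (PAGE_MB * 8) / mbps   # megabits divided by megabits per second
    return OVERHEAD_S + transfer

for speed in (2, 5, 10, 50, 1000):
    print(speed, "Mbps ->", round(load_time(speed), 2), "s")
```

Under these assumptions, moving from 2 Mbps to 10 Mbps saves several seconds per page, while moving from 10 Mbps all the way to a gigabit saves well under two seconds, because fixed overhead, not transfer time, dominates.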

Investing too much in high speed access is as much a business issue as investing too little. The important insight is that it is perception that now matters most in the United Kingdom and United States, not the actual threshold required to provide reasonable end user experience of Internet applications such as web browsing and streaming video.

Average access speeds in the United States are 10 Mbps according to Akamai, but 32 Mbps according to Ookla, a difference largely attributable to measurement method. Another study shows that average Internet access speeds in the United Kingdom and United States are, in fact, equivalent.

The quality of the upstream path and in-home network have some impact, at all speed ranges, but at a dramatically lower level as speeds climb above 10 Mbps.  

One finding was surprising. The Ofcom tests of end-to-end user experience suggest that web browsing is significantly affected by upstream and downstream access speeds, the home network and the Internet service provider’s network interconnection policies.

Both the upstream and downstream speeds affect user experience of streaming video, while voice experience is, relatively speaking, barely affected.

Those are important findings. The quality of the broadband experience is not solely dependent on access speed. In-home wiring (including Wi-Fi performance) and peering arrangements between Internet service providers can also be important.

“Indeed, for connections with a download speed greater than 10 Mbps, access speed appears to become less significant than these other factors,” Ofcom says.

At connection speeds above the range of 5 Mbps to 10 Mbps, though, the relationship breaks down and broadband connection speed is no longer an important determinant of performance, Ofcom says.

The important observation is that elements of the end-to-end value chain--other than access speed--now are becoming greater bottlenecks.

AT&T and Verizon both Expect Higher Mobile Churn in 4th Quarter 2014

AT&T now says it expects to report higher churn rates in the fourth quarter of 2014, the result of a marketing war that now has ensnared even Verizon Communications, a firm that had said it would remain above the fray.


Verizon Communications, for its part, announced “strong momentum for wireless customer growth in the fourth-quarter 2014” while warning that promotional offers and strong customer volumes will put short-term pressure on its wireless segment EBITDA margins and earnings per share, as detailed in the December 8, 2014 post.

“Total retail postpaid disconnects are trending higher both sequentially and year over year in this highly competitive and promotion-filled fourth quarter,” Verizon said.

One day later, AT&T says its churn also will be higher in the fourth quarter.

Monday, December 8, 2014

Verizon Expects Strong LTE Upgrades, Strong Net New Subscriber Growth, and a Hit to Earnings and Profit Margins

When is strong net new mobile subscriber growth, high take rates for Long Term Evolution fourth generation service, and high conversion rates of the best existing customers to 4G service a problem?


When a highly-competitive environment means there is lots of promotional activity, even by a firm that eschews promotions.


So it is that Verizon Communications announced “strong momentum for wireless customer growth in the fourth-quarter 2014” and “very strong customer demand” for 4G smartphones and tablets on its “More Everything” shared data plans, even as it warned that it expects pressure on earnings and profit margins.


“The fourth-quarter impacts of its promotional offers, together with the strong customer volumes this quarter, will put short-term pressure on its wireless segment EBITDA and EBITDA service margin (non-GAAP, based on earnings before interest, taxes, depreciation and amortization) as well as its consolidated EBITDA margin (non-GAAP) and earnings per share,” Verizon said.


Such is the competitive environment that Verizon expects that financial pressure even as it expects higher retail postpaid gross additions, both sequentially and year over year.


About 75 percent of Verizon’s quarterly upgrades were “strategic or high-quality,” meaning they represented upgrades from a basic phone, a 3G smartphone or represented an upgrade by a high-value customer.


The percentage of customers choosing the “Verizon Edge” equipment-installment plan option in the fourth quarter of 2014 currently is 24 percent, double the rate of third-quarter 2014, when 12 percent of total phone activations used the equipment installment plan.

“Total retail postpaid disconnects are trending higher both sequentially and year over year in this highly competitive and promotion-filled fourth quarter,” Verizon said.

40% of Enterprises Will Go "Wi-Fi First" by 2018

Ethernet cabling remains the mainstay for enterprise data connections, but Wi-Fi is becoming a “first choice” of employees, for a number of reasons.

By 2018, 40 percent of enterprises will specify Wi-Fi as the default connection for non-mobile devices such as desktops, desk phones, projectors and conference room equipment, Gartner analysts now predict.

User reliance on mobility is key. In the emerging economies, users are adopting smartphones as their exclusive mobile devices while in developed economies, multi-device households are becoming the norm, with tablets growing at the fastest rate of any computing device, Gartner says.

Gartner predicts that, by 2018, more than 50 percent of users will go to a tablet or smartphone first for all online activities.

“The use pattern that has emerged for nearly all consumers, based on device accessibility, is the smartphone first as a device that is carried when mobile, followed by the tablet that is used for longer sessions, with the PC increasingly reserved for more-complex tasks,” said Van Baker, research vice president.

Given that consumer shift to untethered and mobile devices, Wi-Fi makes more sense.

“Ethernet cabling has been the mainstay of business workspace connectivity since the beginning of networking. However, as smartphones, laptops, tablets and other consumer devices have multiplied, the consumer space has largely converted to a wireless-first world,” said Ken Dulaney, vice president and distinguished analyst at Gartner. “We expect many organizations to shift to a wireless-by-default and a wired-by-exception model.”

Globally, the “mobile first” trend will be fueled by the ability to buy a smartphone for less than US$100, by about 2020.

By 2018, 78 percent of global smartphone sales will come from developing economies, as well.

By 2018, Gartner expects the average selling price of a “basic phone” to be $78, while a simpler “utility phone” will cost $25.

Some low-cost smartphones are expected to reach approximately $35 (unsubsidized) by the end of  2014, compared with the $50 entry-level smartphones seen in 2013.

Friday, December 5, 2014

Implications of "Pervasive" High Speed Access

Though some might focus on findings related to typical high speed access speeds, use of smartphones, cost per delivered megabyte or investment in next generation networks, some might say the key strategic point raised in a new study of G7 high speed access is the movement to “pervasive” access.

And that point is that high speed access evolves over time to a stage where “most end-user connections are wireless, at speeds produced only by wired systems in earlier stages,” the study argues.

Note the prediction: untethered access speeds eventually approach wired network speeds. That has potential implications for the ability to substitute mobile or untethered access for fixed access, as well as for the strategic value of all fixed networks.

Obviously, the speed match will be closer for optical-to-Wi-Fi connections than for optical-to-mobile connections, partly because of distance effects, partly because of spectrum constraints and partly for reasons of network architecture.

Basically, distance and bandwidth are inversely related, so short Wi-Fi links will supply faster access than mobile macrocells spanning distances of miles. Additionally, local Wi-Fi has access to more spectrum than any single mobile service provider.

Also, mobile networks reuse spectrum in ways that mean all the available spectrum cannot be used at any single location. Wi-Fi networks can use all available spectrum, at every location, subject to interference issues.
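
A back-of-envelope comparison illustrates the asymmetry. The channel bandwidths, spectral efficiencies and simultaneous-user counts below are illustrative assumptions only, not measurements of any particular network.

```python
# Back-of-envelope capacity per user: a short-range Wi-Fi access point versus
# a macrocell sector shared by many active users. All parameter values are
# illustrative assumptions chosen to show the order-of-magnitude gap.

def per_user_mbps(bandwidth_mhz, bits_per_hz, active_users):
    """Shared throughput (Mbps) split evenly among simultaneously active users."""
    return bandwidth_mhz * bits_per_hz / active_users

# Wi-Fi: an assumed 80 MHz channel, good spectral efficiency at short range,
# and only a handful of simultaneously active users per access point.
wifi = per_user_mbps(80, 4.0, 4)

# Macrocell: an assumed 20 MHz LTE carrier (reuse means a sector cannot use
# all the operator's spectrum at once), lower efficiency at range, many users.
macro = per_user_mbps(20, 1.5, 30)

print(round(wifi), "Mbps per Wi-Fi user vs", round(macro), "Mbps per macrocell user")
```

With these assumed numbers the short Wi-Fi link delivers nearly two orders of magnitude more capacity per active user, which is the intuition behind speeds rising as access moves closer to the network edge.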

The shift to “pervasive” networking also is significant because it points to the future evolution of the high speed access business: from fixed to mobile and untethered, with a key role played by the fixed infrastructure as a way of extending core network access close to network edges, allowing a high degree of untethered access.

Prior high speed access networks featured a high-performance wide area network optical core, a regional distribution network and then a mid-speed copper access network extending core network transport for distances of perhaps 3.5 miles, in suburban areas.

Increasingly, the optical network core extends deep into the metro-area distribution network, and in the case of optical fiber access networks, to a neighborhood or single location, with copper or radio distribution on a local basis.

That is the case for “fiber to neighborhood” networks that use optical media to an area of a score of homes, or perhaps several hundred, with copper media for a kilometer to perhaps a mile, and then local distribution typically using Wi-Fi within a location.

Mobile networks have been built with optical cores connecting to microwave, fiber or copper distribution, and then radio access for towers reaching a few to several miles. More heterogeneous networks now are appearing in dense urban areas, in some cases using small localized cells that cover small areas.

Fiber to home networks extend optical media to actual end user locations, with local distribution typically using Wi-Fi rather than the older Ethernet cable interfaces.  

The study argues that high speed access develops in three distinct phases. At the “basic” stage, wired telephone, cellular telephone, and cable TV networks are coupled with broadband electronics to provide a basic level of connectivity 10 to 100 times greater than voice networks.

At an “advanced” stage, after more optical fiber deployment, better modulation techniques, more sensitive radios, better signal compression and signal processing, as well as additional spectrum allocation for untethered and mobile use, speed improves another 10 to 100 times.

At the “pervasive” stage, most user connections are mobile or untethered, and access speeds more closely approach fixed network speeds.

Beyond the matter of access speed for untethered and mobile devices, the “pervasive” stage also points to expected changes in “fifth generation” mobile networks and application development. When access is pervasive, mobile devices increasingly will represent the way people use the Internet and applications.

And that suggests app development increasingly will revolve around "mobile" interfaces, form factors and input-output methods. Also, as increasingly is the case, apps will shift in the direction of location-specific, activity-aware and sensor-assisted app features.

AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...