Saturday, October 6, 2018

Service Providers Embrace Open Source

The global telecom industry has come a long way from the days of proprietary platforms, solutions and network elements. Consider the heavy service provider membership in the Telecom Infra Project, an effort to develop open source platforms for communications and mobile networks.


A survey of 150 mostly-technical professionals from 100 companies finds 73 percent are “extremely” or “mostly” confident that open networking solutions can achieve the same level of performance as traditional networking solutions.

Some 59 percent of respondents say they currently are using open networking solutions, and some 84 percent of those that are not plan to do so within the next three years.

Still, technology immaturity is the biggest concern for 46 percent of respondents. The next closest concern, at 23 percent, was performance itself.

As you would expect, cost savings are the driver. Some 75 percent of respondents say cost savings are the expected outcome of deploying open networking solutions.

The survey of 150 networking professionals from 100 communications service providers globally included 48 percent who work for converged service providers owning both fixed and mobile networks.

Mobile service providers were 25 percent of the sample, while wireline, cable and satellite operators made up 22 percent. Some 57 percent are in North America, 17 percent in Europe and 14 percent in Asia.

Respondents from Central/South America comprised eight percent of the sample, and those from Middle East/Africa represented four percent.

Survey respondents worked in technical roles. Nearly 25 percent work in engineering and 20 percent say they work in network design and planning. Some 19 percent work in network operations and 11 percent in research and development.

Friday, October 5, 2018

Where Could Blockchain Add Value in Communications or Media?

Will disintermediation be one of the ways blockchain ultimately has value in the “technology, media and telecom” (TMT) industry? Possibly. Disintermediation is the process of removing distributors from any supply chain. Think “over the top” and you get the concept. So anything that promises disintermediation could have big consequences in the TMT space.

In the case of blockchain, that disintermediation could be a positive, not a negative, for content owners or distributors, though. Think about the problem of authenticating users and subscribers: participants in any social media transaction or in any highly distributed access services environment.

Consider the case of a mobile services provider that amalgamates access to multiple networks, including assets secured from two or more other underlying service providers. Think of Google Fi, which uses Wi-Fi, Sprint and T-Mobile US networks. In some future scenario, perhaps blockchain is used to authenticate users for access to each of the participating networks.

To be sure, there are other ways of doing so. The issue is whether blockchain might eventually be easier or cheaper, perhaps for cross-border (international roaming) transactions, for example. International settlements often are cited as a blockchain value, in terms of taking cost out of such transactions.

The idea is that blockchain could have value whenever databases must be kept or transactions completed. Communications and content arguably have lots of places where those two things happen.

Blockchain is a technology of more than average potential usefulness in the “technology, media and telecom” industry (or industries; it is hard to say which is the more-apt description), according to consultants at McKinsey. In fact, in most industries, blockchain might have both low feasibility and relatively-modest impact, the consultants say.

Essentially, blockchain offers the hope of “perfect audit history,” without fraud. That obviously has implications for the financial industry, or any situation where “trust” is essential. And since “money” is always based on trust, that matters.

But trust has become a bigger issue for social media and advertising as well, which is likely why blockchain could have relevance in the TMT space. Though blockchain is not foolproof, it arguably is hardier than most other ways of using databases, as successful fraud generally requires broad collusion (more than half of all connected computers would have to be in on the attempt, McKinsey essentially argues).

Nor can blockchain check on the integrity of data that is input into the database. “All that the blockchain itself does is ensure the integrity of the individuals making the transaction, ensuring that you have the right combination of a public and private key,” McKinsey analysts note.
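The "perfect audit history" property comes from chaining records together by hash: every block commits to the hash of the one before it, so altering any past entry breaks every later link. A minimal sketch in Python illustrates the idea (illustrative only; real blockchains add digital signatures, distribution across nodes and a consensus mechanism):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's full contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Append a new block linked to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def verify(chain):
    """Return True only if no block has been altered since it was written."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, "Alice pays Bob 10")
append_block(chain, "Bob pays Carol 5")
print(verify(chain))          # True: the audit history is intact

chain[0]["data"] = "Alice pays Bob 1000"   # tamper with an old record
print(verify(chain))          # False: every later link now fails
```

Note that, exactly as McKinsey observes, the check only proves the records have not been altered after the fact; it says nothing about whether the data was true when it was entered.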




Blockchain: One View of What it Is, and Is Not

Blockchain is one of those concepts one hears about all the time (as with artificial intelligence and machine learning). It is likely destined to be important in the communications industry, but in ways that are not always intuitive, or necessarily visible to most practitioners.

It is rather akin to "electricity," "computing," "cloud computing" or "open source" in that sense.




Thursday, October 4, 2018

AT&T Builds 5G on 4G

It often is said that 5G builds on 4G, and that is correct. Consider AT&T, which is boosting 4G speeds as it launches 5G markets. AT&T plans to bring mobile 5G to 12 cities in 2018, reaching at least 19 cities in early 2019.

AT&T also has announced 99 new 5G Evolution markets, bringing the total number of markets with the technology to 239. 5G Evolution markets are locations where peak theoretical wireless speeds for capable devices are at least 400 megabits per second.

AT&T says 5G Evolution will be available in over 400 markets by the end of 2018. In the first half of 2019 AT&T plans to offer nationwide coverage, making 5G Evolution available to over 200 million people.

The other technology AT&T is deploying is LTE-LAA, which boosts peak theoretical wireless speed for capable devices to a gigabit per second. LTE-LAA is now live in parts of 20 cities with plans to reach at least 24 cities in 2018.

In terms of devices, AT&T offers 13 devices capable of accessing both 5G Evolution and LTE-LAA network technologies. The devices include the LG V30 and LG V35 ThinQ, Motorola Z2 Force Edition, Netgear Nighthawk Mobile Router, Samsung Galaxy S8 and Galaxy S9 series devices, and others.


Saturday, September 29, 2018

Could Edge Computing Change Smartphone Design and Cost?

Edge computing is almost always touted as a necessity in the 5G era to support ultra-low-latency services. The typical examples are autonomous vehicles, remote surgery or even prosaic requirements such as supporting channel changes on video screens for ultra-high-definition TV (4K, 8K, virtual reality).
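One reason proximity matters for ultra-low latency is simple propagation delay. A rough back-of-the-envelope calculation, assuming signals in optical fiber travel at about two thirds the speed of light (roughly 200 km per millisecond, one way), shows why metro edge sites can serve latency budgets that distant regional data centers cannot:

```python
# Illustrative propagation-delay budget for edge computing.
# Assumption: light in fiber covers roughly 200 km per millisecond, one way
# (about two thirds the speed of light in a vacuum).
FIBER_KM_PER_MS = 200.0

def max_server_distance_km(round_trip_budget_ms):
    """Farthest a server can sit if propagation alone must fit the budget."""
    one_way_ms = round_trip_budget_ms / 2
    return one_way_ms * FIBER_KM_PER_MS

# A 1 ms round-trip budget keeps the server within ~100 km (a metro edge site);
# a 10 ms budget allows ~1,000 km (a regional cloud data center).
print(max_server_distance_km(1.0))   # 100.0
print(max_server_distance_km(10.0))  # 1000.0
```

In practice the budget is tighter still, since queuing, switching and processing delays consume part of it before propagation is even counted.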

But are there other possibilities? Consider the advent of the Chromebook, a “PC” that essentially conducts all computing activities at a remote cloud data center. The advantage is lower-cost customer premises equipment (CPE).

Sure, one needs a screen, power supply, keyboard and some amount of on-board memory and processing. But not so much. It often is said, with a good measure of truth, that a Chromebook is a device supporting a browser, and not much more.

So can edge computing support a similar approach to the design of smartphones, essentially creating a device that resembles earlier efforts to create network-centric computing devices? Maybe, some think.

Could edge computing create new opportunities for access providers supplying phone services? AT&T believes that could happen.

AT&T plans to build thousands of small edge computing data centers in central offices and other locations across the United States. So could a big edge computing network affect mobile phone design as much as cloud computing has affected the design and use of computing devices? Mazin Gilbert, an AT&T VP, thinks that is a possibility.

Edge computing could create the conditions for really cheap smartphones. “Can my $1,000 mobile phone be $10 or $20, where all the intelligence is really sitting at the edge?” Gilbert asks. “It’s absolutely possible.”

That obviously would dramatically reduce barriers to smartphone use by everyone, while providing some means of differentiation for access services provided by AT&T. Both trends would provide more reasons for consumers or businesses to use the AT&T network, instead of rival networks.

It has been decades since tier-one telcos actually had a significant role in the customer premises equipment business. Back in the monopoly era, telcos actually made and sold the phone devices people used. In fact, it was illegal to use any phone not supplied by the service provider.

In the competitive era, service providers have been irrelevant as suppliers of CPE, as that role was ceded to device suppliers active in the consumer electronics space.

Edge computing could change those assumptions. Perhaps a firm such as AT&T licenses the building of cheap smartphones that rely extensively on edge computing and are designed to work on AT&T’s network.

As always, that approach will start out as “useful for many people” but not a “full and complete substitute” for standard smartphones able to work globally. But not every customer requires global roaming. For most customers, coverage most places in the United States will work.

As any Chromebook user will attest, the “connect to Wi-Fi or you cannot do too much” approach is not perfect. You cannot “compute” anywhere (except to conduct offline transactions or activities). But it works, especially if one has the ability to tether to a smartphone.

Something like that could be possible once edge computing is fully built out.

U.S. Device Adoption is Near Saturation

Use of communications-dependent devices obviously has direct implications for communications service demand. So it matters that U.S. consumers now are reaching, or already have reached, saturation levels of device use.

Not to belabor the point, but device and account saturation strongly suggests that demand for new services and apps has to be created, beyond current levels of functionality for devices and connections.

That is one reason why many believe 5G is going to be different than all prior generations of mobile platforms. It will be the first platform where brand-new value, and therefore new revenue opportunities, will be created by enterprises. Consumer demand for phone functions and connecting other devices is fairly well saturated.

source: Pew Research Center

Thursday, September 27, 2018

Why Nobody Releases Gigabit Take Rates, Yet

Not one U.S. internet service provider publicizes the take rates it gets for gigabit internet access. Historically, no ISPs have done so for their fastest tiers of service, either. The reason, as you might suspect, is that take rates for such tiers likely are rather modest, and those tiers tend to be purchased by businesses rather than consumers.

Eventually that could change, but only when gigabit access becomes the mid-tier offer.

Back in the days when cable TV operators first were rolling out consumer Internet access at speeds of 100 Mbps, it was virtually impossible to get subscriber numbers from any of the providers, largely because take rates were low.

In the United Kingdom, then planning to upgrade consumer Internet access speeds to “superfast” 30 Mbps, officials complained about low demand. In fact, demand for 40 Mbps was less than expected.

So “gigabit” internet access remains mostly a marketing platform, not an indicator of what services people actually buy, when they have access to gigabit services.

Value versus price is the likely reason for consumer behavior. “Value (performance versus price)” seems to be evaluated as best in the mid ranges of internet access service, not the “fastest” grades of service. Nor is that an unusual situation for most product categories.

In Australia, in 2016, for example, perhaps 15 percent of consumers purchased the then-fastest speed tier of 100 Mbps. Some 47 percent bought the mid-range service at 25 Mbps. Some 33 percent of buyers were content with service at the slowest speed of 12 Mbps.

Likewise, even where fiber-to-home connections are available, that does not mean most consumers will buy such service, if other options also are available. Data from New Zealand suggests take rates might be 33 percent where FTTH is sold.

Price has much to do with those choices, as do perceptions of value. The safest assumption is that multi-user households are most likely to buy faster tiers of service, reasoning that the connection bandwidth has to be shared by all members of the household.

Also, since internet access purchases generally correlate with income, we should not be surprised if cost-conscious consumers opt for less-expensive packages, while higher-income consumers are most likely to buy the most-expensive packages, which also are the fastest.

The takeaway is that most consumers buy the mid-tier offers. According to Federal Communications Commission data, in 2015 the most popular advertised speed plans purchased by consumers clustered around 100 Mbps for cable providers.

AT&T U-verse plans generally were in the 45 Mbps range in 2015, while DSL speeds (all-copper access) were quite low, in comparison. Verizon FiOS speeds were generally in the 80-Mbps range.

Over time, as speeds increase, consumers have kept upgrading, but they generally continue to buy the mid-tier services. That is what AT&T has found as it increases the top speeds available, and CenturyLink has found the same.

In 2010, for example, about 40 percent of U.S. consumers were buying Internet access at about 6 Mbps. You might wonder why, but the answer is simple. In 2010, the 6-Mbps service offered what consumers then considered the best value for the money paid.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...