Saturday, July 16, 2016

How to Choose a Co-Location Services Provider

There is a simple reason why businesses of all sizes continue to invest more in data storage and computing: few, if any, modern businesses can operate without information technology.

According to Cisco, for example, data center workloads will more than double between 2014 and 2019.

When large businesses reach about 75 percent of computing and data storage capacity, they begin to evaluate alternatives for adding capability.

Most often, the choices involve investing in colocation capacity or buying cloud storage, rather than building new owned computing facilities, according to 451 Research.

Why Colocate?

Colocation makes sense when owned hardware is nearing the end of its useful life. In other cases, decisions to “lease, rather than build” are driven by staff resources inadequate to manage the upgrades.

Also, colocation and outsourcing make sense when firms want to reduce costs or offload security and compliance chores.

That is why the colocation market is forecast to grow at more than 12 percent annual rates between 2015 and 2019, according to researchers at Technavio.

That growth is driving a colocation business generating $23 billion in annual revenues, 45 percent of which comes from North America.
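
Those two figures imply substantial compounding. Here is a minimal back-of-envelope sketch in Python, assuming the Technavio figure is a compound annual growth rate applied to the $23 billion base; the projection is illustrative, not Technavio's own forecast:

```python
# Back-of-envelope projection: the ~12 percent Technavio growth figure,
# treated as a compound annual growth rate (CAGR), applied to the
# $23 billion revenue base cited above. Illustrative only.
base_revenue_busd = 23.0  # colocation revenue, billions of USD
cagr = 0.12               # assumed compound annual growth rate

for year in range(2015, 2020):
    revenue = base_revenue_busd * (1 + cagr) ** (year - 2015)
    print(f"{year}: ${revenue:.1f}B")
```

On those assumptions, the business grows from $23 billion to roughly $36 billion by 2019.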

Even firms presently using colocation can add incremental resources affordably by using cloud computing in a hybrid mode, as an alternative to buying and managing additional hardware.

Firms can buy computing or storage infrastructure “as a service.” In an IaaS model, a third-party provider hosts hardware, software, servers, storage and other infrastructure components on behalf of its users.

In other cases, firms might choose to use public cloud, private cloud or “hypervisors as a service” (use of virtual machines).

In a public cloud scenario, businesses essentially rent computing cycles or storage on servers operated on a shared basis.

In a private cloud environment, businesses buy dedicated use of resources not shared with other customers.

A “hypervisor” is a method of efficiently running applications on “virtual machines,” so that multiple operating systems can share a single physical host rather than each requiring its own hardware.

What Questions Should You Ask When Buying Colocation Services?

Colocation always involves space and power. Any colocation facility should have room to accommodate not only your present requirements, but future growth. Also, stability and reliability of power systems are essential for your own equipment and the data center overall, to operate servers and keep them cool.

But technical support also is crucial. Look for a provider with a seasoned staff and proven credentials. The staff must be able to diagnose and fix, quickly and cost effectively, any potential issues that could compromise the performance and security of your equipment and data.

Storage-as-a-service

Every business requires a convenient way to store key data. Storage as a service allows businesses to save and retrieve business data reliably and affordably, without manual intervention by company staff.

Using a storage-as-a-service offering, a customer can specify what data must be stored, how often it should be saved and how fast it must be retrievable in the event of any data loss by primary systems.

Service level agreements can help assure that data is securely backed up and quickly restored if necessary.
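
What such a specification might look like, as a minimal sketch: the field names below are hypothetical, not any particular provider's API, but they capture the what, how often, and how fast dimensions described above.

```python
# Hypothetical storage-as-a-service policy: what data to store, how often
# to save it, and how fast it must be retrievable. Field names are
# illustrative, not any real provider's API.
storage_policy = {
    "dataset": "customer_orders_db",          # what must be stored
    "backup_interval_hours": 4,               # how often it is saved
    "retention_days": 90,                     # how long copies are kept
    "recovery_time_objective_minutes": 30,    # how fast it can be retrieved
    "recovery_point_objective_minutes": 240,  # max tolerable data-loss window
    "encrypted_at_rest": True,
}
```

The recovery-time and recovery-point objectives are the terms a service level agreement would typically pin down.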

Cloud Storage

Cloud storage is a service that maintains, manages and backs up key business data remotely, while making it available to users over a network, typically the Internet.

As with all other cloud services, businesses can buy public, private or hybrid service. Generally, public cloud storage is best for unstructured data. Private cloud arguably is more appropriate when businesses need more customization and control. Hybrid cloud might be best when a business wants access to actively used and structured data in a private cloud, while archival data can be kept in a public cloud.
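
A toy routing rule makes the hybrid pattern concrete; the one-year threshold and the structured/unstructured test below are assumptions for illustration only:

```python
# Toy tiering rule for the hybrid pattern above: actively used, structured
# data stays private; archival or unstructured data goes to public cloud.
# The 365-day threshold is an assumption for illustration.
def choose_tier(age_days: int, is_structured: bool) -> str:
    if is_structured and age_days <= 365:
        return "private"  # more customization and control
    return "public"       # cheaper at scale for archival/unstructured data

print(choose_tier(age_days=30, is_structured=True))    # private
print(choose_tier(age_days=1200, is_structured=True))  # public
```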

Disaster Recovery-as-a-Service

DRaaS enables the full replication and backup of all business data and applications. It allows an organization that has experienced major or total failure of primary systems to continue with daily business processes while the primary system undergoes repair.

DRaaS also allows these applications to run on virtual machines (VM) at any time, even without a real disaster. That is useful if a business wants a “sandbox” to prototype or test major new applications without exposing or interfering with primary systems.

Direct connect to cloud service providers

Your data center connections to other partners must be reliable, safe, and fast. Dedicated connections are faster and feature less latency than Internet connections. That matters for performance-sensitive applications such as video or voice communications and “virtual desktop” apps.

Direct connections minimize your disaster recovery response times and allow large data transfers.

Backups

Backup as a service provides an automated and managed way to preserve key business data. Cloud backup, also known as online backup, is a service that automatically, on a fixed schedule, collects, compresses, encrypts and transfers data to a remote storage location.
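
A minimal sketch of that collect-compress-encrypt-transfer pipeline, using Python's standard gzip module and the Fernet cipher from the third-party cryptography package; the transfer step is a placeholder, and a real service would run this on a cron-style schedule:

```python
import gzip
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

KEY = Fernet.generate_key()  # in practice, load a persistent key from a vault

def backup_file(path: Path) -> bytes:
    """Collect, compress, and encrypt one file."""
    raw = path.read_bytes()                 # collect
    compressed = gzip.compress(raw)         # compress
    return Fernet(KEY).encrypt(compressed)  # encrypt

def transfer(blob: bytes, destination: str) -> None:
    # Placeholder: a real service would push this to remote storage
    # (object storage, SFTP, etc.) on its fixed schedule.
    print(f"would upload {len(blob)} bytes to {destination}")

Path("backup_me.txt").write_text("example data to preserve")
transfer(backup_file(Path("backup_me.txt")), "remote-backup-store")
```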

Hypervisors-as-a-Service

A hypervisor, also called a virtual machine manager, is a program that allows multiple operating systems to share a single hardware host. Each operating system appears to have the host's processor, memory, and other resources all to itself.

Hypervisor as a service allows a business to buy that functionality without having to manage the process or maintain and update the virtual machines itself.
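
One common programmatic window into a hypervisor is libvirt. This sketch, using the libvirt Python bindings against a local QEMU/KVM host (one common hypervisor interface, not any particular hosted service), lists each guest and the resources it believes it owns:

```python
import libvirt  # pip install libvirt-python; requires a running hypervisor

# Read-only connection to a local QEMU/KVM hypervisor.
conn = libvirt.openReadOnly("qemu:///system")

# Each guest "domain" sees its share of the host's CPUs and memory
# as if it owned them outright.
for dom in conn.listAllDomains():
    state, max_mem_kb, mem_kb, vcpus, cpu_time_ns = dom.info()
    print(f"{dom.name()}: {vcpus} vCPUs, {mem_kb // 1024} MiB memory")

conn.close()
```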

Every colocation service begins with space and power, but companies need to future-proof their decisions.

A colocation provider should have the technical and human ecosystems to provide direct access and cross-connects to a number of managed services providers and potential customers, while supporting and monitoring a business information technology environment 24x7x365, all in one facility.

Friday, July 15, 2016

Is Special Access Market Competitive, or Not?

According to the U.S. Federal Communications Commission, the total market for U.S. special access services is roughly $40 billion annually. The FCC believes the market is insufficiently competitive.

But observers note that the FCC itself reported U.S. telcos had 92 percent market share in special access in 1980. Telco share had dropped to 39 percent in 2013. In that year, TDM-based services represented $25 billion of the total market, while incumbent telcos accounted for about $16 billion of the TDM market, according to the FCC’s own data.

In some markets, 39 percent might well represent “dominance.” Whether that is the case in the special access market is the issue.
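
A quick back-of-envelope check, using only the FCC figures cited above, shows why the answer depends on how the market is segmented:

```python
# All figures in US$ billions, from the FCC data cited above.
total_special_access = 40.0  # total annual market
tdm_segment = 25.0           # TDM-based services in 2013
telco_tdm = 16.0             # incumbent telco TDM revenue

print(f"telco share of TDM segment:  {telco_tdm / tdm_segment:.0%}")          # 64%
print(f"telco share of total market: {telco_tdm / total_special_access:.0%}")  # 40%
```

Measured against the TDM segment alone, incumbents still hold about 64 percent; measured against the whole market, about 40 percent, consistent with the 39 percent share cited above.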

With the caveat that usage or bandwidth is not the same thing as revenue, Ethernet access provided by a wide range of competitors now represents market growth, while the legacy T1 and DS3 market declines.  

In fact, cable TV companies are big suppliers of Ethernet access and other access services to business buyers.

AT&T has argued that “facilities-based competitors are serving 95 percent of all MSA census blocks (on average, about one seventh of a square mile in an MSA) nationally where there is demand for special access services, and, second, that 99 percent of all business establishments are in those census blocks.”

The U.S. Federal Communications Commission also is seeking, for the first time ever, to extend some special access obligations to cable TV networks.

And some argue that it is impossible to conduct a fully data-driven analysis because the FCC has not released the full results of its earlier survey of facilities, an exercise intended to provide some rationale for assessing the existence of facilities-based competition.

Some might find the emphasis on extending regulation to a declining service curious. AT&T’s access lines have declined by almost 65 percent since 2009.

A Historic Shift from Scarcity to Abundance has Been Underway for At Least 2 Decades

Though it perhaps is surprising, very little direct discussion of the role of scarcity in the telecommunications business happens these days. That is manifestly not because people are unaware of its importance.

Contestants are very much aware of the role “scarcity” plays in the access business. In fact, attempts to maintain scarcity are a foundational part of strategy for some contestants, while efforts to end scarcity and create abundance likewise underpin the strategies undertaken by attackers.

The reasons are drop-dead simple: scarcity creates higher profit margins and higher revenue, as is true in any market. Abundance lowers profit margins and gross revenue, in any market.

Some rightly would argue that a ubiquitous access network (mobile, cable TV, fiber to the home or copper to the home) is expensive, limiting the number of providers that can exist in any market.

The point is that, with the advent of an era of abundance, the barriers are going to fall, and fall substantially.

So the possible “bad news” for some access providers is that the historic scarcity of resources in the access network is going to be replaced by abundance. The “good news” for app providers is that access capacity is going to be less and less a barrier to their business models.

To be clear, the end of the age of scarcity, and the start of the era of abundance, is coming, for the bandwidth portions of telecommunications business, and will force dramatic rethinking of business models.

As access services drive less revenue volume and produce lower profits, access providers will move into other parts of the Internet ecosystem, just as app providers are moving into the access and device portions of the ecosystem.

The trend is not actually so new. In fact, abundance has been approaching for decades, in part because of advances in use of spectrum, the impact of Moore’s Law and competition itself.

The implications could not be more profound. For more than a century, “scarcity” has been the fundamental reality of the industry and the business.

Networks were expensive, time-consuming and bandwidth limited.

In some ways, scarcity still drives the equity value of fixed and mobile networks. Fixed access networks are terribly expensive to build and operate, which is why there are so few of them in any market.

Advances are happening, but the “rule of a few” still holds, as what is scarce are enough customers to support the building and operation of a ubiquitous fixed access network in the face of two or more other providers.

But scarcity and abundance are starting to coexist. Moore’s Law helps. Better signal processing and antenna arrays help. Unlicensed spectrum and Wi-Fi also help. Optical fiber helps, even if some in the recent past have argued that scarcity and pricing power would return to the access business when optical fiber becomes ubiquitous.

Fixed wireless helps. Spectrum sharing helps as well.  

But much more is coming. The U.S. Federal Communications Commission is moving to make available an extraordinary amount of new spectrum--including seven gigahertz (7 GHz) worth of unlicensed spectrum, in the millimeter wave bands, and a total of 11 GHz, including 3.85 GHz of licensed spectrum, in a first wave.

Nor is that all. The Commission also adopted a Further Notice of Proposed Rulemaking, which seeks comment on rules adding another 18 GHz of spectrum encompassing eight additional high-frequency bands, as well as spectrum sharing for the 37 GHz to 37.6 GHz band.

Keep in mind that the new allocations represent many times more spectrum than all other existing spectrum now available for mobile and wireless communications in the U.S. market. Just how much more depends on one’s assumptions about coding techniques and modulation.

But it is possible the new spectrum will represent an order of magnitude or two orders of magnitude more communications spectrum than presently is available for mobile and wireless communications purposes.
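
Shannon's capacity formula shows why the assumptions matter: throughput scales linearly with bandwidth but only logarithmically with signal quality. A minimal sketch, where the 0.6 GHz "legacy" figure and the 15 dB signal-to-noise ratio are illustrative assumptions, not measured values:

```python
from math import log2

def shannon_capacity_gbps(bandwidth_ghz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR), in Gbps for B in GHz."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_ghz * log2(1 + snr_linear)

# Illustrative comparison: ~0.6 GHz of pre-millimeter-wave mobile spectrum
# versus the ~29 GHz of new allocations discussed above, at an assumed 15 dB SNR.
print(f"legacy  0.6 GHz: {shannon_capacity_gbps(0.6, 15):6.1f} Gbps")
print(f"new    29.0 GHz: {shannon_capacity_gbps(29.0, 15):6.1f} Gbps")
```

On those assumptions, the new allocations add nearly 50 times the raw capacity, squarely in the "one to two orders of magnitude" range.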

Abundance will transform business models. Incumbents who built their businesses on scarcity will have to rework those models. App providers whose businesses are built on the assumption of abundance will flourish, at least potentially.

National Science Foundation to Spend $400 Million on Next-Generation Wireless (Spectrum Sharing and Millimeter Wave Among The Technologies)

The U.S. National Science Foundation will spend more than US$400 million over the next seven years to fund next-generation wireless research in an effort to bring super-fast mobile service to the country.

Support for millimeter wave technology and spectrum sharing are among the areas the NSF will target. Both subjects are among the issues to be discussed by speakers at Spectrum Futures in Singapore, 19-21 October, 2016.

U.S. officials hope the investments will speed up the country's move to next-generation 5G mobile service, potentially offering speeds of 10 Gbps, and allow for a rapid expansion of the internet of things.

The next-generation mobile services will enable self-driving cars, an "always on" IoT, smart cities, new virtual reality offerings, and video to aid police, firefighters, and emergency medical responders.

10 Million IoT Developers by 2020?

Millions of new developers are going to be working on Internet of Things applications. That is leading a growing number of potential ecosystem participants to prepare for new roles in the IoT business, beginning with developer support and platforms for the industrial Internet of Things.

AT&T and IBM now are working together to help developers--and the enterprises supporting them--create apps related to the Internet of Things (IoT).

Separately, GE and Microsoft are working on an IoT platform support initiative of their own, working to make GE’s Predix platform for the Industrial Internet available on the Microsoft Azure cloud for industrial businesses.

The move marks the first step in a broad strategic collaboration between the two companies, which will allow customers around the world to capture intelligence from their industrial assets and take advantage of Microsoft’s enterprise cloud applications.

According to VisionMobile, nearly 10 million developers will be active in IoT by 2020, doubling the estimated five million working in the IoT area today.

IBM and AT&T are expanding their investment in open source tools, such as Node-RED, and open standards, like MQTT, all essential for creating IoT solutions.

In addition, IBM's Watson cognitive computing and AT&T's IoT platforms, such as Flow Designer and M2X, plus access to its global network, will be promoted as development and execution tools.
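
MQTT, the open standard mentioned above, is a lightweight publish/subscribe protocol widely used in IoT. A minimal sketch with the paho-mqtt client library (1.x-style constructor); the broker host and topic are placeholders:

```python
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "broker.example.com"         # placeholder broker address
TOPIC = "plant1/sensors/temperature"  # placeholder topic

client = mqtt.Client()  # paho-mqtt 1.x constructor
client.connect(BROKER, 1883, keepalive=60)

# Publish one sensor reading; qos=1 asks the broker to acknowledge delivery.
reading = json.dumps({"device": "sensor-42", "temp_c": 21.7})
client.publish(TOPIC, reading, qos=1)
client.disconnect()
```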

Thursday, July 14, 2016

The Age of Scarcity is Ending, the Era of Abundance is Beginning

The end of the age of scarcity, and the start of the era of abundance, is coming, for the telecommunications business. The implications could not be more profound. For more than a century, “scarcity” has been the fundamental reality of the industry and the business.

Networks were expensive, time-consuming and bandwidth limited.

In some ways, scarcity still drives the equity value of fixed and mobile networks. Fixed access networks are terribly expensive to build and operate, which is why there are so few of them in any market.

Advances are happening, but the “rule of a few” still holds, as what is scarce are enough customers to support the building and operation of a ubiquitous fixed access network in the face of two or more other providers.

But scarcity and abundance are starting to coexist. Moore’s Law helps. Better signal processing and antenna arrays help. Unlicensed spectrum and Wi-Fi also help. Optical fiber helps. Fixed wireless helps.

But much more is coming. The U.S. Federal Communications Commission is moving to make available an extraordinary amount of new spectrum--including seven gigahertz (7 GHz) worth of unlicensed spectrum, in the millimeter wave bands, and a total of 11 GHz, including 3.85 GHz of licensed spectrum, in a first wave.

Nor is that all. The Commission also adopted a Further Notice of Proposed Rulemaking, which seeks comment on rules adding another 18 GHz of spectrum encompassing eight additional high-frequency bands, as well as spectrum sharing for the 37 GHz to 37.6 GHz band.

Keep in mind that the new allocations represent many times more spectrum than all other existing spectrum now available for mobile and wireless communications in the U.S. market. Just how much more depends on one’s assumptions about coding techniques and modulation.

But it is possible the new spectrum will represent an order of magnitude or two orders of magnitude more communications spectrum than presently is available for mobile and wireless communications purposes.

Abundance will transform business models. Incumbents who built their businesses on scarcity will have to rework those models. App providers whose businesses are built on the assumption of abundance will flourish, at least potentially.

FCC Says Telcos No Longer "Dominant," in One Sense

In something of a milestone ruling, the U.S. Federal Communications Commission has said, as part of a decision on decommissioning of old time division multiplex networks, that local voice providers “are no longer dominant in the market for connecting local callers to long-distance networks.”

That still is not a full ruling that local telcos are no longer dominant for other purposes, but it is a step in what some would say is the right direction. It seems only a matter of time before the whole notion of “dominant providers,” and the special restraints placed on them, are retired.

Cable TV companies arguably already have displaced telcos as the “dominant” suppliers of high speed Internet access, while mobile service has displaced landlines as the “dominant” way people use voice and messaging.

“The increasing popularity of mobile wireless, cable Voice over IP services and regulatory changes combined to erode the dominant position of local carriers in the market for interstate switched access,” the FCC noted.

The other helpful decision, from a telecom industry viewpoint, is that the new rules streamline the process of retiring legacy TDM-based voice service and supplying such services on a next-generation network, in as few as 30 days.

Applicants must show that:
  • network performance, reliability and coverage are substantially unchanged for customers
  • access to 911, cybersecurity and access for people with disabilities meet current rules and standards
  • compatibility with a defined list of legacy services still popular with consumers and small businesses--including home security systems, medical monitoring devices, credit card readers and fax machines, subject to sunset in 2025--is assured

Ironically, some would note, even as pressure mounts for building of new next-generation optical fiber networks, some still insist that legacy TDM networks--and legacy services--be maintained as well.

To the extent that operating two ubiquitous networks, where one network has fewer and fewer customers, raises operating costs and wastes capital investments, the older networks should be retired faster, not slower.

Yes, there are some consumer effects when a legacy network is terminated. But we have experience with such things. First generation networks were shut off in favor of second generation networks.

Those 2G networks, in turn, will be shut off in favor of 3G. Consumers need time to migrate, but there always is some disruption, at the margin. Such disruption is no reason to delay the transition process any more than absolutely necessary.

FCC to Release 11 GHz of New Mobile and Wireless Spectrum; Another 18 GHz is Coming

In an unprecedented move, the U.S. Federal Communications Commission is making available an extraordinary amount of new spectrum--including seven gigahertz (7 GHz) worth of unlicensed spectrum, in the millimeter wave bands, and a total of 11 GHz, including 3.85 GHz of licensed spectrum, in a first wave.

As a practical matter (because the higher frequencies can carry much more data than lower frequencies), the new millimeter unlicensed spectrum arguably represents more access spectrum than presently allocated for all other mobile, Wi-Fi and fixed wireless purposes.

Nor is that all. The Commission also adopted a Further Notice of Proposed Rulemaking, which seeks comment on rules adding another 18 GHz of spectrum encompassing eight additional high-frequency bands, as well as spectrum sharing for the 37 GHz to 37.6 GHz band.

That means a total of about 29 GHz of new communications spectrum, available for mobile and untethered use, will eventually be coming to market.

The new rules create a new “Upper Microwave Flexible Use” service in the 28 GHz (27.5-28.35 GHz), 37 GHz (37-38.6 GHz), and 39 GHz (38.6-40 GHz) bands, and a new unlicensed band at 64 GHz to 71 GHz.

Altogether, that will make available nearly 11 GHz of high-frequency spectrum for flexible mobile and fixed-use wireless broadband, including 3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum.
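
The band arithmetic checks out; summing the widths of the bands named above recovers the licensed and unlicensed totals:

```python
# Widths of the first-wave bands named above, in GHz.
licensed_ghz = {
    "28 GHz (27.5-28.35)": 28.35 - 27.5,  # 0.85
    "37 GHz (37-38.6)": 38.6 - 37.0,      # 1.60
    "39 GHz (38.6-40)": 40.0 - 38.6,      # 1.40
}
unlicensed_ghz = 71.0 - 64.0              # 7.00 (the 64-71 GHz band)

licensed_total = sum(licensed_ghz.values())
print(f"licensed:   {licensed_total:.2f} GHz")                   # 3.85
print(f"unlicensed: {unlicensed_ghz:.2f} GHz")                   # 7.00
print(f"first wave: {licensed_total + unlicensed_ghz:.2f} GHz")  # 10.85, i.e. nearly 11
```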

These rules will support unlicensed, licensed and shared access. That, and the fact that so much new spectrum is being made available, will have business model implications for quite a number of ecosystem participants. To the extent that spectrum scarcity is a source of business advantage, or a barrier to competitor entry, that will be less of an issue.

The availability of so much new unlicensed spectrum should make possible new business models for Internet service providers, who will not have to pay for spectrum.

The millimeter era is coming, and it will upend current business models and market structures.

Linear TV Really is a Legacy Service with Declining Fortunes, Study Suggests

A decade ago, it was an open question whether younger consumers who did not favor linear TV would adopt the habit as they got older and had families. The other question was whether consumers would eventually buy linear TV as their incomes increased.

It appears those hopes are not entirely well founded. Despite the presence of children and higher incomes, a significant percentage of younger consumers simply do not want to buy the linear TV product. That makes linear subscription TV a legacy service with declining fortunes.

Cord cutters have an average income of US$59,000, with 47 percent buying a subscription “on demand” service and 63 percent using over-the-top streaming services.

Some 37 percent of these households have children in the household.

Cord-nevers (consumers who never have purchased linear TV) have an average income of US$47,000, with 30 percent buying some form of on-demand subscription TV, while 35 percent use OTT video sources.

About 26 percent of those “cord-never” homes have children in them.

Some 17 percent of U.S. TV homes now rely on over-the-air reception, up from 15 percent in 2015, while six percent exclusively use internet TV services such as Netflix, Amazon Prime, Hulu or YouTube and have no broadcast or linear TV subscription service at all.

The “streaming only” households increased from four percent of TV homes in 2015.

For TV homes with a resident between 18 and 34 years old, 22 percent are using over-the-air reception only, while 13 percent are only watching internet TV.

Some 38 percent of households with an 18 to 34 year-old resident rely on some kind of alternative TV reception or video source, where 25 percent of all TV homes do so.

Households with at least one resident over 50 are more likely to take cable or satellite services, with 82 percent buying linear TV subscriptions.

Low-income households, meaning those with household income of less than US$30,000 (€27,000), are more likely to rely on over-the-air reception, with 26 percent of these homes taking broadcast-only TV.

High-income households, meaning those earning over US$50,000, are more likely to take satellite TV than the average, with 27 percent of these homes taking a satellite service, against a U.S. average of 21 percent.
source: GfK

Mobile Payments Still Have Not Crossed the Chasm

Even if rates of consumer adoption of many new gadgets are stunningly fast, adoption of any important new technology often takes far longer than many would forecast. That is likely to be true for the Internet of Things and mobile payments as well.

Generally, adoption “crosses the chasm” and becomes a mass market reality sometime after about 10 percent of people start using any new innovation. But how long it takes to reach that take-off stage is the issue. It can take decades. Some might point out that Internet of Things already has been growing for 16 years.

Bharti Airtel Uses an Old Tactic to Smooth Out New Network Demand

Here is one more example of a business practice some might see as a network neutrality infraction but that actually involves network management, offering incentives for off-peak use of a mobile network.

Consider a pricing plan Bharti Airtel has instituted to manage loads on its network.

As mobile and other service providers learned long ago with voice services, it makes sense to encourage customers to use the network when it is not heavily loaded.

That is why (and most readers are too young to remember it) long distance calling rates used to be set lower, or much lower, on weekends and evening hours. The idea was drop dead simple: most calling happened during working hours. There was comparatively little usage at other hours, and very little late at night.

Aside from saving consumers money, the lower prices for evening and late-night calling also distributed some of the network load.

Bharti Airtel now is doing the same thing for in-app content downloads, offering what amounts to lower prices for its prepaid mobile customers between 3 am and 5 am.

The offer is “50 percent data back” for all in-app content downloads, effectively cutting the price of usage in half.
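
The pricing mechanics are simple. Here is a sketch, with an illustrative per-megabyte rate (not Airtel's actual tariff), of how a 50 percent data-back credit halves the effective off-peak price:

```python
# Effective charge under a "50 percent data back" off-peak window.
# The per-MB rate is illustrative, not Airtel's actual tariff.
def effective_charge(mb_used: float, rate_per_mb: float, hour: int) -> float:
    charge = mb_used * rate_per_mb
    if 3 <= hour < 5:   # the 3 am to 5 am window described above
        charge *= 0.5   # crediting half the data back halves the price
    return charge

print(effective_charge(100, 0.25, hour=4))   # 12.5 (off-peak)
print(effective_charge(100, 0.25, hour=14))  # 25.0 (peak)
```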
   
But some might argue that is a network neutrality infraction, as prices are set on a differential basis.

Some might argue it is efficient to shift network demand by offering incentives to users to do so. There seems to be no danger of predatory behavior, even if consumers take advantage of the offer.

That is the same sort of problem some see when zero rating is evaluated within a network neutrality framework. So long as antitrust is not a problem, firms should be free to experiment with retail packaging, promotions and pricing as they see fit, especially when there is a direct and tangible consumer benefit.

Potential predatory and unfair action by ISPs is a danger existing institutions already police. Beyond that, the problem with a broad application of the “best effort only” rules is that they actually do crimp supplier ability to innovate in ways that are demonstrably or arguably valuable for consumers.

One growing problem with network neutrality rules--even if one agrees fully with the objective of preventing antitrust activity by access providers and fostering an innovative and robust Internet--is that the rules (intentionally or not) do place obstacles in the path of some Internet service providers who want to experiment with business models, pricing, packaging and network management efforts.

Wednesday, July 13, 2016

New Smart Cities, Millimeter Wave Networks Speakers for Spectrum Futures

NSN Murty, Executive Director, Smart Cities, PwC, India, now is confirmed as a speaker on smart cities across South Asia, as part of the Spectrum Futures conference, to be held 19-21 October, 2016 in Singapore.

Also joining as a speaker: Prakash Kamtam, Advisor at Smart Cities India Foundation, India.

In addition to smart cities and the Internet of Things, Spectrum Futures will address the new role of fixed wireless, millimeter wave networks for 5G and Internet access, use of shared and unlicensed spectrum, spectrum policy across the region, the role of venture capital for application partnerships, and collaboration between ISPs and app developers.

Jonathan Brewer, Consulting Engineer at Telco2, will be speaking about use of millimeter waves for rural access across the region.

Other confirmed speakers include:

Chris Weasler, Facebook, Director of Global Connectivity
Greg Leon, Google Product Manager
Jay Fajardo, Founder, LaunchGarage
Praveen Sharma, Tata Communications, Head of Regulatory Affairs
Rajan S. Mathews, Cellular Operators Association of India, Director General
Shrinath V, Product Management & Design Thinking Consultant and Google Developer Expert
Mohamed El-Moghazi, National Telecom Regulatory Authority of Egypt, Director of Radio Spectrum Research and Studies
Camilo Alberto Jiménez Santofimio, Comisión de Regulación de Comunicaciones, Colombia, Senior Advisor
Reza Arefi, Intel Corporation, Director of Spectrum Strategy
Bob Horton, Horton Consulting, Director & Principal
Vern Fotheringham, V-Satcast, LLC, Executive Chairman
Josh Gordon, Red Pocket Mobile, President
Narendra K. Saini, Telecommunication Engineering Center (TEC), India, Chair - Smart Governance WG
Rajnesh Singh, Internet Society, Director, Asia-Pacific Regional Bureau
Muhammad Rashid Shafi, Multinet Pakistan (PVT.) LTD., CEO Global Business & Chief Strategy Officer
Devid Gubiani, Bolt Super 4G - PT First Media, CEO
Leo Sugandhi, spectrum frequency planning analyst for mobile services at Directorate of Spectrum Planning and Policy; Ministry of ICT, West Java Province, Indonesia
Basheerhamad Shadrach, Independent Consultant, Bill & Melinda Gates Foundation; Asia Coordinator, Alliance for Affordable Internet, New Delhi Area, India
Sushil Kumar, Telecommunication Engineering Center (TEC), Department of Telecom (DoT), India, IoT Standards and Implementation (will speak about IoT opportunities in India in several industries)

Syed Ismail Shah, Chairman, Pakistan Telecommunications Authority, Pakistan

Additional speakers are being added weekly. Here is a fact sheet about the event. Join us.

Peak PC in 2011, Peak Fixed Network Connections in 2000 or 2001

Fixed network connections in the United States peaked around 2000. It now appears that sales of PCs peaked in 2011.


In Access Business, Demand Won't Change Very Much; Supply Will. You Know What That Means

In addition to the possible issues (lower value, commodity status) caused by business model inversion, telco service providers also face further disruption on a range of other fronts.

We can assume high levels of competition for all current and future products and services that drive revenue, from traditional sources (other service providers) and new contestants (over the top app substitutes).

What comes next is likely additional forms of competition from non-traditional places, something that arguably can be seen in recent and expected developments in areas ranging from fifth generation (5G) standards to use of millimeter wave frequencies, use of unlicensed and shared spectrum, as well as moves to create more open source access platforms (Facebook OpenCellular, unmanned aerial vehicles, Google Project Loon).

Where in the past it was fairly easy to figure out “who the competition is,” it will be less easy to categorize in the future. Developments such as “network slicing,” for example, will allow app and service providers to buy attributes of networks that are optimized for the particular applications and business models those providers wish to offer.

In a functional sense, network slicing is a form of “wholesale access” to network features. It allows any enterprise or app provider to bundle network access and features with services and apps that drive the revenue model.
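
What such a wholesale request might look like, as a sketch: the attribute names are hypothetical (slicing interfaces are still being standardized), but they illustrate an app provider ordering network behavior matched to its business model.

```python
# Hypothetical network-slice request: an app provider buys network
# attributes tuned to its application. Field names are illustrative;
# real 5G slicing APIs vary by operator and standard.
slice_request = {
    "tenant": "vr-gaming-startup",
    "max_latency_ms": 10,            # latency-sensitive application
    "guaranteed_downlink_mbps": 50,
    "guaranteed_uplink_mbps": 25,
    "coverage_areas": ["metro-1", "metro-2"],
    "isolation": "dedicated",        # traffic kept apart from other slices
}
print(slice_request)
```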

Spectrum sharing and unlicensed spectrum, plus new access platforms, likewise enable all sorts of business models combining apps, services and network access.

As all disputes over spectrum policy are rooted in perceived business advantage, so too are debates over shared spectrum and unlicensed spectrum.

That is normal. What is atypical is the vast potential amount of new spectrum to be made available in many markets, plus the unprecedented effort to create open source models, and therefore lower costs, across data center and now access platforms.

To some extent, all ISPs and access providers will benefit from lower platform costs. But that’s the rub: the same shift to lower costs that helps incumbents also enables new potential roles for attackers.

“Dumb pipe” poses the same sort of contradictory implications. On one hand, dumb pipe Internet access now drives revenue growth for mobile and fixed service providers alike, as traditional revenues earned from voice and messaging fall.

On the other hand, such commoditized access does not necessarily drive the same level of profits as the former managed services once did (though there is room for genuine argument on that score, at least for the moment).

The longer-term strategic issue is simply that there will be so much new spectrum, available at potentially lower costs, plus advances in access network platforms, that new competitors are expected. Adding more supply, in any market, has clear impact on demand. Just as clearly, lots of new supply has predictable impact on profits.

It is hard to see how the access business can avoid further commoditization.

U.S. Consumers Still Buy "Good Enough" Internet Access, Not "Best"

Optical fiber always is pitched as the “best” or “permanent” solution for fixed network internet access, and if the economics of a specific...