Saturday, July 16, 2016

How to Choose a Co-Location Services Provider

There is a simple reason why businesses of all sizes continue to invest more in data storage and computing: few, if any, modern businesses can operate without information technology.

Cisco, for example, projects that data center workloads will more than double between 2014 and 2019.

When large businesses reach about 75 percent of computing and data storage capacity, they begin to evaluate alternatives for adding capability.

Most often, the choices involve investing in colocation capacity or buying cloud storage, rather than building new owned computing facilities, according to 451 Research.

Why Colocate?

Colocation makes sense when owned hardware is nearing the end of its useful life. In other cases, decisions to “lease, rather than build” are driven by a lack of staff resources to manage the upgrades.

Also, colocation and outsourcing make sense when firms want to reduce costs or offload security and compliance chores.

That is why the colocation market is forecast to grow at an annual rate of more than 12 percent between 2015 and 2019, according to researchers at Technavio.

That demand is driving a colocation business with $23 billion in annual revenue, about 45 percent of it in North America.

Even firms presently using colocation can add incremental resources affordably by using cloud computing in a hybrid mode, as an alternative to buying and managing additional hardware.

Firms can buy computing or storage infrastructure “as a service.” In an IaaS model, a third-party provider hosts hardware, software, servers, storage and other infrastructure components on behalf of its users.

In other cases firms might choose to use public cloud, private cloud or “hypervisors as a service” (use of virtual machines).

In a public cloud scenario, businesses essentially rent computing cycles or storage on servers operated on a shared basis.

In a private cloud environment, businesses buy dedicated use of resources not shared with other customers.

A “hypervisor” is software that runs workloads on “virtual machines,” so multiple operating systems can share a single physical host without dedicated hardware and software copies for each.

What Questions Should You Ask When Buying Colocation Services?

Colocation always involves space and power. Any colocation facility should have room to accommodate not only your present requirements, but future growth. Also, stability and reliability of power systems are essential for your own equipment and the data center overall, to operate servers and keep them cool.

But technical support also is crucial. Look for a provider with a seasoned staff and proven credentials. The staff has to be able to diagnose and fix, quickly and cost effectively, any potential issues that could compromise the performance and security of your equipment and data.
Storage-as-a-Service

Every business requires a convenient way to store key data. Storage as a service allows businesses to save and retrieve business data reliably and affordably, without manual intervention by company staff.

Using a storage-as-a-service offering, a customer can specify what data must be stored, how often it should be saved and how quickly it must be retrievable in the event of data loss on primary systems.

Service level agreements can help assure that data is securely backed up and quickly restored if necessary.
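
A minimal sketch of what such a specification and its service-level targets might look like, written here in Python: the dataset name, schedule and recovery targets are hypothetical, not any provider's actual contract terms.

    # Hypothetical storage-as-a-service policy: what to store, how often to
    # save it, and how quickly it must be recoverable (RPO/RTO-style targets).
    backup_policy = {
        "dataset": "customer-orders-db",      # what data must be stored
        "backup_every_hours": 4,              # how often it is saved
        "retention_days": 90,                 # how long copies are kept
        "max_data_loss_minutes": 240,         # recovery point objective (RPO)
        "max_restore_time_minutes": 60,       # recovery time objective (RTO)
    }

    def meets_sla(observed_restore_minutes: float) -> bool:
        """Check a restore drill against the agreed recovery-time target."""
        return observed_restore_minutes <= backup_policy["max_restore_time_minutes"]

    print(meets_sla(45))   # True: a 45-minute restore is within the agreed window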
Cloud Storage

Cloud storage is a service that maintains, manages and backs up key business data remotely, while making it available to users over a network, typically the Internet.

As with all other cloud services, businesses can buy public, private or hybrid service. Generally, public cloud storage is best for unstructured data. Private cloud arguably is more appropriate when businesses need more customization and control. Hybrid cloud might be best when a business wants access to actively used and structured data in a private cloud, while archival data can be kept in a public cloud.
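
As a toy illustration of that hybrid placement rule (the field names and tier labels below are hypothetical, not any particular provider's API), the decision can be as simple as:

    # Keep actively used, structured records in the private cloud; push
    # archival data to a public cloud tier. Purely illustrative.
    def choose_storage_tier(record: dict) -> str:
        if record.get("actively_used") and record.get("structured"):
            return "private-cloud"
        return "public-cloud-archive"

    print(choose_storage_tier({"actively_used": True, "structured": True}))    # private-cloud
    print(choose_storage_tier({"actively_used": False, "structured": False}))  # public-cloud-archive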

Disaster Recovery-as-a-Service

DRaaS enables the full replication and backup of all business data and applications. It allows an organization that has experienced major or total failure of primary systems to continue with daily business processes while the primary system undergoes repair.

DRaaS also allows these applications to run on virtual machines (VMs) at any time, even without a real disaster. That is useful if a business wants a “sandbox” to prototype or test major new applications without exposing or interfering with primary systems.

Direct Connect to Cloud Service Providers

Your data center connections to other partners must be reliable, safe, and fast. Dedicated connections are faster and feature less latency than Internet connections. That is important for performance-sensitive applications such as video or voice communications and “virtual desktop” apps.

Direct connections minimize your disaster recovery response times and allow large data transfers.

Backups

Backup as a service provides an automated and managed way to preserve key business data. Cloud backup, also known as online backup, is a service that automatically, on a fixed schedule, collects, compresses, encrypts and transfers data to a remote storage location.
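
The sketch below shows, in Python, the "collect, compress, encrypt" steps such a service automates on a schedule; the file name is hypothetical, the cryptography package is assumed to be available, and the transfer step is left as a provider-specific stub.

    import gzip
    from pathlib import Path
    from cryptography.fernet import Fernet  # assumes the 'cryptography' package

    def back_up(source: Path, key: bytes) -> bytes:
        """Collect a file, compress it, and encrypt it for remote storage."""
        raw = source.read_bytes()                # collect
        compressed = gzip.compress(raw)          # compress
        return Fernet(key).encrypt(compressed)   # encrypt

    if __name__ == "__main__":
        sample = Path("orders-sample.txt")       # hypothetical data set
        sample.write_text("order-1001,2016-07-16,paid\n")
        key = Fernet.generate_key()              # a real service manages keys for you
        payload = back_up(sample, key)
        print(f"{len(payload)} encrypted bytes ready to transfer off site")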

Hypervisors-as-a-Service

A hypervisor, also called a virtual machine manager, is a program that allows multiple operating systems to share a single hardware host. Each operating system appears to have the host's processor, memory, and other resources all to itself.

Hypervisor as a service allows a business to buy that functionality without having to manage the hypervisor itself or maintain and update the virtual machines.
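
For a sense of what that management burden looks like when done in house, here is a minimal sketch using the libvirt Python bindings to list virtual machines on a local KVM/QEMU host; the connection URI is an assumption, and a hypervisor-as-a-service provider would handle this layer on the customer's behalf.

    import libvirt  # assumes the libvirt-python bindings are installed

    conn = libvirt.open("qemu:///system")    # connect to the local hypervisor
    try:
        for dom in conn.listAllDomains():    # each domain is one virtual machine
            state = "running" if dom.isActive() else "stopped"
            print(f"{dom.name()}: {state}")
    finally:
        conn.close()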

Every colocation service begins with space and power, but companies need to future-proof their decisions.

A colocation provider should have the technical and human ecosystems to provide direct access and cross-connects to a number of managed services providers and potential customers, while supporting and monitoring a business information technology environment 24x7x365, all in one facility.

Friday, July 15, 2016

Is the Special Access Market Competitive, or Not?

According to the U.S. Federal Communications Commission, the total market for U.S. special access services is roughly $40 billion annually. The FCC believes the market is insufficiently competitive.

But observers note that the FCC itself reported U.S. telcos had 92 percent market share in special access in 1980. Telco share had dropped to 39 percent in 2013. In that year, TDM-based services represented $25 billion of the total market, while incumbent telcos accounted for about $16 billion of the TDM market, according to the FCC’s own data.

In some markets, 39 percent might well represent “dominance.” Whether that is the case in the special access market is the issue.

With the caveat that usage or bandwidth is not the same thing as revenue, Ethernet access provided by a wide range of competitors now represents market growth, while the legacy T1 and DS3 market declines.  

In fact, cable TV companies are big suppliers of Ethernet access and other access services to business buyers.

AT&T has argued that “facilities-based competitors are serving 95 percent of all MSA census blocks (on average, about one seventh of a square mile in an MSA) nationally where there is demand for special access services, and, second, that 99 percent of all business establishments are in those census blocks.”

The U.S. Federal Communications Commission also is seeking, for the first time, to extend some special access obligations to cable TV networks.

And some argue that it is impossible to do a fully data-driven analysis, because the FCC is not releasing the full results of its earlier survey of facilities, an exercise intended to provide some rationale for assessing the existence of facilities-based competition.

Some might find the emphasis on extending regulation to a declining service curious. AT&T’s access lines have declined by almost 65 percent since 2009.

A Historic Shift from Scarcity to Abundance Has Been Underway for at Least Two Decades

Though it perhaps is surprising, very little direct discussion of the role of scarcity in the telecommunications business happens these days. That is manifestly not because people are unaware of its importance.

Contestants are very much aware of the role “scarcity” plays in the access business. In fact, attempts to maintain scarcity are a foundational part of strategy for some contestants, while efforts to end scarcity and create abundance likewise underpin the strategies undertaken by attackers.

The reasons are drop-dead simple: scarcity creates higher profit margins and higher revenue, as is true in any market. Abundance lowers profit margins and gross revenue, in any market.

Some rightly would argue that a ubiquitous access network (mobile, cable TV, fiber to the home or copper to the home) is expensive, limiting the number of providers that can exist in any market.

The point is that, with the advent of an era of abundance, the barriers are going to fall, and fall substantially.

So the possible “bad news” for some access providers is that the historic scarcity of resources in the access network is going to be replaced by abundance. The “good news” for app providers is that access capacity is going to be less and less a barrier to their business models.

To be clear, the end of the age of scarcity, and the start of the era of abundance, is coming, for the bandwidth portions of telecommunications business, and will force dramatic rethinking of business models.

As access services drive less revenue volume and produce lower profits, access providers will move into other parts of the Internet ecosystem, just as app providers are moving into the access and device portions of the ecosystem.

The trend is not actually so new. In fact, abundance has been approaching for decades, in part because of advances in use of spectrum, the impact of Moore’s Law and competition itself.

The implications could not be more profound. For more than a century, “scarcity” has been the fundamental reality of the industry and the business.

Networks were expensive, time-consuming and bandwidth limited.

In some ways, scarcity still drives the equity value of fixed and mobile networks. Fixed access networks are terribly expensive to build and operate, which is why there are so few of them in any market.

Advances are happening, but the “rule of a few” still holds, as what is scarce are enough customers to support the building and operation of a ubiquitous fixed access network in the face of two or more other providers.

But scarcity and abundance are starting to coexist. Moore’s Law helps. Better signal processing and antenna arrays help. Unlicensed spectrum and Wi-Fi also help. Optical fiber helps, even if some in the recent past have argued that scarcity and pricing power would return to the access business when optical fiber becomes ubiquitous.

Fixed wireless helps. Spectrum sharing helps as well.  

But much more is coming. The U.S. Federal Communications Commission is moving to make available an extraordinary amount of new spectrum--including seven gigahertz (7 GHz) worth of unlicensed spectrum, in the millimeter wave bands, and a total of 11 GHz, including 3.85 GHz of licensed spectrum, in a first wave.

Nor is that all. The Commission also adopted a Further Notice of Proposed Rulemaking, which seeks comment on rules adding another 18 GHz of spectrum encompassing eight additional high-frequency bands, as well as spectrum sharing for the 37 GHz to 37.6 GHz band.

Keep in mind that the new allocations represent many times more spectrum than all other existing spectrum now available for mobile and wireless communications in the U.S. market. Just how much more depends on one’s assumptions about coding techniques and modulation.

But it is possible the new spectrum will represent an order of magnitude or two orders of magnitude more communications spectrum than presently is available for mobile and wireless communications purposes.
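
A rough way to see why that multiple depends on coding and modulation assumptions is the Shannon-Hartley capacity formula; the baseline spectrum figure and link-quality values in the sketch below are illustrative assumptions, not FCC data.

    # C = B * log2(1 + SNR): capacity grows linearly with bandwidth but only
    # logarithmically with link quality, which is where coding and modulation
    # (spectral efficiency) assumptions enter.
    import math

    def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
        return bandwidth_hz * math.log2(1 + snr_linear)

    baseline = capacity_bps(650e6, snr_linear=100)  # assumed existing mobile spectrum, strong links
    new_wave = capacity_bps(11e9, snr_linear=3)     # first-wave total above, assuming harsher mmWave links

    print(f"Same link quality:       about {11e9 / 650e6:.0f}x more capacity")
    print(f"Lower mmWave efficiency: about {new_wave / baseline:.0f}x more capacity")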

Abundance will transform business models. Incumbents who built their businesses on scarcity will have to rework those models. App providers whose businesses are built on the assumption of abundance will flourish, at least potentially.

National Science Foundation to Spend $400 Million on Next-Generation Wireless (Spectrum Sharing and Millimeter Wave Among The Technologies)

The U.S. National Science Foundation will spend more than US$400 million over the next seven years to fund next-generation wireless research in an effort to bring super-fast mobile service to the country.


Support for millimeter wave technology and spectrum sharing are among the areas the NSF will target. Both subjects are among the issues to be discussed by speakers at Spectrum Futures in Singapore, 19-21 October, 2016.

U.S. officials hope the investments will speed up the country's move to next-generation 5G mobile service, potentially offering speeds of 10 Gbps, and allow for a rapid expansion of the internet of things.


The next-generation mobile services will enable self-driving cars, an "always on" IoT, smart cities, new virtual reality offerings, and video to aid police, firefighters, and emergency medical responders.

10 Million IoT Developers by 2020?

Millions of new developers are going to be working on Internet of Things applications, leading a growing number of potential ecosystem participants to prepare for new roles in the IoT business, beginning with developer support and platforms for the industrial Internet of Things.

AT&T and IBM now are working together to help developers--and the enterprises supporting them--create apps related to the Internet of Things (IoT).

Separately, GE and Microsoft are working on an IoT platform support initiative of their own, working to make GE’s Predix platform for the Industrial Internet available on the Microsoft Azure cloud for industrial businesses.

The move marks the first step in a broad strategic collaboration between the two companies, which will allow customers around the world to capture intelligence from their industrial assets and take advantage of Microsoft’s enterprise cloud applications.

According to VisionMobile, nearly 10 million developers will be active in IoT by 2020, doubling the estimated five million working in the IoT area today.

IBM and AT&T are expanding their investment in open source-based tools, such as Node-RED, and open standards such as MQTT, all essential for creating IoT solutions.

In addition, IBM's Watson cognitive computing, AT&T's IoT platforms such as Flow Designer and M2X, and access to AT&T's global network will be promoted as development and execution tools.
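
As a small illustration of the kind of open standard involved, here is a hedged sketch of publishing one sensor reading over MQTT using the paho-mqtt 1.x client; the broker hostname, topic and payload are hypothetical and are not tied to the AT&T or IBM platforms named above.

    import json
    import paho.mqtt.client as mqtt  # assumes the paho-mqtt 1.x client API

    client = mqtt.Client()
    client.connect("broker.example.com", 1883)   # hypothetical broker
    client.loop_start()                          # run the network loop in the background

    reading = {"device_id": "pump-17", "temp_c": 71.4}
    info = client.publish("plant/boiler/temperature", json.dumps(reading), qos=1)
    info.wait_for_publish()                      # block until the broker acknowledges

    client.loop_stop()
    client.disconnect()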

Thursday, July 14, 2016

The Age of Scarcity is Ending, the Era of Abundance is Beginning

The end of the age of scarcity, and the start of the era of abundance, is coming, for the telecommunications business. The implications could not be more profound. For more than a century, “scarcity” has been the fundamental reality of the industry and the business.

Networks were expensive, time-consuming and bandwidth limited.

In some ways, scarcity still drives the equity value of fixed and mobile networks. Fixed access networks are terribly expensive to build and operate, which is why there are so few of them in any market.

Advances are happening, but the “rule of a few” still holds, as what is scarce are enough customers to support the building and operation of a ubiquitous fixed access network in the face of two or more other providers.

But scarcity and abundance are starting to coexist. Moore’s Law helps. Better signal processing and antenna arrays help. Unlicensed spectrum and Wi-Fi also help. Optical fiber helps. Fixed wireless helps.

But much more is coming. The U.S. Federal Communications Commission is moving to make available an extraordinary amount of new spectrum--including seven gigahertz (7 GHz) worth of unlicensed spectrum, in the millimeter wave bands, and a total of 11 GHz, including 3.85 GHz of licensed spectrum, in a first wave.

Nor is that all. The Commission also adopted a Further Notice of Proposed Rulemaking, which seeks comment on rules adding another 18 GHz of spectrum encompassing eight additional high-frequency bands, as well as spectrum sharing for the 37 GHz to 37.6 GHz band.

Keep in mind that the new allocations represent many times more spectrum than all other existing spectrum now available for mobile and wireless communications in the U.S. market. Just how much more depends on one’s assumptions about coding techniques and modulation.

But it is possible the new spectrum will represent an order of magnitude or two orders of magnitude more communications spectrum than presently is available for mobile and wireless communications purposes.

Abundance will transform business models. Incumbents who built their businesses on scarcity will have to rework those models. App providers whose businesses are built on the assumption of abundance will flourish, at least potentially.

FCC Says Telcos No Longer "Dominant," in One Sense

In something of a milestone ruling, the U.S. Federal Communications Commission has said, as part of a decision on decommissioning of old time division multiplex networks, that local voice providers “are no longer dominant in the market for connecting local callers to long-distance networks.”

That still is not a full ruling that local telcos are no longer dominant for other purposes, but it is a step in what some would say is the right direction. It seems only a matter of time before the whole notion of “dominant providers,” and the special restraints placed on them, is scaled back.

Cable TV companies arguably already have displaced telcos as the “dominant” suppliers of high-speed Internet access, while mobile service has displaced landlines as the “dominant” way people use voice and messaging.

“The increasing popularity of mobile wireless, cable Voice over IP services and regulatory changes combined to erode the dominant position of local carriers in the market for interstate switched access,” the FCC noted.

The other helpful decision, from a telecom industry viewpoint, is that the new rules streamline the process of ending legacy TDM-based voice service and supplying such services on a next-generation network, in as few as 30 days.

Applicants must show that:
  • network performance, reliability and coverage are substantially unchanged for customers
  • access to 911, cybersecurity protections and access for people with disabilities meet current rules and standards
  • compatibility with a defined list of legacy services still popular with consumers and small businesses (including home security systems, medical monitoring devices, credit card readers and fax machines, subject to sunset in 2025) is assured

Ironically, some would note, even as pressure mounts for building of new next-generation optical fiber networks, some still insist that legacy TDM networks--and legacy services--be maintained as well.

To the extent that operating two ubiquitous networks, where one network has fewer and fewer customers, raises operating costs and wastes capital investments, the older networks should be retired faster, not slower.

Yes, there are some consumer effects when a legacy network is terminated. But we have experience with such things. First-generation mobile networks were shut off in favor of second-generation networks.

Those 2G networks, in turn, will be shut off in favor of 3G. Consumers need time to migrate, but there always is some disruption at the margin. Such disruption is no reason to delay the transition process any more than absolutely necessary.
