Though communications service providers might prefer growth, they probably would not be too perturbed about a service whose take rates were declining less than 0.7 percent annually, especially when average revenue per unit was growing about four to five percent annually.
That, it appears, roughly describes the industry-wide dip in subscriber numbers in the U.S. market, according to Forrester Research. With the important caveat that most trends in technology, media or communications have adoption rates that vary tremendously over time, if the slow decay of traditional video subscriptions continues at present rates, change will be graceful for content providers, video distributors and others in the ecosystem.
Of course, when new products displace older products, there tends to be a longish period when nothing too dramatic seems to happen. And then there is the inflection point, and change becomes non-linear.
So as logical as it might seem to base actions on the theory that "tomorrow will pretty much be like today," that will prove a dangerous notion if change goes non-linear. And big changes that affect "most people" often have a non-linear adoption pattern, in the end.
If that is the case, we might not be able to infer very much from current trends in subscription TV.
Mobile adoption, for example, shows the non-linear adoption pattern. Adoption patterns, especially in India and sub-Saharan Africa, illustrate the difference between pre-inflection-point growth and post-inflection-point growth.
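The S-curve dynamic described above can be sketched with a simple logistic model. The ceiling, midpoint and growth-rate parameters below are purely illustrative assumptions, not estimates for any actual market; the point is only that year-over-year change looks negligible before the inflection point and dramatic after it.

```python
# Illustrative S-curve (logistic) adoption model, showing why
# "tomorrow will be like today" fails once an inflection point is passed.
# All parameters are made up for illustration.
import math

def adoption(t, ceiling=1.0, midpoint=10.0, rate=0.6):
    """Fraction of the market that has adopted by year t (logistic curve)."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Change is small early on, peaks at the inflection point (t = midpoint),
# then slows again as saturation nears.
for t in range(0, 21, 4):
    print(f"year {t:2d}: adoption {adoption(t):6.1%}")
```

Note that in the early years adoption barely moves, which is exactly the "longish period when nothing too dramatic seems to happen."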
The same is likely to happen to subscription TV, if online delivery really develops as a substitute product.
Tuesday, February 11, 2014
Will TV Delivery Go Non-Linear?
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Public Wi-Fi Does Not Have to "Compete" with Mobile to Provide High Value
Some questions never go completely away. Whether Wi-Fi can compete with mobile networks seems a perennial question. It was asked of 3G networks and now is asked about 4G networks.
Some mobile service providers, including Scratch Wireless and Republic Wireless, actually build their mobile services on primary use of Wi-Fi connections, automatically defaulting to Wi-Fi for voice and Internet access whenever possible, and switching to 3G only when Wi-Fi connections are not available.
As newer blocks of spectrum (5 GHz, 60 GHz) are opened up for commercial use, the questions--and the potential--are likely to grow. But the questions will be asked in a new context.
Wi-Fi is a low-power application, compared to mobile service, which is a high-power application. Over time, the predominant use of a mobile device--especially a smartphone--has shifted from apps where high power is required (on-the-go calling, texting or Internet access) to application scenarios that are well suited to low power.
That puts the older question--can Wi-Fi compete with mobile--into a new context. Originally, the question might have been whether public Wi-Fi could approximate the connectivity of a mobile network for purposes of making phone calls.
The original value proposition for mobile phones was “calling on the go.” So the potential use of public Wi-Fi was to create a viable network to support calling. These days, smartphones are multi-function devices, used for calling, texting, messaging and content consumption.
And the use venues are different as well. These days, perhaps only 10 percent to 20 percent of total mobile device usage, for all apps and purposes, actually happens when people are “on the go.” All the rest of the usage is in untethered mode at locations where there is Wi-Fi access.
In other words, content consumption now is a major mobile device activity, and most of that consumption does not happen in mobile mode. In other words, the new pattern for content access is primarily untethered, not mobile.
And one might argue that future needs for network capacity increasingly will focus on low-power, localized access, not high-power mobile access. That is why one hears so much about small cells and carrier Wi-Fi, as well as Wi-Fi offload, these days.
That casts the question of whether public Wi-Fi can compete with mobile networks in a new light. End user requirements and device usage have changed.
The crucial need is not so much the usefulness of public Wi-Fi to support voice, but its usefulness in supporting content consumption. Public Wi-Fi might have some value for occasional offload of voice, but it has high value for offload of content consumption, in large part because content consumption does not typically require handing off sessions from one macrocell to another.
The big change in the way we interact with Wi-Fi will come as the Hotspot 2.0 standard gains wider adoption. The initiative, based largely on the 802.11u standard, could genuinely transform the industry, improving the utility of public Wi-Fi by allowing an easier authentication experience, especially for users of public hotspot services.
So the new question is not so much whether public Wi-Fi can compete with mobile networks, but whether public Wi-Fi will be useful for smartphone content consumption, to say nothing of providing meaningful primary access for devices that also can default to mobile networks when required.
That is a different question than we used to ask. And the point is that the usefulness of public Wi-Fi networks will be dramatically higher as device usage shifts from communications to media and content consumption.
Grande Communications Launches Gigabit Service in Austin, Texas in February 2014
Grande Communications plans to launch gigabit access service in parts of Austin, Texas by Feb. 18, 2014, beating both Google Fiber and AT&T in launching 1 Gbps services in Austin.
Grande will sell gigabit service for $64.99 per month, undercutting AT&T’s “GigaPower” service, priced at $70 a month, as well as Google Fiber, also selling for $70 a month.
Grande says its service will have no contracts or bandwidth usage caps, and initially will be available in west Austin neighborhoods, representing about 25 percent of Grande’s addressable market in Austin.
If take rates are as high as Grande expects, the gigabit service will be rolled out to other Grande-served neighborhoods as well.
What isn’t immediately clear is what impact the gigabit offer will have on take rates for other Grande offers in Austin.
Grande has launched faster-speed offers in Austin, Corpus Christi, Dallas, Midland, Odessa, San Antonio, San Marcos and Waco. Though markets outside Austin will not be affected directly, demand in west Austin clearly will be affected.
The new Internet speed options begin at 15 megabits per second for $35 a month and include other offers at 50 Mbps ($45 a month), 75 Mbps ($55 a month) and 110 Mbps ($65 a month).
In the west Austin neighborhoods where gigabit service is available, the 110 Mbps service obviously will be cannibalized by the gigabit service, which costs essentially the same. But the other issue is how many customers will decide that a gigabit service for $64.99 is preferable to a 75 Mbps service for $55 a month.
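A quick back-of-envelope comparison, using the tier prices cited above, shows why the gigabit offer pressures the whole pricing ladder: on a price-per-Mbps basis the gigabit tier is roughly an order of magnitude cheaper than every slower tier.

```python
# Grande's Austin tiers, as cited above: speed in Mbps mapped to monthly price.
tiers = {
    15: 35.00,
    50: 45.00,
    75: 55.00,
    110: 65.00,
    1000: 64.99,  # the new gigabit tier
}

for mbps, price in tiers.items():
    # Price per Mbps falls sharply at the gigabit tier.
    print(f"{mbps:5d} Mbps at ${price:6.2f}/mo = ${price / mbps:.3f} per Mbps")
```

The 110 Mbps tier works out to about $0.59 per Mbps, versus about $0.065 per Mbps for the gigabit tier, which is the arithmetic behind the expected cannibalization.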
Longer term, one has to expect a resetting of price and value expectations for all the slower-speed services, even if there is no wholesale resetting of prices in the near term.
8 ISPs Respond to Gigabit Network RFI
With the caveat that "interest" might not actually represent "intent to provide," eight Internet service providers have responded to a request for information on gigabit Internet access networks issued by the Louisville-Jefferson County Metro Government.
The request sought information about how a gigabit-capable network could be provided across the city or in targeted commercial corridors and in residential areas and how gigabit service could be delivered at prices comparable to other gigabit fiber communities across the nation.
City officials hope to lure a vendor who will provide commercial gigabit Internet service at symmetrical speeds, up and down. Time Warner Cable and AT&T are said to be among the firms that responded to the request for information.
City officials apparently are willing to entice would-be providers by allowing access to city rights of way, including waterlines, sewer lines and alleys.
In some ways, the RFI illustrates new thinking about ways municipalities can offer inducements to ISPs interested in creating new gigabit access networks. Municipal officials in this case do not seem to envision a full public-private partnership, but instead simply easier and presumably low-cost access to rights of way and conduit that would allow ISPs to cut the cost of building any such new networks.
What will be interesting is whether municipal officials are willing to allow partial builds only in parts of the metro area, instead of mandating 100-percent coverage, which might not be feasible.
In essence, that would mimic the way competitive local exchange carriers have tended to create new networks, building only where there is clear customer demand. In the case of CLECs focused on business customers, that has meant focusing on business districts and office parks, for example.
In a consumer context, the same approach might allow ISPs to build only in residential neighborhoods where customer demand was high enough to promise a potential financial return.
Though the concept clashes with historic notions about universal access, such approaches have proven effective at stimulating new network capacity.
Monday, February 10, 2014
Sprint Execs "Surprised" by Opposition to T-Mobile US Bid?
The odds against a Sprint bid to acquire T-Mobile US seem as long as ever. Sprint Chairman Masayoshi Son and Chief Executive Dan Hesse reportedly were "surprised" by U.S. Justice Department and Federal Communications Commission opposition to the merger.
They really should not have been surprised. The Justice Department signaled clearly its conviction that the U.S. mobile market already is too concentrated when AT&T tried to buy T-Mobile US. None of that has changed over the past two years.
In fact, some might say a T-Mobile US resurgence works against any attempted acquisition, as it suggests meaningful competition is possible under the present market structure.
Whether heightened competition is possible over the longer term is likely the bigger issue. Many would argue that neither Sprint nor T-Mobile US has the financial ability to weather a prolonged marketing war that reduces average revenue per account and gross revenues.
If that proves to be true, then a merger eventually might be viewed differently, but only after both Sprint and T-Mobile US have become financially more weakened than they are at present.
Almost perversely, an eventual merger of a weaker Sprint, and a weaker T-Mobile US, will make success in the competition with Verizon Wireless and AT&T Mobility even tougher.
But that is the likely outcome if antitrust officials will not allow a merger at present. The old adage about bankers making loans only when the customer doesn't need a loan probably applies here.
Antitrust officials will approve a merger between Sprint and T-Mobile US only when it is too late to prevent creation of an effective mobile duopoly.
As sometimes, perhaps often, happens, current policy will create precisely the outcomes that policy hopes to avoid.
Sunday, February 9, 2014
IP Interconnection is Changing, Because the Commercial Relationships and Traffic Flows are Changing
IP network interconnection periodically erupts as a business issue between two or more interconnecting IP domains, and the problems will grow as the types of interconnecting domains diversify.
The interconnection issue further is complicated by the types of domains. Interconnections can occur between scores of thousands of “autonomous systems,” also called “routing domains.”
Though most of the autonomous systems are Internet service providers, interconnections also occur between enterprises, governmental and educational institutions, large content providers with mostly outbound traffic such as Google, Yahoo, and YouTube, as well as overlay content distribution networks such as Akamai and Limelight.
In other words, end users, application, service and “access” and “wide area network” providers now are among the entities interconnecting, complicating any potential frameworks for regulating such diverse entities in ways that promote investment and innovation.
Where separate “common carrier” regulation arguably was easier, in the sense that only licensed “carriers” could interconnect, these days, application providers including Google, Apple, Netflix and others operate their own IP networks, interconnecting with carriers and non-profit entities alike.
The interconnection of IP networks historically has been a matter of bilateral agreements between IP network owners, with a tendency to interconnect without settlement payments so long as traffic flows were roughly balanced (the same amount of sent and received traffic on each interconnecting network).
As you can imagine, highly asymmetrical traffic flows such as streaming video will upset those assumptions. That matters, as a practical business matter, since interconnection costs money if traffic flows are not equivalent, or if domains are of disparate size.
Historically, the major distinction among different ISPs was their size, based on geographic scope, traffic volume across network boundaries or the number of attached customers. But symmetry encouraged the “peering” or “settlement-free interconnection” model.
Those assumptions increasingly are challenged, though. Today, a smaller number of large networks exchange traffic with many smaller networks. And there are cost implications.
In an uncongested state, a packet that originates on a network with smaller geographic scope and ends up on the larger network might be expected to impose higher delivery costs on the larger network (which must typically carry the packet a greater distance).
A larger network would presumably have more customers, and this might be seen as giving the larger network more value, because of the larger positive network externalities associated with being part of its network.
Perhaps more important, even networks of similar size have different characteristics. Consumer-focused “access” providers (cable and telcos) are “eyeball aggregators.” Other ISPs, such as Netflix, are content stores. That has practical implications, namely highly asymmetrical traffic flows between the “eyeball aggregators” and “content stores.”
Also, there are greater natural economies of scale for a wide area network-based ISP than for an “access” ISP that has to supply last-mile connections. Even when traffic flows actually are symmetrical, network costs are unequal.
The point is that settlement-free peering worked best when networks were homogenous, not heterogeneous as they now have become. Like it or not, the traditional peering and transit arrangements are less well suited to today’s interconnection environment.
For that reason, “partial transit” deals have arisen, where a network Z sells access to and from a subset of the Internet prefixes.
For instance, Z may sell A only the ability to send traffic to part of the Internet, but not receive traffic. The reverse may also occur: a network may be allowed to receive traffic but not send traffic.
That arrangement is intended to reflect asymmetrical traffic flows between content store and eyeball aggregator networks.
Those changes in traffic flows, which bifurcate along content store and eyeball aggregator roles, inevitably will disrupt interconnection economics and business arrangements, leading to demand for imposition of common carrier rules for interconnection of IP networks.
Oddly enough, the logic of common carrier rules might lead to precisely the opposite “benefit” some expect.
Disagreements by parties to a bilateral interconnection agreement can lead to service disruptions, if one network refuses to accept traffic from another network on a “settlement free” basis.
So some might call for mandatory interconnection rules, to end peering disputes. Such rules could make the problem worse.
Interconnection disagreements today are about business models and revenue flows. Content stores benefit from settlement-free peering, since they deliver far more traffic than they receive.
Eyeball aggregators often benefit from transit agreements, since they would be paid for the asymmetric traffic flows.
Unless the assumption is that network economics are to be disregarded, the way common carrier rules would work, if applied to IP networks in a manner consistent with common carrier regulation, is that a network imposing an asymmetric load on a receiving network would have to pay for such access.
Disputes over “peering” between IP domains sometimes lead to service disruptions viewed in some quarters as “throttling” of traffic. It is not “throttling,” but a contract dispute.
The relationships between discrete IP networks take three forms. Large networks with equal traffic flows “peer” without payment of settlement fees.
Networks of unequal size tend to use “transit” agreements, where the smaller network pays to connect with the larger network, but also gets access to all other Internet domains. Also, in many cases one network pays a third party to provide interconnection.
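The distinction between the forms described above can be made concrete with a toy policy check. The 2:1 traffic-ratio threshold and the network-size test below are illustrative assumptions only; real peering policies are negotiated bilaterally and vary widely.

```python
# Toy classifier for the interconnection forms described above.
# The ratio_limit (2:1) and the size-similarity test are made-up
# assumptions for illustration, not any real network's policy.
def interconnection_type(sent_gbps, received_gbps, size_a, size_b,
                         ratio_limit=2.0):
    """Suggest a relationship: settlement-free peering or paid transit."""
    larger = max(sent_gbps, received_gbps)
    smaller = min(sent_gbps, received_gbps)
    # "Roughly balanced" traffic: neither direction exceeds the ratio limit.
    balanced = smaller > 0 and (larger / smaller) <= ratio_limit
    # "Similar size" networks: neither dwarfs the other.
    similar_size = min(size_a, size_b) / max(size_a, size_b) >= 0.5
    if balanced and similar_size:
        return "settlement-free peering"
    return "transit (smaller or asymmetric network pays)"

# A content store pushing far more traffic than it receives would not
# qualify for settlement-free peering under this toy policy:
print(interconnection_type(sent_gbps=100, received_gbps=5, size_a=1, size_b=10))
```

Under this sketch, an eyeball aggregator and a content store with a 20:1 traffic imbalance fall out of the peering category, which is exactly the business pressure the post describes.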
Peering and transit rules are going to change, if only because the business interests of IP domain owners are distinct. The issue is whether the new arrangements will reflect the actual commercial interests, or take some form that is economically irrational.
Saturday, February 8, 2014
Internet Access Prices are Dropping, in "Real" Terms
Historically, as most observers will readily agree, Internet access prices per bit have dropped.
But many would argue that absolute prices have not dropped.
In many cases, consumers have paid about the same amount of money on a recurring basis but have gotten better performance, in terms of access speed, many would argue.
It is a nuanced issue. In some cases, absolute prices might have climbed, on average.
So how can one claim that prices have declined? Prices declined 82 percent, globally, between 2007 and 2012, according to the International Telecommunications Union, measured as a percentage of income.
That's the key: "as a percentage of income." In some cases, higher absolute prices might represent a lower percentage of household income. So, in "real" terms, prices dropped.
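The "real" price logic is simple arithmetic: an absolute price can rise while its share of income falls, provided income rises faster. The dollar figures below are hypothetical, chosen only to illustrate the mechanism.

```python
# Hypothetical illustration: absolute price rises, "real" price falls.
def price_as_share_of_income(monthly_price, monthly_income):
    """Price expressed as a fraction of monthly income."""
    return monthly_price / monthly_income

# Made-up figures: price rises from $40 to $50, but income rises faster.
before = price_as_share_of_income(40.0, 2500.0)  # 1.60% of income
after = price_as_share_of_income(50.0, 4000.0)   # 1.25% of income
print(f"before: {before:.2%}, after: {after:.2%}")
assert after < before  # higher absolute price, lower "real" price
```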
That trend is clear enough, globally, for Long Term Evolution prices, which have dropped in about 73 percent of markets. There also is evidence that U.S. Internet access prices also dropped between 2004 and 2009, for example.
A 2011 study by the International Telecommunications Union, for example, found consumers and businesses globally paying on average 18 percent less for entry-level services than they did in 2009, and more than 50 percent less for high-speed Internet connections.
Relative prices for mobile cellular services decreased by almost 22 percent from 2008 to 2010, while fixed telephony costs declined by an average of seven percent.