Friday, March 8, 2013

Unlicensed Spectrum and "Fairness"

Most people, if asked, will say that all competitions ought to be "fair." But what "fair" means in practice is harder to describe. A foot race, for example, should have all contestants running the same distance. That sounds fair.
But what if the objective is to "normalize" a competition for a full range of contestants with highly varied skills? In that case, a "handicap" system, as used in golf, might be necessary.
Something of the same conundrum might be said to exist for broadband access services. A notion of "fairness" might suggest that all licensees in a field abide by the same rules. But that rarely happens in our modern IP communications business.
Competitors always argue, and regulatory officials typically agree, that the former monopoly provider has such entrenched advantages that handcuffs need to be kept on the incumbent, at least until such time as the competitors have had a chance to become established.
But we also have examples of industries competing directly under "unequal" rules: cable TV and telcos, for instance, where cable operators have no wholesale obligations while leading telcos do. That is beginning to change, slowly.
But the philosophical issues remain highly charged. Many do not believe, for example, that non-profit entities should be able to compete with for-profit entities when the non-profits can use their tax advantages to do so.
A non-profit government entity should not, in this view, be able to compel purchase of products, or use its taxing authority to raise capital, borrow at favored rates, or employ other advantages no for-profit competitor can match.
That might be one attraction of wider availability of unlicensed spectrum: it can provide the basis for greater competition without raising those other competitive issues. That doesn't mean existing competitors will agree, only that some thorny issues are avoided if unlicensed spectrum is made available.
Millimeter waves, in the spectrum from 30 GHz to 300 GHz, have not traditionally been usable for communications, even at very short range (local distribution between a decoder and a TV, a mouse and a PC, a smartphone and a payment terminal).
Better coding and abundant cheap processing now make those frequencies usable, in some cases, for the first time.
For some of us, the question is whether any of those frequencies will be usable for either wireless backhaul or access purposes. The big problems lie with the physics: waves at those frequencies just don't travel very far through air, which limits their usefulness for network access.
As a figure of merit, assume that a 3 decibel (dB) gain roughly doubles signal strength, while a 3 dB loss cuts it roughly in half. As always with free-space propagation, attenuation is lower at lower frequencies, so the 30 GHz to 40 GHz range looks interesting from an access perspective.
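To make that figure of merit concrete, here is a minimal Python sketch (the 1 km distance and the particular frequencies are chosen purely for illustration) that converts dB to linear power ratios and compares free-space path loss across a few bands:

```python
import math

C = 299_792_458.0  # speed of light, m/s


def db_to_linear(db: float) -> float:
    """Convert a gain or loss in dB to a linear power ratio (+3 dB ~ 2x, -3 dB ~ 0.5x)."""
    return 10 ** (db / 10)


def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB between isotropic antennas."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))


print(f"+3 dB is a power ratio of {db_to_linear(3):.2f}; -3 dB is {db_to_linear(-3):.2f}")

# Illustrative 1 km link; real millimeter-wave paths add rain fade and
# atmospheric absorption on top of this free-space baseline.
for ghz in (2.4, 30, 40, 80):
    print(f"{ghz:5.1f} GHz: {fspl_db(1_000, ghz * 1e9):6.1f} dB free-space loss over 1 km")
```

The comparison covers only free-space loss between isotropic antennas; real millimeter-wave links also contend with rain fade and atmospheric absorption.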
The 80 GHz to 100 GHz range looks interesting as well, from an attenuation perspective. And because antenna size scales with wavelength, tiny antennas work well at these frequencies, though line-of-sight and transmit power issues remain.
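A quick calculation shows just how small those antennas can be (the half-wave element here is only a rough proxy for minimum practical antenna size, and the frequencies are again illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

for ghz in (2.4, 30, 80, 100):
    wavelength_mm = C / (ghz * 1e9) * 1e3  # wavelength = c / f, in millimeters
    # A half-wave element is a rough proxy for minimum practical antenna size.
    print(f"{ghz:5.1f} GHz: wavelength {wavelength_mm:6.1f} mm, "
          f"half-wave element ~{wavelength_mm / 2:5.2f} mm")
```

Elements a few millimeters long are what make compact multi-element arrays practical at these frequencies.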
In many cases, the other issue is access to the core network (middle mile access). A robust local access network is only as good as the bandwidth and pricing of the connections to the backbone networks.
But maybe everything can align: unlicensed spectrum, smart people working on the algorithms, lots of people willing to share in building a big "public" or "community" network, a sustainable revenue model and adequate middle mile connections.
To be sure, incumbent service providers will not wish for such a scenario. But the problem probably is just hard enough, and small enough, in terms of revenue impact, to allow some room for experimentation.
And, one hopes, experimentation could lead to new ways of supplying access, without upsetting notions of fairness.