Thursday, June 26, 2008

Cable TV Model for Some Parts of the Content Ecosystem

"An unconstrained profit-maximizing platform charges a positive fee to the other side of the market if and only if content providers value additional consumers higher than consumers value additional content providers."

In other words, platform and service providers have an opportunity to earn revenue from content partners when new, emerging or highly focused content partners want expedited carriage, placement or promotion on platform portals.

It's the same sort of thing the cable industry long has had as a business practice: popular networks get paid, while low-viewership networks often must pay to get carriage (shelf space). In a service provider context, the analogy is that promotion, targeting, location, billing and other features and services can be so useful that a content partner might be willing to pay to obtain them.

If, on the other hand, customers highly value a particular content provider, a rational platform simply will make sure the popular provider is well supported, and will do nothing to impede customer access.
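To make the quoted condition concrete, here is a minimal sketch in the spirit of the two-sided-market literature; the symbols (the fee f and the marginal valuations alpha and beta) are my own illustrative notation, not the source's.

% Hypothetical notation: \alpha is the value a content provider places on
% one additional consumer; \beta is the value a consumer places on one
% additional content provider; f is the platform's fee to the content side.
\[
  f > 0 \iff \alpha > \beta
\]
% Read: charge content partners exactly when they value marginal consumers
% more than consumers value marginal content partners; otherwise keep the
% content side well supported (or subsidized) rather than charging it.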

It's still an emerging sort of thought, and the services and applications platforms can offer partners aren't so well developed. But they are coming.

IP-Based VPNs to Surpass Frame Relay, ATM in 2009

2009 should be the year the installed base of IP-based virtual private network sites surpasses the installed base of frame relay and asynchronous transfer mode sites, Vertical Systems Group forecasts suggest.

The shift has been going on for years, but a crossover point would be significant, as it will be when the installed base of IP phone systems surpasses that of digital systems.

Adoption of the ascendant technology gets a boost when vendors begin to slow and end development of legacy applications and gear.

Sprint Wins with Instinct

Sprint Nextel Corp. has broken company sales records with its new Samsung Instinct, leading to temporary shortages of the touch-screen phone in some stores, the company says. Despite placing what it calls the largest-ever initial order for a 3G handset, Sprint still underestimated demand, apparently.

E-mail access, Internet browsing, GPS navigation tools, interactive maps and one-touch click-to-call access have met "extremely heavy use," the company says.

The phone costs $129.99 with a two-year contract and a $100 mail-in rebate. The obvious observation: iPhone has had a transforming impact on handset design.

Intel Sees "No Compelling Case" for Vista

New York Times staff reporter Steve Lohr says Intel has decided against upgrading the computers of its own 80,000 employees to Microsoft’s Vista operating system.

“This isn’t a matter of dissing Microsoft, but Intel information technology staff just found no compelling case for adopting Vista,” an Intel source says.

To be sure, large enterprises have all sorts of applications that might have to be upgraded or modified when making a major operating system change. Consumers don't generally have those problems.

Still, it's a bit striking when a major Microsoft partner makes a decision like that. Chipmakers like it when new operating systems and apps require lots more powerful processors and lots more memory. Except when it's their money, apparently.

ICANN Changes Domain Name Scheme

The Internet Corporation for Assigned Names and Numbers has voted unanimously to relax the rules governing "top-level" domain names, such as .com or .uk, by 2009.

The decision means that companies could turn brands into Web addresses, while individuals could use their names.

Domain names written in non-Latin scripts, such as Asian and Arabic scripts, also were approved.

At the moment, top-level domains are limited to individual countries, such as .uk (United Kingdom) or .it (Italy), as well as to commerce (.com) and to institutional organizations, such as .net or .org.

Under the new plans, domain names can be based on any string of letters. Individuals will be able to register a domain based on their own name, for example, as long as they can show a "business plan and technical capacity".

Companies will be able to secure domain names based on their intellectual property, such as individual brand names.
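As a rough sketch of what changes, consider the Python below; the helper functions, the fixed TLD list (an illustrative subset) and the sample names are hypothetical.

CURRENT_TLDS = {"com", "net", "org", "uk", "it"}  # illustrative subset

def tld(hostname: str) -> str:
    """Return the rightmost label of a hostname."""
    return hostname.rsplit(".", 1)[-1].lower()

def valid_today(hostname: str) -> bool:
    return tld(hostname) in CURRENT_TLDS

def valid_under_new_scheme(hostname: str) -> bool:
    return tld(hostname).isalpha()  # any string of letters (simplified)

print(valid_today("example.com"))               # True
print(valid_today("shop.anybrand"))             # False: not an approved TLD
print(valid_under_new_scheme("shop.anybrand"))  # True under the new rules

# Non-Latin labels travel as ASCII "punycode" on the wire:
print("bücher.example".encode("idna"))          # b'xn--bcher-kva.example'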

Broadband "Over-supply" Problem

Cox Communications is doubling the download speed of its most popular residential Internet service (Preferred) in Northern Virginia and increasing the speed of its Premier service by thirty-three percent.

PowerBoost, which supplies temporary "burst" bandwidth for uploads, also has been added to the Preferred and Premier packages. PowerBoost for downloads has been available since 2007. This is the fifth consecutive year that Cox has enhanced the speed of its Internet services in Northern Virginia (credit Verizon's FiOS service for that).

Verizon has boosted FiOS downstream speeds to 50 Mbps, with 20 Mbps upstream, for its top package, available everywhere FiOS is sold.

Cox customers will get the speed increases automatically in July, without need for a call or technician visit.

The PowerBoost feature means users of the Preferred package will experience speeds up to 12.5 Mbps down/2.5 Mbps up. Premier customers can achieve 25 Mbps down/3.5 Mbps up.

Policy advocates often complain about the U.S. "broadband problem." Sometimes they mean it isn't available, isn't fast enough or costs too much. The evidence suggests availability isn't a problem. Whether a service is "fast enough" is a matter of interpretation, but I don't see evidence of anything but increasing speeds, often for the same cost. "Price" likewise is a matter of interpretation.

With the exception of Japan and South Korea, though, cost per Mbps in the United States is quite comparable to nearly all other leading nations.
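For reference, "cost per Mbps" is just monthly price divided by downstream speed; a quick sketch with made-up numbers:

# Hypothetical prices and speeds, purely to illustrate the metric.
offers = {
    "Plan A": (42.99, 12.5),   # (monthly USD, downstream Mbps)
    "Plan B": (59.99, 25.0),
}
for name, (price, mbps) in offers.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")  # A: $3.44, B: $2.40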

Complaining about broadband is a bit like the complaints we could easily have made about wireless penetration or use of text messaging, where U.S. users lagged way behind European users for quite some time. That "problem" doesn't exist anymore.

Neither will the "broadband" problem. Have there been issues with availability and speed? Yes. Are those problems in the process of resolution? Yes. Pointing to the existence of problems is fine. Ignoring clear evidence that problems rapidly are being fixed is either misinformed, intellectually dishonest or sloppy.

Some people like to say the definition of broadband is a problem, pointing to data collection that defines "broadband" as anything at or above a minimum of 200 kbps. That also is wrong. The FCC recently changed its minimum definition to 768 kbps. A couple of points.

The only definition the global telecom industry ever has formally set was way back when ISDN was created. Broadband still formally is defined as any bit rate over "voice" rates of 64 kbps. So 128 kbps "traditionally" has been considered "broadband."

Markets have moving definitions. But you can hardly fault the FCC for initially setting a minimum standard that is in fact above the recognized global nomenclature. In recent practice, industry executives might have considered broadband to be 1.544 Mbps or above, while anything between 64 kbps and 1.544 Mbps is "wideband."

All that is meaningless. It will be even more meaningless when cable operators start branding some broadband speeds as "wideband," to suggest it is more bandwidth than "broadband." Markets may like that. But it doesn't change the only formal definition the global engineering community ever has embraced.

Also, "minimum" is one thing. "Maximum" or "mean" are other things. Megabit access now is the norm. Targets will continue to shift higher over time. Call it the broadband version of grade inflation. The minimum "passing" grade might be a "D." That doesn't mean people expect that to be the norm.

The United States once had a major "broadband" availability problem. It no longer does. There are places where "access" by wire remains a problem. Most of those places have satellite alternatives, though. And many places have fixed wireless access as well.

Honestly, most potential users have one or two wired networks to choose from, two satellite providers and two or three mobile providers. Many consumers soon will be able to choose from as many as five mobile broadband providers.

Under-supply won't be an issue for most users much longer. Over-supply is the looming problem.

Compute Remotely, Assemble Locally

There's an obvious relationship among cloud computing, "over the top" applications, open networks, open devices and the importance of application programming interfaces (APIs).

The ability to compute, store data and execute code remotely means it is more affordable than ever for small developers and individuals to create applications that are immediately available to users anywhere. The existence of those applications "on the Web" makes the Web a more-powerful platform for bringing applications of any sort to market. That puts business pressure on walled garden business models of all sorts.

The existence of cloud computing also means software is becoming unbundled from hardware to a large extent. Not completely unbundled; not unbundled for every application or service. In fact, some apps require tight integration to execute with the greatest elegance. But the trend is more in the direction of how people use PCs than of how they consume cable television.

The application explosion, built on open platforms and APIs, also means new applications can be built on the shoulders of existing apps and applets. Assembling apps begins to be a process akin to what one does with Legos, to oversimplify.
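As a toy illustration of that "Lego" assembly, the Python sketch below builds a new app purely by composing two web APIs; the endpoint URLs and response fields are hypothetical.

# A toy "mashup": a new app assembled entirely from existing services.
import json
import urllib.request

def fetch_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def store_finder(city: str) -> dict:
    # Both endpoints below are invented, stand-ins for any public API.
    geo = fetch_json(f"https://geo.example/locate?city={city}")
    stores = fetch_json(
        f"https://stores.example/near?lat={geo['lat']}&lon={geo['lon']}"
    )
    # The "new" application is little more than the composition itself.
    return {"city": city, "stores": stores["results"]}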

That also means apps more often are created globally, assembled locally. That has implications for browsers, networks and protocols. Assembling apps locally puts a premium on rapid response. If assembled apps are to mimic the feel of locally-stored apps, response time is a crucial requirement. This requires more than big, fast pipes. It means browsers that are much faster than we have used in the past. It means a computing architecture that does not require so much traversing of wide area networks to grab app elements.
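A quick back-of-the-envelope sketch of why round trips, not just pipe size, dominate; the component counts and latencies here are hypothetical.

components = 10        # remote elements a locally-assembled app must fetch
wan_rtt_ms = 80        # hypothetical wide-area round trip
edge_rtt_ms = 15       # hypothetical nearby-cache round trip

print("serial WAN fetches: ", components * wan_rtt_ms, "ms")   # 800 ms
print("serial edge fetches:", components * edge_rtt_ms, "ms")  # 150 ms
# Even a very fast pipe cannot remove per-request latency; moving app
# elements closer (and fetching in parallel) is what restores the feel
# of a locally-stored app.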

The issue is to answer a question: “How do I pair together one customer that’s CPU-intensive and another that’s IO-intensive and have the sum appear just like a single, well-performing application?”
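One simple way to read that question, sketched below: rank workloads by how CPU-bound they are and pair opposite ends, so each pairing looks like one balanced application. This is my illustration of the idea, not anyone's production scheduler.

# Hypothetical pairing of CPU-intensive and IO-intensive workloads.
workloads = [
    ("render",  0.9),   # (name, fraction of time spent on CPU)
    ("backup",  0.1),
    ("encode",  0.8),
    ("logship", 0.2),
]

by_cpu = sorted(workloads, key=lambda w: w[1])
# Pair the most IO-bound job with the most CPU-bound one, and so on.
pairs = [(by_cpu[i][0], by_cpu[-1 - i][0]) for i in range(len(by_cpu) // 2)]
print(pairs)  # [('backup', 'render'), ('logship', 'encode')]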

There is lots of room for innovation here. And lots of distance to cover. But it's coming, even if most users only gradually are being exposed to use of remote and locally-assembled apps.
