Tuesday, February 2, 2010
Google to Launch App Store for "Google Apps"
Google is preparing to launch an online store in which it will sell third-party business software to Google Apps customers, the Wall Street Journal reports.
The Journal says the store could arrive as early as March, with third-party developers' products offered as enhancements to Google's office productivity suite. It appears the store would allow Gmail and Google Docs users to purchase add-ons for niche features too specialized for the mainstream Google Apps product.
The Google Solutions Marketplace contains lists and reviews of third-party software for Google Apps and Enterprise Search, but it does not let you buy the applications directly from Google. That might be what is about to change.
Developers would have to share revenue with Google from sales of their software through the store, and it would be reasonable to assume revenue splits similar to those used by mobile application stores run by Google, Apple, and several other companies.
Typically, the developer gets 70 percent of the revenue.
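To make that split concrete, here is a minimal sketch in Python. The $50-per-seat price, the 20-seat deployment and the 70/30 terms are assumptions for illustration only, since Google has announced no actual terms.

```python
# Hypothetical illustration of a 70/30 revenue split.
# Google has announced no actual terms for the rumored store.

def split_revenue(sale_price, developer_share=0.70):
    """Return (developer_cut, store_cut) for a single sale."""
    developer_cut = round(sale_price * developer_share, 2)
    store_cut = round(sale_price - developer_cut, 2)
    return developer_cut, store_cut

# Example: a hypothetical $50-per-seat add-on sold to a 20-seat Google Apps customer.
price = 50.00 * 20
dev_cut, store_cut = split_revenue(price)
print(f"Sale: ${price:.2f}  developer: ${dev_cut:.2f}  store: ${store_cut:.2f}")
# Sale: $1000.00  developer: $700.00  store: $300.00
```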
As iTunes was the "secret sauce" that helped propel the iPod to prominence, and as the App Store has been the surprise attraction of the iPhone, app stores might provide similar value for other service and device providers.
Labels:
Google Apps,
software as a service
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
99% of BitTorrent Content Illegal?
A new survey suggests that about 99 percent of available BitTorrent content violates copyright laws, says Sauhard Sahi, a Princeton University student who conducted the analysis.
Some question the methodology, pointing out that the study only looks at content that is available, not content transferred. That might not be such a big distinction, though. Copyright holders are growing more insistent that Internet service providers actively block delivery or sending of such illegal material.
That, in turn, raises lots of issues. BitTorrent can be used in legal ways, so blocking all torrents clearly violates Federal Communications Commission guidelines about use of legal applications on the Internet. That said, the fact that the overwhelming majority of BitTorrent files consist of copyrighted material raises huge potential issues for ISPs that might be asked to act as policemen.
The study does not claim to make judgments about how much copyrighted content actually is downloaded. But it stands to reason that if such an overwhelming percentage of available material is copyrighted, most uploads and downloads will involve infringing content.
The study classified a file as likely non-infringing if it appeared to be in the public domain, freely available through legitimate channels, or user-generated content.
By this definition, all of the 476 movies or TV shows in the sample were found to be likely infringing.
The study also found seven of the 148 files in the games and software category to be likely non-infringing, including two Linux distributions, free plug-in packs for games, and free and beta software.
In the pornography category, one of the 145 files claimed to be an amateur video, and the study gave it the benefit of the doubt as likely non-infringing.
All of the 98 music torrents were likely infringing. Two of the fifteen files in the books/guides category seemed to be likely non-infringing.
"Overall, we classified ten of the 1021 files, or approximately one percent, as likely non-infringing," Sahi says.
"This result should be interpreted with caution, as we may have missed some non-infringing files, and our sample is of files available, not files actually downloaded," Sahi says. "Still, the result suggests strongly that copyright infringement is widespread among BitTorrent users."
Labels:
BitTorrent,
network neutrality,
P2P,
regulation
Monday, February 1, 2010
Private Line Market Starts Decline
After years of steady growth, the $34 billion private line services market is entering a period of declining revenue, says Insight Research. It could hardly be otherwise. Just as IP-based services are displacing TDM-based voice, so IP-based and Ethernet-based bandwidth services are displacing SONET bandwidth services, frame relay and ATM services.
U.S. enterprises and consumers are expected to spend more than $27 billion over the next five years on Ethernet services provided by carriers, Insight Research predicts. With metro-area and wide-area Ethernet services now available from virtually all major data service providers, the market is expected to grow at a compound annual rate of over 25 percent, from $2.4 billion in 2009 to nearly $7.8 billion by 2014.
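As a sanity check, the compound annual growth rate implied by Insight's own endpoints ($2.4 billion in 2009 growing to $7.8 billion by 2014) works out to roughly 27 percent, consistent with the "over 25 percent" characterization:

```python
# Back-of-the-envelope check of the compound annual growth rate implied
# by Insight Research's carrier Ethernet forecast.
start_revenue = 2.4   # $ billions, 2009
end_revenue = 7.8     # $ billions, 2014
years = 5

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ~26.6 percent, i.e. "over 25 percent"
```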
The decline in revenue will continue from 2009 to 2012. But Insight Research also believes private line revenues will tick up a bit after 2012, presumably as additional applications drive demand for more bandwidth. Why the growth would not come in the form of alternative IP bandwidth is not precisely clear, though.
Insight believes additional demand for wireless backhaul and video will lead to more buying of SONET products. Some of us would disagree, but we shall see.
"The transition away from frame and ATM will put a break on overall private line industry revenue growth for a couple of years," says Robert Rosenberg, company president . "However, private line demand remains strong for wireless backhaul, local bandwidth for caching IPTV video services, and for facilitating VoIP."
Labels:
capacity,
Ethernet,
private line,
sonet
Google Nexus One for AT&T?
A device that's almost certainly an AT&T-compatible version of the Google Nexus One has been approved by the Federal Communications Commission. The version now sold by Google works on all T-Mobile USA 3G spectrum, but not on all AT&T 3G bands.
Versions running on Verizon's CDMA air interface and also for Vodafone are expected at some point.
Both the Nexus One and the newly approved phone are being made by HTC. And while the name of the product in question isn't given, its model number is 99110. The model number for the current version of Google's smartphone is 99100. The numbers are so close it seems very likely the two devices are from the same series.
Sunday, January 31, 2010
Fundamental Changes to PSTN: What Would You Do?
Legacy regulation doesn't make much sense in the context of a new, non-legacy "public switched network." Nor do legacy concepts work very well in a communications market that changes faster than regulators can keep pace with, both in terms of technology and the more important changes in business models.
In a world of loosely coupled applications, old common carrier rules don't make as much sense. Nor is it easy to craft durable rules when perceptions of end-user value, which relate directly to revenue streams, are changing rapidly and are anything but stable.
Consider the public policy goal of ensuring ubiquitous broadband capability within a competitive framework, to promote the fastest rate of application creation and development, under circumstances where the government has neither the financial resources nor the ability to build such networks itself.
The typical way one might approach the problem is to regulate intramodally, treating wired access providers as the relevant domain. The other way is to regulate intermodally, comparing all broadband access providers, irrespective of network technology.
Then consider how a major broadband provider might look at the same problem. No wired services provider, as a practical matter, is allowed for reasons of antitrust to serve more than about 30 percent of total potential U.S. customers. Mobile providers are allowed, indeed encouraged, to serve 100 percent of potential customers, if possible.
Would a provider rationally want to invest to compete for 30 percent of customers on a landline basis, or 100 percent, using wireless?
Ignoring for the moment the historically different regulatory treatment of wired and wireless networks, is it rational, in the new context, to spend effort and investment capital chasing a 30-percent market opportunity, or is it more rational to chase a 100-percent opportunity?
Granted, network platforms are not "equal." Satellite broadband networks have some limitations, both in terms of potential bandwidth and network architecture, compared to wired networks.
Mobile networks have some advantages and disadvantages compared to fixed networks. Mobility is the upside; spectrum limitations impose some bandwidth constraints. But fourth-generation networks can deliver sufficient bandwidth to compete as functional substitutes for many fixed applications.
Verizon has already said it will launch LTE at somewhere between 5 Mbps and 12 Mbps downstream. LTE theoretically is capable of speeds up to 80 Mbps, but that assumes light subscriber load and short distances from cell towers.
The point is simply that discussions about national broadband frameworks will have to open some cans of worms. It is a legitimate national policy goal to foster ubiquitous, high-quality broadband access.
It may not be equally obvious that the best way to do so is to impose "legacy" style regulations that impede robust mobile capital investment and business strategies. That isn't to discount the value of fixed broadband connections. Indeed, broadband offload to the fixed network could play an invaluable role for mobile providers.
Still, aligning policy, capital investment and business strategy will be somewhat tricky.
Labels:
broadband,
business model,
regulation
Apple is Now a Mobile Company
The iPhone now is Apple's biggest business, and it was a "zero" revenue contributor three years ago. Where Apple had fourth-quarter 2009 Mac revenue of $4.5 billion, it had iPhone revenue of $5.6 billion, up 90 percent year over year. The iPod contributed $3.4 billion in revenue.
Even if one assumes no Mac revenue is attributable to portable devices, iPhone and iPod revenue from fully mobile devices amounts to $9 billion of a total $13.5 billion in quarterly revenue, or two-thirds of the total.
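The arithmetic behind that "two-thirds" figure, using the revenue numbers cited above:

```python
# Apple quarterly revenue figures cited above, in $ billions.
iphone_revenue = 5.6
ipod_revenue = 3.4
total_revenue = 13.5

mobile_revenue = iphone_revenue + ipod_revenue
print(f"Mobile devices: ${mobile_revenue:.1f}B of ${total_revenue}B "
      f"= {mobile_revenue / total_revenue:.0%} of total")
# Mobile devices: $9.0B of $13.5B = 67% of total
```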
Friday, January 29, 2010
Voice as a "Spice"
Consultant Thomas Howe describes the way voice can work in a new context by calling it the equivalent of "spice." In other words, it might often be the case that, within the context of an enterprise application, voice is a feature used to enhance a process, rather than a stand-alone function or application.
Click-to-call is one example, and most people would agree that is the case. What remains unclear, at least for service providers that will continue to make significant revenue selling voice as a stand-alone service, is whether "spice" is a business for them. In some cases it will be; in other cases it will not.
Whether "spice" becomes an interesting revenue stream for service providers depends on whether they can figure out ways to combine traditional calling functions with enterprise application features, integrating "calling" with information relevant to the call that is valuable to the enterprise and worth paying for, from the corporation's point of view.
Monetizing such "hard to replicate" data by combining it with voice is where telcos have a great opportunity to grow, says Howe. There are many areas where only telcos can deliver voice and have the information that will add value to the call, such as authentication, location, even availability.
The issue is that many other providers in the business ecosystem also have the ability to integrate such functions in new ways. Google and Apple, for example, may well be able to leverage "location" information without needing the assistance or permission of the service provider.
Still, it should be possible to create services that confirm a person is home to receive a delivery, or to assist in scheduling at-home or at-office appointments.
Identity authentication, more than simply location or "phone number" identity, might be useful for transactions as well.
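To make the "voice as a feature" idea concrete, here is a minimal sketch of how an enterprise delivery-scheduling application might embed presence checking and click-to-call. The api.example-telco.com endpoint, the /presence and /calls resources, and the field names are all hypothetical, invented for illustration; they do not describe any real carrier API.

```python
# Hypothetical sketch: embedding presence and click-to-call inside an
# enterprise delivery-scheduling workflow. The TELCO_API URL, the
# /presence and /calls resources, and the response fields are invented
# for illustration and do not describe any real carrier API.
import requests

TELCO_API = "https://api.example-telco.com/v1"   # hypothetical endpoint
API_KEY = "replace-with-credentials"

def confirm_presence(customer_number):
    """Ask the (hypothetical) carrier whether the customer appears reachable."""
    resp = requests.get(
        f"{TELCO_API}/presence/{customer_number}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("reachable", False)

def click_to_call(agent_number, customer_number):
    """Bridge an agent and a customer; returns a call identifier."""
    resp = requests.post(
        f"{TELCO_API}/calls",
        json={"from": agent_number, "to": customer_number},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["call_id"]

# Inside the scheduling app: place the confirmation call only if the
# customer looks reachable, so voice is a feature of the workflow
# rather than a separate application.
if confirm_presence("+15551230000"):
    call_id = click_to_call("+15559870000", "+15551230000")
    print(f"Confirmation call placed: {call_id}")
```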
Labels:
hosted VoIP,
IP telephony,
Voice 2.0