A new survey suggests that about 99 percent of available BitTorrent content violates copyright laws, says Sauhard Sahi, a Princeton University student who conducted the analysis.
Some question the methodology, pointing out that the study only looks at content that is available, not content transferred. That might not be such a big distinction, though. Copyright holders are growing more insistent that Internet service providers actively block delivery or sending of such illegal material.
That, in turn, raises lots of issues. BitTorrent can be used in legal ways, so blocking all torrents clearly violates Federal Communications Commission guidelines about use of legal applications on the Internet. That said, the fact that the overwhelming majority of BitTorrent files consist of copyrighted material raises huge potential issues for ISPs that might be asked to act as policemen.
The study does not claim to make judgments about how much copyrighted content actually is downloaded. But it stands to reason that if such an overwhelming percentage of available material is copyrighted, most uploads and downloads will be of infringing content.
The study classified a file as likely non-infringing if it appeared to be in the public domain, freely available through legitimate channels, or user-generated content.
By this definition, all of the 476 movies or TV shows in the sample were found to be likely infringing.
The study also found seven of the 148 files in the games and software category to be likely non-infringing—including two Linux distributions, free plug-in packs for games, as well as free and beta software.
In the pornography category, one of the 145 files claimed to be an amateur video, and the study gave it the benefit of the doubt as likely non-infringing.
All of the 98 music torrents were likely infringing. Two of the 15 files in the books/guides category seemed to be likely non-infringing.
"Overall, we classified ten of the 1021 files, or approximately one percent, as likely non-infringing," Sahi says.
"This result should be interpreted with caution, as we may have missed some non-infringing files, and our sample is of files available, not files actually downloaded," Sahi says. "Still, the result suggests strongly that copyright infringement is widespread among BitTorrent users."
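The headline "one percent" figure is simple arithmetic over the per-category tallies quoted above. A minimal sketch (the category counts below are the ones quoted in this post; they cover 882 of the 1,021 sampled files, with the remainder in categories not itemized here):

```python
# Per-category (sample size, likely non-infringing count),
# as quoted in this post from Sahi's analysis.
categories = {
    "movies/TV":      (476, 0),
    "games/software": (148, 7),
    "pornography":    (145, 1),
    "music":          (98, 0),
    "books/guides":   (15, 2),
}

sample_total = 1021  # total files in the study's sample

itemized = sum(n for n, _ in categories.values())
non_infringing = sum(k for _, k in categories.values())

# Categories quoted here cover 882 of the 1021 sampled files;
# the remainder fall in categories not itemized in this post.
share = non_infringing / sample_total
print(f"{non_infringing} of {sample_total} files ({share:.1%}) likely non-infringing")
```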
Tuesday, February 2, 2010
99% of BitTorrent Content Illegal?
Labels:
BitTorrent,
network neutrality,
P2P,
regulation
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Sunday, January 10, 2010
FCC has No Current Authority to Impose Network Neutrality Rules?
The U.S. Federal Appeals Court reviewing whether the Federal Communications Commission currently has authority to create or enforce "network neutrality" rules has not yet ruled.
But initial questioning suggests the court doubts whether the Federal Communications Commission has current jurisdiction to write, much less enforce, net-neutrality rules for the Internet. So some legal observers now suggest the appeals court will in fact rule that the FCC had no authority to sanction Comcast for the way it managed peer-to-peer services.
A 2008 FCC order forced Comcast to stop throttling BitTorrent applications as a means of managing network congestion.
U.S. Court of Appeals for the District of Columbia Circuit Judge Raymond Randolph pointed out to an FCC attorney that “you have yet to identify a specific statute.”
Since Congress has passed no laws relating to network neutrality, the FCC had, and has, no authority to take action on the matter, the judge seems to suggest.
A ruling of that sort would at least temporarily delay any new efforts by the FCC to codify new network neutrality rules, and shift the battle over such rules to the Congress.
FCC Chairman Julius Genachowski has argued the agency has authority to set net-neutrality rules under the "Internet Freedoms Principles" adopted in 2005, which say users have the right to run lawful applications. P2P is such an application, though it sometimes is used to transfer copyrighted content without permission.
But Comcast argues it has the right to manage its network, which it interprets as permitting rate limiting of P2P services, when necessary to preserve user experience and relieve congestion.
To be sure, the specific issue at hand seems primarily about whether the FCC’s decision was improper for statutory reasons, as Congress has not given the FCC legislative permission to create such rules, observers say.
On a wider legislative front, some observers think the White House is dialing back its efforts to get "strong" network neutrality rules adopted. The evidence is indirect, but some point to the late-October resignation of Susan Crawford, University of Michigan law professor, previously a key adviser to the president on technology and communications, and a proponent of "strong" network neutrality rules.
According to the American Spectator, Crawford's version of Net neutrality was too radical for White House economic adviser Lawrence Summers, contributing to her early departure. If that observation is correct, it would be a sign that any new rules would not strictly ban "every" form of packet prioritization.
Many observers note that quality of service measures typically are needed when users want to interact with important video or voice services, especially as video already has become the primary driver of bandwidth consumption on a global level.
Those observers also would note that strict versions of net neutrality, which would ban any packet prioritization, would prevent Internet access providers from applying prioritization on behalf of their users, even when those users specifically ask for, and want, such prioritization.
"Packet discrimination" sounds bad, and is, when it is used as a business weapon, allowing unfair competition. But packet discrimination is a good thing when it helps maintain quality of experience for the emerging applications users say are important, especially video and voice.
Also, at the recent Consumer Electronics Show, White House deputy CTO Andrew McLaughlin said the FCC had yet to determine whether Net neutrality is needed to preserve the "open Internet."
If that seems unremarkable, consider that in 2009 McLaughlin had said network management practices of cable companies that limited the speeds of large file downloads were essentially the same thing as Chinese-style Internet censorship.
Management of bandwidth-heavy applications by some users at times of network congestion is not application "blocking" or censorship. It is an effort to maintain quality of service for most users. Some methods will be more palatable than others.
The analogy is access to the old voice network. Telcos do not "censor" speech when, at times of peak load, a user encounters a "fast busy" signal indicating that no circuits are available. The point is that every network gets congested at least some of the time.
And it always has been recognized that some method of regulating access at such times is a legitimate network management matter. In fact, a fast busy tone does mean a user has temporarily been "blocked" from the network. Sometimes a mobile voice call experiences the same sort of temporary blocking.
That sort of access blocking is not any suppression of freedom of communication or expression. It is not an infringement of Internet freedom. It is a simple way of managing a congested resource at times of peak load.
The immediate matter at hand, though, is legislatively granted authority. The appeals court seems to be signaling its belief that Congress has granted the FCC no authority to impose rules about network congestion management or methods of doing so.
Labels:
comcast,
FCC,
network neutrality,
P2P
Wednesday, October 28, 2009
Real-Time Internet Traffic Doubles
Real-time entertainment has almost doubled its share of total Internet traffic from 2008 to 2009, while gaming has increased its share by more than 50 percent, says Sandvine. Real-time entertainment traffic (streaming audio and video, peer-casting, place-shifting, Flash video) now accounts for 26.6 percent of total traffic, up from 12.6 percent in 2008, according to the new analysis.
As the percentage of real-time video and voice traffic continues to grow, latency issues will become more visible to end users, and will prompt new efforts by Internet access providers to provide better control of quality issues not related directly to bandwidth.
One reason is that video downloads, for example, are declining in favor of real-time streaming. Downloaded content is less susceptible to latency and jitter impairments.
Traffic to and from gaming consoles increased by more than 50 percent per subscriber as well, demonstrating not only the popularity of online gaming, but also the growing use of game consoles as sources of “traditional” entertainment such as movies and TV shows, says Sandvine.
Gaming, especially fast-paced action games, likewise is susceptible to experience impairment caused by latency and jitter.
The growth of real-time entertainment consumption also is leading to a decline in peer-to-peer traffic. At a global level, P2P file-sharing declined by 25 percent as a share of total traffic, to account for just over 20 percent of total bytes, says Sandvine.
The changes have key implications for ISPs and end users. One way to protect real-time service performance for applications such as voice, video, videoconferencing and gaming is to take extra measures to protect latency performance for such real-time applications. And that is where clumsy new network neutrality rules might be a problem.
Whatever else might be said, user experience can be optimized at times of peak congestion by prioritizing delivery of real-time packets, compared to other types of traffic that are more robust in the face of packet delay. File downloads, email and Web surfing are examples of activities that are robust in the face of congestion.
So it matters greatly whether ISPs can condition end user traffic--especially with user consent--to maintain top priority for streaming video, voice or other real-time applications when networks are congested. Enterprises do this all the time. It would be a shame if consumers were denied the choice to benefit as well.
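The prioritization idea described above amounts to a two-queue scheduler at a congested link: real-time packets (voice, video, gaming) drain first, while delay-tolerant traffic (downloads, email, web) waits. A toy illustration of strict-priority queuing, not any ISP's actual implementation:

```python
from collections import deque

# Toy strict-priority scheduler: real-time packets always drain
# before bulk (delay-tolerant) packets at a congested link.
class PriorityLink:
    def __init__(self):
        self.realtime = deque()  # voice, video, gaming
        self.bulk = deque()      # downloads, email, web surfing

    def enqueue(self, packet, realtime=False):
        (self.realtime if realtime else self.bulk).append(packet)

    def transmit(self):
        """Send one packet: real-time traffic has strict priority."""
        if self.realtime:
            return self.realtime.popleft()
        if self.bulk:
            return self.bulk.popleft()
        return None  # link idle

link = PriorityLink()
link.enqueue("email-1")
link.enqueue("voip-1", realtime=True)
link.enqueue("download-1")
link.enqueue("video-1", realtime=True)

order = [link.transmit() for _ in range(4)]
print(order)  # ['voip-1', 'video-1', 'email-1', 'download-1']
```

Real routers use subtler schemes (weighted fair queuing, for example) so bulk traffic is never starved outright, but the principle is the same: latency-sensitive packets jump the queue only when the link is congested.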
Labels:
consumer VoIP,
network neutrality,
online video,
P2P
Wednesday, October 14, 2009
Peer-to-peer Wi-Fi: Bluetooth Killer?
A new peer-to-peer Wi-Fi specification sponsored by the Wi-Fi Alliance will enable Wi-Fi devices to connect to one another directly without joining a traditional home, office, or hotspot network.
The Wi-Fi Alliance expects to begin certification for the new specification in mid-2010; products that achieve certification will be designated "Wi-Fi CERTIFIED Wi-Fi Direct."
The specification can be implemented in any Wi-Fi device, from mobile phones, cameras, printers, and notebook computers, to human interface devices such as keyboards and headphones.
Significantly, devices that have been certified to the new specification will also be able to create connections with hundreds of millions of Wi-Fi CERTIFIED legacy devices already in use.
Devices will be able to make a one-to-one connection, or a group of several devices can connect simultaneously.
The specification targets both consumer electronics and enterprise applications, provides management features for enterprise environments, and includes WPA2 security. Devices that support the specification will be able to discover one another and advertise available services.
Wi-Fi CERTIFIED Wi-Fi Direct devices will support typical Wi-Fi ranges and the same data rates as can be achieved with an infrastructure connection, so devices can connect from across a home or office and conduct bandwidth-hungry tasks with ease.
Though some might fear the specification will damage sales of Wi-Fi access points, the new P2P networking technique seems more a threat to near-field standards such as Bluetooth. For some applications, such as file sharing, Wi-Fi's longer range will make it a better option than Bluetooth for public near-field communications.
Such proximity marketing techniques sometimes are used to allow users to interact with electronic billboards, for example. P2P Wi-Fi ought to be easier to use, and also will have greater range.
Friday, March 14, 2008
More Online Video, More Managed P2P
Online video sites have delivered promising stats recently, says Compete.com analyst Aniya Zaozerskaya. Netflix’s Watch Now, which allows subscribers to any Netflix plan to watch full-length movies and TV episodes online, had 69 percent more people using the service this quarter compared with last quarter.
Veoh.com, which allows users to view and share short YouTube-like videos as well as stream full-length TV show episodes, has grown from just under 1.5 million unique visitors one year ago to over six million in February 2008.
Hulu.com, a newer site offering both full-length movies and TV shows, including the most recent in-season episodes, also is gaining traction, she says.
Assuming peer-to-peer applications are deemed lawful, and therefore not to be blocked--and that seems a certainty--managed P2P services would seem to be poised for growth.
One reason P2P chews up so much bandwidth on service provider backbones is the unmanaged way P2P traditionally operates. Bits of content might be fetched from long distances when the same material actually resides on a user hard drive someplace local.
So far, it appears, managing P2P streams can reduce overall backbone network traffic by 60 percent or more, executives at Pando Networks and Verizon Communications say.
Network-aware versions of P2P, which can fetch data from local sources rather than reaching far across the network, can help in that regard.
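The "network-aware" approach is essentially a peer-selection heuristic: when several peers hold the same piece, prefer ones on your own ISP or in your own metro area so the traffic never touches the long-haul backbone. A hypothetical sketch (the peer fields and ranking are illustrative only, not the Pando/Verizon design):

```python
# Hypothetical locality-aware peer selection. Prefer peers on the
# same ISP, then the same metro area, before reaching across the
# backbone. Field names and rank values are illustrative.
def peer_rank(peer, me):
    if peer["isp"] == me["isp"]:
        return 0  # best: traffic stays inside the ISP's network
    if peer["metro"] == me["metro"]:
        return 1  # good: stays within the metro area
    return 2      # worst: long-haul backbone transit

def select_peers(candidates, me, k=2):
    """Pick the k 'closest' peers holding the piece we want."""
    return sorted(candidates, key=lambda p: peer_rank(p, me))[:k]

me = {"isp": "ISP-A", "metro": "Denver"}
candidates = [
    {"id": "p1", "isp": "ISP-B", "metro": "Tokyo"},
    {"id": "p2", "isp": "ISP-A", "metro": "Denver"},
    {"id": "p3", "isp": "ISP-B", "metro": "Denver"},
]

best = select_peers(candidates, me)
print([p["id"] for p in best])  # ['p2', 'p3']
```

The 60-percent backbone savings claimed above comes from exactly this kind of substitution: a distant fetch replaced by a local one whenever a local copy exists.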
Labels:
P2P
Friday, January 11, 2008
Mobile VoIP Proliferates
One wonders how long mobile carriers will wait before launching their own lower-cost global calling plans. At some point they will. The only issue is how much market share they are willing to tolerate losing to VoIP providers before they counterattack. Raketu is the latest contestant in the business calling space, by virtue of its compatibility with RIM BlackBerry devices.
What is emerging now is the IP equivalent of the "over the top long distance" calling plans that once were prevalent in the U.S. market. Under such plans, created in large part for reasons of regulatory compliance, users selected one provider for local calling and another for long distance. At one point, users could not buy long distance from their local voice carrier at all.
So you see the business effect: a regulatory framework creates an entire "long distance calling" business. It lasts for a while, as competition knocks prices way down. Then, at some point, regulators decide markets are competitive enough to allow the local phone companies back into long distance.
And then the independent long distance industry collapses.
VoIP over mobile, indeed VoIP itself, is headed for such a day of reckoning, at least for that portion of its use as a substitute for landline or wireless calling. Nobody knows when the day will come. It might come carrier by carrier. But at some point, mobile and wired service providers are going to reach a point where it makes sense to offer much-lower global calling from their existing services and devices.
That isn't to say independents will not gain share and build businesses in the short term. Nor is it to say that VoIP features embedded into other experiences are equally susceptible to telco repositioning and pricing. It is to say that past telco responses to regulatory and technological change offer some obvious clues about what they will do in the future.
As scale players, they tend to ignore new threats and markets until some critical mass or clear strategic interest emerges. Then they move, and fairly quickly. They'll do so again.
Labels:
business VoIP,
mobile VoIP,
P2P
Tuesday, January 8, 2008
FCC to Look at Traffic Shaping
The Associated Press says the Federal Communications Commission will investigate complaints that Comcast Corp. actively interferes with Internet traffic as its subscribers try to share files online.
This should be very interesting. On one hand, there is an issue about packet blocking. On the other hand, there is an issue of exposure to copyright law, since much of the peer-to-peer traffic that Comcast and others appear to be blocking infringes copyright.
A coalition of consumer groups and legal scholars asked the agency in November to stop Comcast from discriminating against certain types of data. Two groups also asked the FCC to fine Comcast at a rate of $195,000 for every affected subscriber.
It is possible there are two intertwined issues here: packet blocking and copyright violations. The former might be technologically necessary to prevent the latter.
Labels:
FCC,
network neutrality,
P2P
Friday, August 31, 2007
Defanged Skype
For all the fear Skype and other IM-based and peer to peer voice applications and services have created in the broader service provider industry, Skype seems to have crested. Skype still has lots of registered users, but they don't seem to be calling and using Skype chat as much as they used to.
Remember the concern municipal Wi-Fi networks raised just two years ago? Telcos and cable companies were worried muni Wi-Fi would cannibalize cable modem and Digital Subscriber Line services. And dare we even mention Vonage and other independent VoIP providers.
In fact, the only threat that really has materialized is cable companies. At least in North America, cable companies have emerged as the most serious threat to wireline voice and broadband Internet access revenue streams. Everything else essentially has remained a flea bite.
On the video and audio content side, remember the hackles BitTorrent and Kazaa raised? Now we have iTunes, Joost and a legal BitTorrent working with content owners.
So what conclusions should one draw from all of this? Probably that "disrupting" powerful incumbents is going to be much harder than attackers once had believed. Bandwidth exchanges thought they'd reshape interconnection. Competitive local exchange carriers thought they'd capture a goodly portion of the wireline voice market. Independent DSL providers thought they'd catch the telcos sleeping. Internet Service Providers thought the same about dial-up.
Turns out incumbents have more resiliency than anybody might have thought.
Labels:
BitTorrent,
cable modem,
DSL,
IM,
P2P,
Skype,
VoIP
Sunday, August 19, 2007
BitTorrent Throttled by Comcast
Internet Service Providers don't like BitTorrent because it undermines their business model (flat-rate access) and stresses the part of their network most vulnerable to heavy usage (the upstream). Many ISPs simply limit the bandwidth available to BitTorrent traffic. Cable operators, a group that now seems to include Comcast, go a bit further and disrupt the "seeding" process that allows BitTorrent peers to act as better upload nodes. In Canada, Cogeco and Rogers Cablesystems also "step on" BitTorrent traffic.
If P2P traffic keeps growing the way Cisco predicts, and if no changes are made in the dominant retail pricing model, throttling of P2P applications will happen on a wider scale. P2P attacks network capacity at its weakest link.
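The bandwidth-limiting half of this is conceptually simple. One common mechanism is a token bucket: a throttled flow may only send while tokens remain, and tokens refill at the capped rate. The sketch below is a hypothetical illustration of that idea (the class name and rates are invented for the example), not a description of any particular ISP's equipment:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: a packet on a throttled P2P flow
    is forwarded only if enough tokens have accumulated."""
    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec   # sustained cap
        self.capacity = burst_bytes      # maximum burst allowed
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # refill proportionally to elapsed time, up to the burst cap
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True    # forward the packet
        return False       # drop or delay: the flow is throttled

# e.g. cap an upload flow at 64 KB/s with an 8 KB burst allowance
bucket = TokenBucket(rate_bytes_per_sec=64_000, burst_bytes=8_000)
```

Disrupting seeding, by contrast, is more aggressive than rate-limiting: it interferes with the sessions themselves rather than merely slowing them.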
Cisco Predicts Exabyte Networks
Cisco's recent forecast of global IP bandwidth consumption suggests a 37 percent compound annual growth rate between 2006 and 2011, or about five times the 2006 level. That's aggressive, but you might expect that. You might even have expected the prediction that consumer usage will outstrip business usage, though business dominates at the moment. You wouldn't be surprised at all to learn that video will drive overall global usage.
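The two figures are consistent with each other: 37 percent growth compounded over the five years from 2006 to 2011 works out to roughly a fivefold increase, as a quick calculation shows:

```python
# 37% compound annual growth over 2006-2011 (five growth years)
growth = 1.37 ** 5
print(round(growth, 2))  # 4.83, i.e. "about five times" 2006 traffic
```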
You wouldn't necessarily be surprised to learn that Cisco forecasts at least 60 percent of all traffic will be commercial video delivered in the form of walled garden services. And a significant percentage of the remaining 40 percent of IP bandwidth will be consumed by IP-based video applications.
The next network, in other words, will be a video network that also carries voice and non-real-time data.
That would be a stunning change from the originally envisioned view of the Internet. But I think we have to recognize at this point that virtually none of the key developments in communications technology have developed as industry insiders, public policy proponents, technologists or entrepreneurs had supposed.
To be sure, all of the diligent work on Session Initiation Protocol will have a significant payoff. But that didn't stop Skype from rocketing past SIP using a proprietary approach.
The Telecommunications Act of 1996 was supposed to lead to an explosion of innovation by dismantling restrictions on "who" could be a provider of Class 5 switch services. Instead, innovation came from the Web. Perhaps despite the Telecom Act, rather than because of it, all sorts of innovation have happened.
VoIP was supposed to transform the nature of communications. Instead, mobility, instant messaging and social networks are doing so. One might arguably look to all manner of text communications as the disruptive communications development of the past several decades, not voice.
And then there's electronic numbering (ENUM) and voice peering. Perhaps these approaches still will have some dramatic impact on global voice communications prices and on the ability to circumvent the "public network." But it's starting to look as though ENUM might simply provide a next-generation version of the Signaling System 7 function. That's not to say it is unimportant: only that it was not what many had intended or expected.
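ENUM's core mechanism is worth showing, because it is essentially just DNS: an E.164 phone number becomes a domain name whose NAPTR records point to URIs (SIP addresses, mailto links, and so on). The number-to-domain mapping defined in RFC 6116 is simple enough to sketch directly:

```python
def enum_domain(e164_number: str) -> str:
    """Map an E.164 phone number to its ENUM DNS domain (RFC 6116):
    strip non-digits, reverse the digits, dot-separate them,
    and append the e164.arpa zone."""
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

print(enum_domain("+1-555-123-4567"))
# 7.6.5.4.3.2.1.5.5.5.1.e164.arpa
```

A resolver would then query that domain for NAPTR records to discover how to reach the number over IP, which is why ENUM looks so much like a routing and signaling directory, the role SS7 plays in the legacy network.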
So far, it would seem that the most disruptive impact of the whole basket of new technologies has been to disrupt our ability to predict the future. We've been wrong more than right, as we always are. IP networks are not now, and never will be, as closed as the old public network was. Neither are IP networks going to be "open," any-to-any networks in the old manner, with no intelligence or policies operating in the core of the network.
Lots of things can, and should, be done "at the edge." But increasingly, lots of things cannot. The transition of the global IP network to video also means a shift to real time services (and we aren't even talking about the same process at work for voice and visual collaboration). That spells the end of the completely "dumb network."
Labels:
business VoIP,
Cisco,
Internet,
IP communications,
IP networks,
P2P,
video
Tuesday, August 14, 2007
Yoomba Hits 500,000 Users
Yoomba Ltd. says it now has 500,000 users, about a month after officially launching.
Yoomba’s peer-to-peer application sits on top of any email network and turns any email address into a phone or instant messenger. Once Yoomba is activated, buttons appear inside a user’s chosen email application, providing one-click access to talk to friends, family or colleagues around the world, on any network, for free.
It works, though users may notice some slowdown of their email client. That, at least, is what seems to happen when Yoomba runs over Microsoft Office.