
Tuesday, June 29, 2010

Is Bit Prioritization Necessary?

Network neutrality proponents, especially those supporting the "strong" forms such as an outright ban on any bit priorities, believe that next generation networks will have ample bandwidth to support all real-time services without the need for prioritizing mechanisms.

Users of enterprise networks might react in shock to such notions, as shared networks often encounter latency and bandwidth constraints that are overcome precisely by policy control. And despite increasing bandwidth on mobile networks, users and network service operators already know that congestion is a major problem.
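The "policy control" that enterprise networks rely on is, at its simplest, class-based queuing: packets marked as real-time are serviced ahead of bulk traffic. A minimal sketch of that idea, assuming just two traffic classes and a strict-priority dequeue rule (illustrative only, not any particular vendor's implementation):

```python
from collections import deque

class StrictPriorityScheduler:
    """Two-class strict-priority transmit queue: real-time packets
    always dequeue ahead of bulk packets (hypothetical sketch)."""

    def __init__(self):
        self.realtime = deque()  # high-priority class (e.g. VoIP)
        self.bulk = deque()      # best-effort class (e.g. downloads)

    def enqueue(self, packet, realtime=False):
        # Classification would normally come from DSCP marks or similar;
        # here the caller just flags the class directly.
        (self.realtime if realtime else self.bulk).append(packet)

    def dequeue(self):
        # Serve the real-time class first; bulk only when it is empty.
        if self.realtime:
            return self.realtime.popleft()
        if self.bulk:
            return self.bulk.popleft()
        return None

# Even with 100 bulk packets already queued, a newly arrived
# real-time packet goes out on the next transmit opportunity.
scheduler = StrictPriorityScheduler()
for i in range(100):
    scheduler.enqueue(f"bulk-{i}")
scheduler.enqueue("voip-0", realtime=True)
first_out = scheduler.dequeue()
```

With a plain FIFO, `voip-0` would wait behind all 100 bulk packets; with strict priority it transmits first, which is exactly the latency guarantee that a no-prioritization rule would forbid.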

Nor does the evidence support the notion that applications are unaffected by congestion, or that running two or more applications at once creates no externalities that impair real-time application performance.

"I measured my jitter while using Netflix (jitter occurs when an application monopolizes a router's transmit queue and demands that hundreds of its own packets are serviced before any other application gets a single packet transmitted) and found an average jitter of 44 milliseconds and a worst case that exceeds 1000 ms," says Ou.
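The mechanism Ou describes is easy to reproduce in a toy simulation: a bulk flow dumps bursts of hundreds of packets into a FIFO transmit queue, a real-time flow sends one packet every 20 ms, and the real-time packets' departure spacing is measured. All of the numbers below (link speed, burst size, intervals) are assumptions for illustration, not Ou's measurement setup:

```python
from collections import deque

LINK_PACKET_TIME_MS = 0.5  # assumed time to transmit one packet
BURST_SIZE = 200           # assumed bulk burst: 200 packets at once
VOIP_INTERVAL_MS = 20.0    # real-time flow sends a packet every 20 ms

def simulate_fifo_jitter(duration_ms=2000.0, burst_every_ms=500.0):
    """Simulate a single FIFO transmit queue shared by a bursty bulk
    flow and a periodic real-time flow; return (max, mean) deviation
    of real-time departure gaps from the nominal 20 ms interval."""
    queue = deque()
    departures = []          # departure times of real-time packets
    now = 0.0
    next_voip = 0.0
    next_burst = 0.0
    while now < duration_ms:
        # Enqueue every arrival due by the current time, in order.
        while next_voip <= now:
            queue.append(("voip", next_voip))
            next_voip += VOIP_INTERVAL_MS
        while next_burst <= now:
            queue.extend(("bulk", next_burst) for _ in range(BURST_SIZE))
            next_burst += burst_every_ms
        # FIFO service: one packet per transmit slot, no priorities.
        if queue:
            kind, _ = queue.popleft()
            if kind == "voip":
                departures.append(now + LINK_PACKET_TIME_MS)
        now += LINK_PACKET_TIME_MS
    gaps = [b - a for a, b in zip(departures, departures[1:])]
    deviations = [abs(g - VOIP_INTERVAL_MS) for g in gaps]
    return max(deviations), sum(deviations) / len(deviations)

worst, average = simulate_fifo_jitter()
```

A 200-packet burst takes 100 ms to drain at 0.5 ms per packet, so real-time packets arriving behind it bunch up and then depart back-to-back: the worst-case gap deviation lands well above 50 ms even in this mild scenario, the same queue-monopolization effect behind Ou's 44 ms average and 1000 ms worst case.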

"Tokens" are the New "FLOPS," "MIPS" or "Gbps"

Modern computing has some virtually-universal reference metrics. For Gemini 1.5 and other large language models, tokens are a basic measure...