Saturday, July 10, 2021

Methodology Matters

Most of us, at least when it suits our purposes, believe decision making is enhanced by the availability of good data. And most of us likely would agree that methodology matters when gathering data.


So notes Ookla in its review of broadband speed data cited in a recent congressional report: “Our concern with the rest of the report is that the network performance test results the report was derived from painted an inaccurate picture of what constituents were actually experiencing in the district.”


“The results presented greatly underestimated the speeds being delivered by the service providers throughout most of the study area while overestimating some others,” said Ookla, which compared its own data with that supplied by M-Lab in the report. 


“The speeds measured by Speedtest for the same areas and the same time period are dramatically higher in most areas, indicating that additional infrastructure investments are unnecessary where constituents can already achieve network speeds that meet FCC minimums,” said Ookla. 


There is more than one way to calculate an average.  The “mean” average is the sum of all measurements divided by the number of records used. “This number is valuable, but it can be influenced by a small portion of records that may be extremely high or low (outliers),” said Ookla. “As fiber is installed within an area, a significant number of tests from ultra-high-speed connections can skew mean averages up.”
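

To see how a few fiber-speed results can pull a mean away from the typical user's experience, here is a minimal sketch in Python with invented sample values (the speeds below are hypothetical, not drawn from the Ookla or M-Lab data):

```python
import statistics

# Hypothetical download-speed samples (Mbps) for one area.
# Most tests come from DSL/cable users; a couple come from new fiber lines.
typical_tests = [12, 15, 18, 20, 22, 25, 25, 30]
fiber_outliers = [940, 950]          # a handful of gigabit-class results
all_tests = typical_tests + fiber_outliers

mean_speed = statistics.mean(all_tests)      # pulled up sharply by the outliers
median_speed = statistics.median(all_tests)  # barely moves

print(f"mean:   {mean_speed:.1f} Mbps")    # 205.7 Mbps
print(f"median: {median_speed:.1f} Mbps")  # 23.5 Mbps
```

This is why median figures, like those cited in the comparisons below, are generally a better description of the "typical" speed in an area.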


The opposite can also occur: a measurement method can just as easily drag reported speeds down. “M-Lab vastly under-reported the network throughput in every single ZIP code represented in the congressional report,” Ookla said.


“The ZIP code showing the least amount of difference by percentage between Ookla and M-Lab data was 13803 (Marathon) where M-Lab’s recorded median was 5.5 Mbps and the median from Ookla data was 14.5 Mbps,” Ookla noted. “So the typical speed in Marathon measured by Ookla’s Speedtest was over two and a half times as fast as the average measurement captured by M-Lab.”


“On the other end of the scale, in Whitney Point, M-Lab’s recorded median was 0.9 Mbps while Ookla measured a median of 71 Mbps, almost eighty times faster,” the firm said. 
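

The ratios follow directly from the quoted medians; a quick arithmetic check using the values from the two quotes above:

```python
# Medians quoted by Ookla for the two areas (Mbps).
marathon_ookla, marathon_mlab = 14.5, 5.5          # ZIP 13803 (Marathon)
whitney_point_ookla, whitney_point_mlab = 71.0, 0.9

print(round(marathon_ookla / marathon_mlab, 1))            # 2.6  -> "over two and a half times"
print(round(whitney_point_ookla / whitney_point_mlab, 1))  # 78.9 -> "almost eighty times"
```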


“It is clear from these results that M-Lab’s performance test does not measure the full capacity of a network connection and thus does not accurately reflect the real-world internet speeds consumers are experiencing,” said Ookla.


“These disparities in measured speed generally arise because some network data providers have low user adoption among consumers, limitations in their testing infrastructure, questionable testing methodologies, or inadequate geolocation resources to precisely locate where a given test was taken,” said Ookla. 
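

The “low user adoption” point matters in particular because an area median computed from only a handful of tests is easy to skew. Below is a minimal sketch, with invented areas, speeds, and threshold (none of this reflects Ookla's or M-Lab's actual aggregation pipeline), of how one might flag areas where the sample is too thin to trust:

```python
from collections import defaultdict
from statistics import median

# Hypothetical test records: (area, measured download speed in Mbps).
tests = [
    ("area_a", 22.0), ("area_a", 25.5), ("area_a", 19.8),
    ("area_a", 28.2), ("area_a", 24.1),
    ("area_b", 71.0), ("area_b", 0.9),  # only two samples: the median is fragile
]

MIN_SAMPLES = 5  # illustrative cutoff, not a real Ookla or M-Lab parameter

by_area = defaultdict(list)
for area, mbps in tests:
    by_area[area].append(mbps)

for area, speeds in sorted(by_area.items()):
    note = "" if len(speeds) >= MIN_SAMPLES else " (too few samples to be reliable)"
    print(f"{area}: median {median(speeds):.1f} Mbps from {len(speeds)} tests{note}")
```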

