Earlier this month, Ookla, the company responsible for Speedtest.net (one of the two online speed tests available on the FCC’s Broadband.gov website), started publishing a “Household Promise Index” designed to measure the gap between actual and advertised “up to” broadband speeds. The results? On average, Ookla reports that U.S. consumers are receiving roughly 93% of the advertised speeds on the tiers to which they subscribe. And in many regions, that figure exceeds 100% – i.e., customers are getting faster speeds than the “up to” speeds they’ve signed up for. The Ookla data may help explain why a recent FCC survey found that over 90% of consumers are happy with the broadband speeds they’re receiving.
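Ookla has not published its exact formula here, but the basic idea, comparing each measured test result to the advertised “up to” speed of the subscriber’s tier and averaging the ratios, can be sketched as follows. The function name and data values below are invented for illustration, not Ookla’s actual methodology:

```python
# Hypothetical sketch of a "promise index": measured speed expressed as a
# percentage of the advertised "up to" speed, averaged across tests.
# All names and numbers are illustrative assumptions.

def promise_index(results):
    """Average ratio of measured to advertised speed, as a percentage.

    `results` is a list of (measured_mbps, advertised_mbps) pairs,
    one pair per speed test.
    """
    ratios = [measured / advertised for measured, advertised in results]
    return 100 * sum(ratios) / len(ratios)

# Example: three tests on tiers advertised at 10, 20, and 50 Mbps.
# The second test comes in above its advertised speed, as Ookla reports
# happens in many regions.
tests = [(9.5, 10), (21.0, 20), (44.0, 50)]
print(round(promise_index(tests), 1))  # 96.0
```

On this kind of measure, a region where delivered speeds routinely beat the advertised tier would score above 100, which is how the index can exceed 100% in some areas.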
That sounds like pretty good news – networks are performing by and large as they should and, as a result, consumers are happy. It certainly is welcome (although not surprising) news to NCTA’s members – and should be to policymakers as well. With the FCC and others devoting significant attention to promoting broadband adoption, data that demonstrates the fundamental value proposition of broadband should be tremendously helpful in overcoming the reluctance that some Americans continue to feel about signing up for broadband service.
The new Ookla data provides a sharp contrast to the National Broadband Plan’s estimate, based on data from a company named comScore, of a “50% gap” between actual and advertised broadband speeds. Without going into too much detail (those interested can follow this link), the comScore data was flawed on both ends of the equation. It overestimated the speed of the “service tiers” to which consumers actually subscribed and also underestimated the speeds that consumers actually receive – thereby creating an alleged “gap” between advertised and delivered performance that lacks any sound factual basis. A recent MIT study confirmed that there are a number of “potentially significant sources of measurement error” that caution against using the comScore data to reach any conclusions about ISP service performance.
Admittedly, the Ookla data is not a perfect measure of network performance either. The Ookla data, like the comScore data, is based on user-generated speed tests and suffers from some of the same weaknesses. Both systems measure the long and winding road from a consumer’s computer to a test server somewhere on the Internet. Although ISPs control only a portion of that road, speeds can be impeded anywhere along the route (e.g., within the home computer, the home network, or on the open Internet). But while the Ookla and comScore data share some of the same weaknesses, the MIT study found that “the Ookla/Speedtest test methodology is more likely than the other tests we examine to correspond to the speed of an access link for common network usage patterns.”
To its credit, the FCC has contracted with a company called SamKnows to conduct a hardware-based test that should eliminate many of the problems associated with online speed tests. NCTA and many of our member companies have been working closely with SamKnows and the FCC staff on that testing process, which is expected to begin in the near future.
But as we wait for the collaborative development of even better measures of broadband performance, it’s worth noting Ookla’s independent assessment and the continuing efforts of cable ISPs to meet their customers’ expectations.
NOTE: Steven Morris is Vice President and Associate General Counsel for the National Cable & Telecommunications Association. If you are interested in this topic, you may want to read these related posts:
- Glass 95% Full? The Broadband Report’s Mixed Bag
- Consumers Note Broadband Satisfaction
- Measuring the Speed of Value