Back in 2016, Dutch telecom regulator ACM hired Google to measure Internet speed, empowering consumers to make complaints based on bogus data
Dutch telecom regulator Authority for Consumers and Markets (ACM) announced in 2016 the launch of an Internet speed tool based on MLab, a partnership of Google, New America Foundation, and Princeton University. ACM said such measurement is a requirement stipulated by the new EU net neutrality rules. Should the measurements fall short of the speeds stipulated in the contract, the consumer can recover fees from the telecom operator. The measurement tool is not certified, and the ACM says its cooperation is preliminary, but the ACM nevertheless encourages users to adopt the tool, which sends bogus data to the regulator and users’ personal data to Google without users realizing it. This research note describes some of the problems of MLab measurement, the technical limits on broadband providers’ ability to offer speed guarantees, and the moral hazard to which ACM exposes users.
Additionally, Strand Consult describes the many challenges in measuring the quality of mobile networks. Its new report “The Moment of Truth – Why the Quality of Mobile Networks Differs” documents that various measurements based on mobile apps and crowdsourcing are worthless and do not provide an accurate picture of mobile coverage.
MLab measurements are bogus
While there is an allure to boiling down the complexity of the Internet to a number that can be measured in an app (and, for regulators’ purposes, another weapon to brandish against operators), the Internet is a complex global network of networks with many diverse connections. Web-based, crowdsourced network tools such as MLab do not measure network quality meaningfully. Using a web browser to characterize the network is akin to measuring the weather by putting a thermometer in the ground. There may be relevant information in the dirt, but it does not say whether it’s cloudy, sunny, or blustery. Operators spend billions of dollars on equipment to measure their networks. If they could just hack an app instead, they would.
While empowering consumers with information is a laudable goal, the motivation of MLab is questionable. MLab is a marketing project for Google in which it assembles a set of network data, brands it with the veneer of academic credibility, and then deploys it in regulatory fora in service of its business goals. One objective of MLab is to pressure operators into overbuilding capacity. While this may serve Google’s business interest in more dumb pipes for its ad-enabled content, it is not necessarily the same as building networks in an economically sustainable and socially beneficial fashion.
Among the MLab suite is the Network Diagnostic Tool (NDT), which gained notoriety last summer when the activist group BattlefortheNet attempted to use MLab data in its Internet Health Test to prove that American broadband providers were purposely downgrading Internet connections. An article in the Guardian touted the data but was soon exposed as fake by Internet analyst Dan Rayburn. Network engineer and Wi-Fi co-founder Richard Bennett also exposed the scam, calling BattlefortheNet’s sponsored Internet Health Test “junk.”
The MLab performance and transparency suite consists of a variety of tests that approximate, infer, and venture guesses. The potential for mismeasurement and misattribution of problems is enormous. MLab simply does not have administrative access and test points in all network devices across the great expanse of the Internet. Most likely, the only test points MLab has are a few regional nodes. Ultimately, MLab lacks comprehensive visibility into the collection of networks it offers to test.
Here are just a few of the technical limitations of MLab (the first and seventh are illustrated in the sketch after the list):
1. Network packet tests cannot accurately attribute a speed to a particular link when measuring across a multitude of links.
2. Overhead is variable depending on link speed and technology.
3. Allocation of capacity dedicated to network management and control traffic is not known. Further, it is variable depending on technologies employed (such as different routing algorithms), exact spot in the network, general topology, topology configuration, and scale.
4. Switches typically have no IP presence and therefore cannot respond to ICMP packets (typically used for basic testing).
5. Load-balancing / multi-link paths may not be accurately measured.
6. Not all service provider routers respond to the ICMP packets typically used for ping and traceroute.
7. All performance results are variable with network loads.
8. Some tests may pollute the measurement environment.
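To make the first and seventh points concrete, consider the minimal Python sketch below. The link capacities and load fractions are invented for illustration and do not describe any real network: an end-to-end test observes only the minimum available capacity along the path, so the same measured number is consistent with entirely different bottlenecks.

```python
# A minimal sketch (not MLab's code) of why an end-to-end throughput
# number cannot be attributed to any single link. Capacities (Mbit/s)
# and load fractions below are invented for illustration.

def end_to_end_throughput(links):
    """Measured throughput is bounded by the most loaded link at the
    moment of the test: available capacity = capacity * (1 - load)."""
    return min(cap * (1.0 - load) for cap, load in links)

# Path A: fast access link, heavily loaded transit link
path_a = [(100, 0.1), (1000, 0.96), (10000, 0.2)]
# Path B: slow access link, lightly loaded transit links
path_b = [(50, 0.2), (1000, 0.1), (10000, 0.1)]

print(end_to_end_throughput(path_a))  # 40.0 Mbit/s
print(end_to_end_throughput(path_b))  # 40.0 Mbit/s

# Both paths report the same 40 Mbit/s, yet the bottleneck is a transit
# link in one case and the access link in the other. A web-based test
# sees only the final number, not which link produced it, and the
# number itself changes as loads change.
```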
Simply put, MLab’s NDT does not measure what it purports to measure: network performance. While the tool may report whether the sender, receiver, or network operates properly, provide information about tuning, and make suggestions to improve some functions, it does not report where the problem is or how other servers or clients perform. For example, it notes that its bottleneck test “calculates the inter-packet throughput which, on average, should correspond to the bandwidth of the lowest-speed link,” but this would require a definition of “inter-packet throughput,” which is not supplied. As Strand Consult’s report describes, some 70% of a mobile user’s bad network experience is related to the device, but this is not accounted for appropriately in MLab’s measurements.
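One plausible reading of “inter-packet throughput,” offered here only as an assumption since NDT does not define the term, is the classic packet-pair dispersion estimate: two packets sent back-to-back are spread apart by the slowest link, and the receiver-side gap implies that link’s bandwidth. The sketch below uses illustrative numbers.

```python
# A hedged sketch of one common reading of "inter-packet throughput":
# the packet-pair dispersion estimate. Numbers are illustrative, not
# taken from NDT.

PACKET_SIZE_BITS = 1500 * 8  # one full-size Ethernet frame

def packet_pair_estimate(arrival_gap_s):
    """Bottleneck bandwidth ~ packet size / inter-arrival gap."""
    return PACKET_SIZE_BITS / arrival_gap_s

# A 0.3 ms gap between back-to-back packets implies ~40 Mbit/s
print(packet_pair_estimate(0.0003) / 1e6)  # 40.0

# The caveat: cross-traffic queued between the pair inflates the gap,
# while interrupt coalescing or load balancing can shrink or reorder
# it, so the "average" the NDT documentation invokes carries a great
# deal of weight.
```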
Mathematician and network scientist Martin Geddes explains why tools such as MLab won’t work for regulators hoping to ferret out some hypothesized bad behavior. Indeed, UK regulator Ofcom won’t use them. Neil Davies of Predictable Network Solutions, hired by Ofcom to study a set of traffic management tools, describes why such tools can’t detect, let alone measure, certain network behaviors.
But a dearth of facts has never gotten in the way of the ACM’s objectives. This is evident in its lip service to taking a “problem-based approach” alongside its insistence that certain things simply must be regulated, even though it declines to provide real-world evidence for its regulation of network access and net neutrality. This was the takeaway from a speech by ACM Chair Chris Fonteijn to the International Bar Association conference, the annual event where top regulators communicate to the legal profession, essentially telling them how hard they will hammer telecom operators. For lawyers it is an opportunity to gauge their future revenue from fending off the regulatory onslaught.
Mobile operators cannot today provide end-to-end performance characteristics, and without prioritization or other QoS tools, they never will. The Internet is a statistically multiplexed networking system; in other words, it is based on shared-capacity packet routing. Mobile broadband providers sell customers an approximate connection speed from the customer’s device to the broadband provider’s network boundary. They do not sell end-to-end Internet speed, because they can’t.
What ACM claims the EU law guarantees, end-to-end speed performance, is something ISPs cannot sell and, without paid prioritization, will never be able to sell. A broadband provider does not control the entire Internet and thus has no say over how other networks are run, how the data it receives was originally scheduled, or how packet streams navigate through its network.
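A minimal sketch of statistical multiplexing, with an assumed cell capacity and invented user counts, shows why even the provider’s own segment can only be characterized approximately, and why nothing beyond its boundary can be promised at all.

```python
# A minimal sketch of statistical multiplexing, the reason an ISP can
# only approximate speed to its own network boundary. The capacity and
# user counts are invented for illustration.

CELL_CAPACITY_MBPS = 300  # assumed shared radio capacity of one cell

def per_user_rate(active_users):
    """With fair sharing, each active user gets an equal instantaneous
    slice of the shared capacity."""
    return CELL_CAPACITY_MBPS / max(active_users, 1)

# The same subscriber, sampled at different moments of the day:
for users in (2, 15, 60):
    print(f"{users:3d} active users -> {per_user_rate(users):6.1f} Mbit/s")

# Output: 150.0, 20.0, and 5.0 Mbit/s for the same contract. And even
# this figure covers only the radio segment: beyond the provider's
# boundary, packets cross networks whose load and scheduling the
# provider neither sees nor controls, so no end-to-end speed can be
# guaranteed without prioritization agreements across those networks.
```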
BEREC, ACM in bed with Google
The kicker to the ACM’s measurement program is that it subjects users to Google tracking. Users’ personal data is collected and put in a public database operated by Google. The regulator apparently sees no hypocrisy in producing a consumer-education video called “Every App Has Its Price” (a satire of ad-supported apps in which free coffee is offered in exchange for personal data, illustrating that free is not free) and then contracting with Google to measure individuals’ network experience, unwittingly subjecting people to Google tracking.
Back in 2016, Strand Consult described how the Body of European Regulators for Electronic Communications (BEREC) has been co-opted by Google and its activists. The ACM was a leader in the process, hosting 120 professionals from 30 countries to draft the net neutrality guidelines and inviting Google’s key advocate Barbara van Schewick to speak. Yet the minutes, documents, and attendee list from that meeting are still not available.
Strand Consult has over the years made repeated requests of BEREC to make its proceedings transparent, to no avail. As our report documents, at least 7 of the 14 official stakeholders of BEREC’s net neutrality proceeding have Google as a member or sponsor, most notably in the “civil society” category. A number of observers have described the worrisome effects of the Googlement and the revolving door between EU governments and Google, noting 80 instances in which professionals switched jobs between the two. Such efforts appear to pay off for Google, which has succeeded in getting net neutrality rules implemented that protect its market share.
That ACM uses Google tools to measure a policy pushed by Google and to further other policies desired by Google seems to be a conflict of interest. At the very least, the measurement tools employed by the telecom regulator should be independent, and regulators should know how networks function and what can be measured. The new report “The Moment of Truth – Why the Quality of Mobile Networks Differs” debunks in a simple way the myths of measuring mobile coverage. The goal of this report is to inform operators about the misinformation and prepare them to push back in the debate on coverage.