How fast is your Internet connection, really? How good is it, anyway? How can you tell?
The Canadian Radio-television and Telecommunications Commission (CRTC) is rightly interested in this question. So the CRTC contracted with SamKnows, a “global internet measurement and analysis platform”, to collect data in October 2019 on the performance of broadband Internet services sold to Canadian consumers. The results were published in a report, “Measuring Broadband Canada,” released in June 2020, at the tail of the “first wave” of COVID-19 in Canada. The outcome, according to the CRTC?
“Canadian consumers are receiving maximum advertised Internet speeds”. PIAC was suspicious.
The data were collected with “Whiteboxes”: hardware devices installed between a user’s device and their home modem or router to monitor broadband performance when no one in the home is using the Internet. You heard that right. When no one is using the Internet in the home.
Another important limitation of these Whiteboxes: measurements are only taken from the service provider’s location to the Whitebox, not within the user’s home network. You heard that one right too. The study does not account for your network setup, your devices, or anything else that uses or potentially slows down the connection while customers are actually “using” the Internet in a normal way.
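To make the mechanics concrete, here is a minimal sketch, in Python, of what that idle-only gating implies. The threshold value and function names are our own illustrative assumptions; the Report does not publish SamKnows’ actual deferral logic.

```python
# A minimal sketch (our assumption, not SamKnows' published logic) of the
# idle-only gating described above: the probe defers its measurement whenever
# it sees household "cross-traffic" above some threshold.

CROSS_TRAFFIC_THRESHOLD_BPS = 64_000  # illustrative threshold, not the real value


def run_speed_test() -> None:
    """Placeholder for the actual download/upload measurement."""
    print("measuring to-the-doorstep speed...")


def maybe_run_test(current_cross_traffic_bps: float) -> bool:
    """Run the test only if the household is effectively idle."""
    if current_cross_traffic_bps > CROSS_TRAFFIC_THRESHOLD_BPS:
        return False  # someone is using the Internet: defer, measure nothing
    run_speed_test()
    return True


print(maybe_run_test(5_000_000))  # family streaming video -> False, no data point
print(maybe_run_test(1_000))      # house asleep -> True, this is what gets measured
```

The upshot: every data point in the Report comes from the second case, never the first.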
Let’s grant them these limited and “perfect” conditions, however, and examine what they measured: the “performance indicators”. Those measured were: download and upload speeds; latency; packet loss; and webpage loading time. These are limited, but useful, indicators. However, other parameters could have been included – ones like “jitter”, a/k/a “packet delay variation” (where variation in IP packet arrival at nodes in the Internet can cause packet loss, dropouts, and interruptions, especially for a user’s voice and videocall applications – which are essential during the pandemic, whew!). Oh well.
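For readers wondering what jitter actually captures, here is a minimal sketch of one common way to compute it: the variation in the gaps between packet arrivals. The timestamps are invented for illustration; real probes use standardized definitions such as the packet delay variation metric in RFC 3393.

```python
# Minimal sketch: jitter as packet delay variation, measured as the mean
# absolute deviation of consecutive inter-arrival gaps.

def jitter_ms(arrival_times_ms: list[float]) -> float:
    """Average deviation of each inter-arrival gap from the mean gap."""
    gaps = [b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return sum(abs(g - mean_gap) for g in gaps) / len(gaps)

# Packets sent every 20 ms but arriving unevenly: about 7.5 ms of jitter,
# enough to make a voice or video call stutter.
print(jitter_ms([0.0, 20.0, 55.0, 60.0, 80.0]))  # -> 7.5
```

A connection can post excellent average speeds and still deliver choppy calls if this number is high, which is exactly why its omission matters.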
The test results purported to show that all major Canadian ISPs are providing users with speeds meeting or exceeding the advertised speed, apparently to the point that users often received “additional” throughput, with very few instances of service falling below advertised speeds across all performance indicators. Wow, this seems great.
The Report also claimed that speeds also did not decrease significantly during peak hours. Really?
Now we are suspiciouser.
We believe a closer examination of this claim reveals that, for a Report that claims to be “designed to provide accurate data on the broadband performance experienced by the majority of Canadian fixed-line broadband users,” the study is actually extremely limited in scope, and the conclusions drawn from the results are tone-deaf to the real-world usage context of Canadians. Perhaps this was made easier by the significant scaling down of the sample size and diversity of the measured connections, compared to a similar SamKnows study conducted in 2016. (This creates a risk of drawing conclusions from small sample sizes, or in short: the human cognitive bias to give too much credence to results from statistically small samples, dubbed belief in the “law of small numbers” by psychologists Amos Tversky and Daniel Kahneman.)
However, most of the conclusions in this Report appear to rely upon what was chosen to be studied rather than on what was deliberately excluded, and these scope-reduction choices made by the Report authors were justified on technical bases, not on social or actual real-world use bases – the real world being the point of studying consumers’ experiences of their broadband service (and, we might add, the authors’ choices and limitations were implicitly endorsed by the CRTC, which uncritically announced the Report’s results with an industry-boosting News Release). We examine these critical limitations, and the sweeping conclusions reached, below.
The sample pool is heavily skewed towards higher-tier plans and urban users
The first major limitation is in the service packages and demographics that the 2020 Report chooses to include, or rather, to exclude. The results were based on a pool of measurements from 2,035 Whiteboxes deployed to customers of participating Internet Service Providers (ISPs), including the three largest: Bell, Rogers, and TELUS. Only the Internet packages with the highest subscriber counts were included in the study, in order to “represent the majority of Canadian fixed-line broadband users”. For comparison, the 2016 study used data from 3,056 Whiteboxes, without the “highest subscriber count” condition.
The 2020 Report also excludes advertised download speeds of 10 Mbps or less, as well as packages with fewer than 25,000 total subscribers. With few exceptions, sample sizes of fewer than 40 Whiteboxes per Internet package were also excluded. The 10 Mbps cutoff is particularly concerning, as many rural Canadians only have these lower-tier plans available to them. The study does not explain whether the exclusion of lower-tier service packages was because of a declining number of subscribers or otherwise. The exclusion is especially confounding considering the 2016 study did include Bell Canada’s 7/0.64 Mbps and 5/1 Mbps plans, and TELUS’ 6/1 Mbps plan, which underperformed at 81%, 86%, and 81% of their advertised speeds, respectively. The CRTC’s choice not to re-evaluate these plans three years later calls into question whether the speed and quality of service has improved for Canadians still relying on these basic plans.
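Those underperformance figures are simple ratios of measured to advertised speed. A quick sketch, with the measured values back-calculated from the 2016 percentages for illustration (they are not the study’s raw data):

```python
# Percent-of-advertised is just measured / advertised. The "measured" values
# below are back-calculated from the 2016 study's percentages for
# illustration, not taken from its raw data.

plans = {
    "Bell 7 Mbps":  (7.0, 5.67),   # ~81% of advertised
    "Bell 5 Mbps":  (5.0, 4.30),   # ~86% of advertised
    "TELUS 6 Mbps": (6.0, 4.86),   # ~81% of advertised
}

for name, (advertised_mbps, measured_mbps) in plans.items():
    print(f"{name}: {measured_mbps / advertised_mbps:.0%} of advertised download speed")
```

On a 7 Mbps plan, that missing 19% is the difference between a video stream that works and one that buffers.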
The lack of evidence for lower-tier plans does a disservice to rural Canadians, who tend to have access only to lower-speed broadband Internet. Based on the most recent Communications Monitoring Report (CMR), released by the CRTC earlier this year, broadband coverage at speeds of 50/10 Mbps reached only 40.8% of rural communities in 2018 (31.3% on First Nations reserves), compared to 97.7% coverage in urban areas. 1.5 Mbps broadband was available to rural communities at a much higher coverage rate of 94.0%, and yet the SamKnows study does not address whether these users are getting reliable service at speeds that are already inadequate for modern usage needs, even at the advertised benchmark.
Another major difference between the 2016 and 2020 Reports is that the 2016 Report explicitly took measurements that “covered all geographic regions of Canada in a mix of urban and rural settings,” and acknowledged variations in results stemming from rural and remote measurements. The 2020 Report makes no such claims, because it cannot: it does not mention a rural sample portion at all. We can only assume, based on how the data collection is skewed towards higher-tier services, that effectively only urban and suburban users were included in the study. Furthermore, the study excluded Northwestel from the results for webpage loading times because “their remote location would have an adverse impact on results compared to other ISPs.” This should raise the eyebrows of anyone familiar with selection bias. A fairer presentation would have included the Northwestel data and explained their effect on the bottom-line number in an explanatory note. In effect, the study cherry-picks results, limiting them to urban and suburban users, who typically enjoy greater reliability and more service choice than rural and remote users.
Collecting data during periods of inactivity only measures speed, not user experience
As we noted earlier, the “real-world” utility of Whitebox measurements is also limited by the fact that data are only collected when there is no end-user “cross-traffic” on the home network. In other words, the Whitebox only takes measurements when no one in the household is actually using the Internet, apparently so that the Whitebox’s measurements are not “distorted” by end-user activity, and so that the Whitebox’s measurement traffic does not interfere with the user’s experience of the Internet. Except, of course, the user’s actual household experience is always filtered by the fact that they must use their Internet connection through some sort of consumer device, such as a smartphone, laptop, or connected TV, to experience speed and to use the product, that is, the Internet. Therefore the study only measures the speeds potentially available to a household network, not how efficiently and reliably those broadband speeds stand up to normal and, indeed, human user activity. The study qualifies that the Whiteboxes only measure speeds to the “doorstep” (the Whitebox) because factors like the number of devices in use at the same time, faulty equipment, and poor Wi-Fi connectivity can affect broadband performance inside the home. Well, duh. We all live in the real world, with some “network overhead”: routers, Wi-Fi, devices. Why can the study not allocate for, and take into account, a “typical” level of such overhead?
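For example, the study could have discounted its “doorstep” numbers by an assumed typical household overhead. A minimal sketch of that idea, where the 15% figure is purely our illustrative assumption, not an empirical estimate:

```python
# One way the study could have allocated "typical" household overhead:
# discount the to-the-doorstep measurement by an assumed factor. The 15%
# below is purely an illustrative assumption, not an empirical estimate.

TYPICAL_HOME_OVERHEAD = 0.15  # assumed loss to Wi-Fi, router, and device limits


def effective_speed_mbps(doorstep_mbps: float,
                         overhead: float = TYPICAL_HOME_OVERHEAD) -> float:
    """Estimate what a real user on a real device might actually see."""
    return doorstep_mbps * (1 - overhead)


# A "50 Mbps" plan measured at full speed at the Whitebox...
print(effective_speed_mbps(50.0))  # ...might feel more like 42.5 Mbps in use
```

Deriving a defensible overhead figure would take real research, but that is rather the point: the Report did not attempt it.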
It is precisely these real-world factors that, together with the “to the door” delivered speed and quality, “make or break” the utility of an Internet service for a household, especially during peak hours. Without more comprehensive research that accounts for these factors, or at least some allowance for consumers living in the real world, with a real network and real Internet devices, the study should recognize its results for what they are: merely the potential maximum speeds “available” to a household.
The 2020 Report, however, makes the much larger and, in any real-world sense, confusing claim that “Canadian ISPs have mostly met or exceeded maximum advertised download and upload speeds across tiers and regions,” despite the Report’s partial and theoretical, rather than real-world, basis. But wait, there’s more: the Report extrapolates that “quality of service is consistent across Canada,” and that the results were based on “the broadband performance experienced by the majority of Canadian fixed-line broadband users.” Firstly, the effective exclusion of rural areas, through the concentration on higher-tier packages, completely undermines the assertion that service is consistently up to snuff across Canada. Secondly, the Report, by its own methodology, explicitly excludes any “consumer experience” at all – since only the Whiteboxes’ “experience” is measured, not the experience of a real consumer on a real network using a real device – so any claim of “performance experienced by … Canadian … users” is manifestly false. Lastly, the Report’s measurements were conducted prior to the COVID-19 pandemic (though the Report was released during it), so it is now of questionable utility in the real-world context of the current pandemic. With more households working and going to school from home, peak periods are longer, traffic is heavier, and video and audio streaming and communications tools like videoconferencing are in constant use; the need for faster and more reliable broadband is greater than ever.
Let’s park our cynicism and assume for a moment, however, that the majority of Canadians do in fact have access to high-speed Internet. For many consumers, the issue during the pandemic, and well before it, is not speed but the affordability of Internet service. In rural communities, household spending on Internet services is increasing despite the slow deployment of high-speed Internet. From 2013 to 2017, average monthly Internet access spending for rural households rose from $37.42 to $54.83, a whopping 46.5% increase.
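That figure is easy to verify from the CMR’s numbers:

```python
# Verifying the rural spending increase cited above.
before, after = 37.42, 54.83  # average monthly rural Internet spend, 2013 vs 2017
print(f"{(after - before) / before:.1%}")  # -> 46.5%
```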
The CMR directly acknowledges that rural households spend more than urban households because of “slightly higher prices offered in rural areas, where there are typically fewer service providers.” Instead of a simple Report examining largely the highest service tiers for the most easily served demographics, the CRTC should at the least supplement this Report with a study that helps resolve the accessibility and affordability issues that have persisted for years, especially for vulnerable and underserved Canadians.
Conclusion: What’s wrong, why it matters, and how it can be fixed
The 2020 Report is flawed. It presents an artificial “measurement” of selected networks, in selected locations, for selected users at selected speeds, under ideal conditions and in a totally artificial context, as far from the “real world” Internet experience of users as we can imagine.
To then claim that Canadians’ experiences of the Internet are that it is fast is flatly wrong. It smacks of regulatory propaganda. We are tired of this approach from our telecommunications regulator.
At the very least, the CRTC should rethink its methodological approach to make the next report more comprehensive and reliable. The CRTC should also rethink its communications regarding this report and similar reports prepared for it, such as the even more recent “Secret Shopper Project Report” – which has its own limitations, as examined in PIAC’s “We Fight for That” podcast, episode 2.
Next up: Part 2 – Traffic Cops on the Internet – Broadband Speed Advertising in Canada and Abroad
In part 2 of our “Buying Speed? What Canadians Pay for Broadband” series, our next blog post focuses on broadband speed advertising. PIAC notes that other countries treat the broadband speed question much more pragmatically than Canada, requiring advertised speeds to correspond to the lived experiences of average users, at average times, under average network loads. Is Canada’s laissez-faire approach facilitating something like false advertising? You be the judge.