How well is 5G shaping up?
Enthusiasm continues to run high for 5G. Many believe it is a revolutionary generation, one that will deliver a massive change in connectivity and transform our lives. Much of this enthusiasm is based on the design aims of 5G, which were generally seen as:
- 1,000 times increase in mobile data volumes.
- 10 to 100 times increase in connected devices.
- 5 times lower latency.
- 10-100 times increase in data rate.
- 10 times battery life extension for low power devices.
- 10 times lower energy consumption.
If 5G delivers on these promises it will certainly be a step change in the capabilities of our mobile networks. Whether this will lead to developments “more significant than the advent of electricity or the automobile” as some claim depends on what the capabilities are used for.
With 5G now deployed and in operation for over a year, we can start to assess whether it is likely to live up to these claims. Of course, it is still early days, so we should not expect 5G to meet these design goals yet, but we can start to chart a projection as to where 5G might end up.
Note, the discussion that follows assumes that the mmWave spectrum (26-28GHz) is not widely deployed for mobile 5G (as opposed to FWA). Were that to change then many of the arguments below would no longer hold.
1,000 times increase in mobile data volumes.
The hope was that 5G would be about 10x more spectrum efficient than 4G and would be coupled with a huge and extensive deployment of small cells – around 100-fold more than existing cells – that would collectively deliver the 1,000x increase in capacity.
The reality has been quite different. 5G operators have not, generally, deployed many small cells, but instead have installed 5G on existing cell sites. This, in turn, has meant that the higher frequencies used for 5G (3.5GHz compared to 800MHz for 4G) have resulted in low 5G signal levels across the cells, which make the advanced antennas used in 5G less efficient, or even of no benefit. It is hard to be sure, but some modelling suggests that 5G in such a deployment is probably only 50% (1.5x) more efficient than 4G. This is actually quite an achievement – 4G was already finely tuned to very high levels of spectrum efficiency and 5G is heavily penalised by its higher frequency spectrum.
We could, perhaps, credit the 5G initiative with delivering more spectrum (although, of course, this could have been delivered to 4G as well, and in the US some of the deployments in this band are actually 4G). If we look at downlink spectrum and assume TDD spectrum is 80% downlink, then 5G adds roughly 310MHz of spectrum to operators who already have 360MHz (this varies country-by-country). Multiplying out the additional spectrum at 1.5x efficiency gives roughly 2.3x more capacity across the cellular network.
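As a sanity check, that capacity arithmetic can be sketched in a few lines. The figures are the illustrative ones quoted above (360MHz existing downlink, 310MHz added, 1.5x spectral efficiency); real values vary by operator and country.

```python
# Rough cellular capacity sketch using the illustrative figures from the text.
existing_downlink_mhz = 360   # downlink spectrum operators already hold (4G, baseline efficiency 1.0)
added_downlink_mhz = 310      # new 5G spectrum (assuming TDD spectrum is ~80% downlink)
efficiency_gain_5g = 1.5      # 5G vs 4G spectral efficiency on existing macro sites

capacity_before = existing_downlink_mhz * 1.0
capacity_after = capacity_before + added_downlink_mhz * efficiency_gain_5g

print(f"Capacity multiple: {capacity_after / capacity_before:.1f}x")  # ~2.3x
```

Note how far this falls short of the 1,000x aim: even doubling the new spectrum would only push the multiple to around 3.6x.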
Might that improve? Only if the deployment methodology changes to one of small cells, and there is little evidence of that happening in most countries. Indeed, 5G does nothing to make small cells more economic or simpler to deploy, and small cells have not been a key feature of 4G, so it is hard to see massive small cell deployment any time soon.
Verdict: Promise of 1,000x more capacity, delivery of 2.3x. Massive miss.
10 to 100 times increase in connected devices.
This one can be dealt with quite simply. Despite all the promises, 5G has no new way of connecting the sort of devices that would exist in massive volumes – the sensors and simple actuators. Instead, it takes the 4G IoT solution, NB-IoT, and rebrands it as 5G. This is quite staggering – IoT was meant to be one of the three pillars of 5G, and yet it has been completely neglected.
R17 (3GPP Release 17) might add a mid-tier IoT solution with capability between NB-IoT and the ultra-low-latency high-end solutions, but this would not be used for the very large device numbers. To be fair, NB-IoT was added to 4G late in its evolution, so perhaps we need to wait a few more years to judge 5G.
In fact, there is no need for a big increase in device connectivity. IoT has grown much more slowly than anticipated – predictions made in 2010 were that there would be 50 billion devices by now, but there are more like 8 billion. We are unlikely to exceed 30 billion by 2030, most of which would be connected via Wi-Fi or similar, and that is well within the capabilities of existing NB-IoT systems.
Verdict: 10 – 100x promised. 1x delivered. Massive miss that undermines a key rationale for 5G.
5 times lower latency.
A key element of 5G is lower latency, potentially enabling a raft of new applications that have been labelled the “tactile internet”. 4G typically delivers latencies of about 40-50ms, so to hit this target 5G would have to be under 10ms. Current 5G deployments are little better than 4G, but that is to be expected as the essential stand-alone 5G core is not yet ready. We will need to wait a little longer before we can tell whether 5G will deliver on this objective.
Verdict: Too early to say.
10-100 times increase in data rate.
This is very difficult to determine because there is no “benchmark” 4G data rate. Typically, a good 4G network will deliver around 20-40Mbits/s on average across all subscribers, setting the target for 5G at, say, 300-3,000Mbits/s. Such rates have been achieved in the laboratory, but we are interested in the real world and what consumers experience. In situations where they would get, say, 30Mbits/s on 4G, what will they get on 5G?
OpenSignal and other crowd-sourcing companies have been measuring this, and the answer, to date, is that someone with a 5G phone will get blended 4G/5G data rates (falling back to 4G where there is no 5G coverage) broadly similar to those of 4G. There are exceptions – Saudi Arabia appears to be delivering around 5x the data rates. This may improve as more spectrum is added and if deployments become more extensive. Equally, it may get worse as more users load the network, especially if it is also used for FWA.
Verdict: 10-100x promised, 5x delivered. Fair effort, but not transformational.
10 times battery life extension for low power devices.
Since low-power devices will use NB-IoT, the 4G solution, there is quite simply no change here.
Verdict: 10x promised, 1x delivered.
10 times lower energy consumption.
5G base stations turn out to be very power hungry, mostly due to the complex antennas deployed. This is causing deployment headaches, as upgraded power supplies and thicker power cables have to be installed at existing cell sites. 5G is certainly not reducing overall power consumption – far from it. A typical 5G base station increases power consumption over a 2G/4G base station by around 70%[1].
But it also increases capacity. So what happens on a “per bit” basis? Roughly, 5G becomes about 1.3x more energy efficient per bit. This depends hugely on the deployment approach, the number of carriers, the type of antennas, and so on. It seems unlikely to change much, although 5G antennas may become more efficient over time.
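The “per bit” figure follows directly from the two numbers quoted above: roughly 1.7x the site power (the ~70% increase) against roughly 2.3x the capacity. A minimal sketch, using those illustrative figures only:

```python
# "Per bit" energy comparison using the illustrative figures from the text.
power_multiple = 1.7      # 5G base station power vs a 2G/4G base station (~70% more)
capacity_multiple = 2.3   # overall capacity gain estimated earlier in the article

# Efficiency per bit improves when capacity grows faster than power draw.
efficiency_gain = capacity_multiple / power_multiple
print(f"Energy efficiency per bit: {efficiency_gain:.2f}x")  # ~1.35x, i.e. "about 1.3x"
```

The sensitivity to deployment choices is clear from the formula: if the capacity gain were the promised 1,000x at the same power, efficiency per bit would improve by nearly 600x, which is why the small real-world capacity gain is what kills the energy target.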
Verdict: 10x promised, 1.3x delivered. Not very green.
Summary
The table below summarises where 5G looks likely to get to, based on current experience, and compares it with what 4G delivered over 3G.
| Parameter | 4G delivered over 3G | 5G might deliver over 4G |
| --- | --- | --- |
| Mobile data volumes | 3x | 3x |
| Connected devices | 100x | 1x |
| Latency | 2.5x | Unclear, up to 5x possible |
| Effective data rate | 10x | 5x |
| Battery life for low-power devices | 1000x | 1x |
| Energy reduction per bit | 1.5x | 1.3x |
Two conclusions stand out:
- 5G is a long way from delivering on the original promises.
- 4G was actually more revolutionary than 5G.
That is not to say 5G is a bad thing – it delivers gains in some areas and those are to be welcomed. But it does not, currently, look like it merits the tag of “a generation like no other”.
[1] https://www.fiercewireless.com/tech/5g-base-stations-use-a-lot-more-energy-than-4g-base-stations-says-mtn#:~:text=According%20to%20Huawei%20data%20on,2G%2C%203G%20and%204G%20radios.