The reasons why 5G can never happen

That’s right, we mean it. 5G is in danger of never happening – or certainly not happening in the manner and timescale we are all being told – and there are many reasons for this, not least of which is the amount of complexity 5G plans to take on, and how much it will cost.

In essence, the use of spectrum above 6 GHz is likely to be slowed dramatically: less than 1% of even the most advanced territories will be built out by 2020, and that is not enough revenue to keep all of the major players alive and healthy. So something MUST give.

It is well understood that the 3GPP will continue development of LTE until 2020 at least, and does not expect it to be replaced as the dominant technology until 2030. If the transition takes that long, how will backhaul specialists and base station providers survive?

We are constantly told that 5G will be far cheaper to implement, and in some areas it will be – NFV and SDN make this possible in terms of what is paid for base-station equipment – but little else in the technology chain will fall in cost. We estimate that a handset built today to the specification would cost at least $1,800 to build. And if the US insists on moving early, thus cementing the spectrum 5G sits in, it risks losing out on the economies of scale that 5G handsets need. That move would be disastrous.

How do you survive this glacially slow transition? By planning it properly.

Our RAN Research service is the only one which forecasts 5G base station deployment from the demand side – by taking into account what operators can afford to spend and plan to spend, rather than listening to what vendors tell you they will sell. That’s one good way to survive it.

What is clear is that MNOs will take the quickest fixes fast and then be very slow in deploying small, dense cells – a process which will barely keep the leading telecom equipment vendors alive. 5G was never meant to be a root and branch upgrade, and by allowing it to co-exist with LTE, MNOs can take their time over it. Perhaps 15 years to fully deploy it. Perhaps 25.

It seems to us that we are engaged in a desperate fantasy that 5G is almost here, and this fantasy is being played out by chip makers and equipment vendors, desperate to panic operators into sharing the fantasy, in order to prolong their position as the technology stocks of the day.

This part of the equation cannot happen unless the industry finds some new revenues – it is not UHD video that will pay for 5G, as the people that use it have shown no propensity to pay, and are perfectly happy taking it over WiFi. So where will the money come from?

Never before in the history of telecoms has a technology been put forward to solve so many problems at once, without anyone being clear on who will pay for what, in what timeframe. The uncertainty is palpable.

Stock markets have seen this same journey before, when canals were surpassed by the railroads and the railroads were overtaken by the invention of the car. Companies are valued on their ability to make profits, and while handset vendors and operators alike continue to have outlandish stock valuations, the sudden halt of smartphone growth, virtually with us now, will inevitably mean that their day will soon be over. AT&T, Verizon, Apple, Nokia and Ericsson all share the same fear. The cellphone is ready to become a commodity, like a rail journey or an electric plug, and the R&D and competitive jockeying is set to go out of it, because no-one can afford it.

The handset is set to join the microwave, the electric cooker, the car and the personal computer as things society already knows how to do, but now wants on the cheap. The only antidote to that is to sell an impossible dream that the cellphone can become “everything,” as it fights for investor interest against self-driving cars, AI, automated drone delivery, virtual reality and social media. The only answer is that in order to have all those things, you need 5G. But do you?

The answer seems to be to sell a vision that encompasses and hijacks the nascent Internet of Things, and to position 5G as the future of the networked enterprise. And in order to do this, 5G has become an unachievable science project, a political party that tells everyone it can do everything with its oh so limited budget.

The first and most ridiculous lie is the timeframe. We are told from every angle that 5G will be ready for first installations by 2020, but we have to wonder when operators will find time to install LTE Advanced before then, as no-one yet has it fully in place. If you read most publications – which at this stage are largely repeating the claims of the companies planning to dominate 5G – it seems LTE-A no longer exists, and yet it is a clear stopping-off point along the way.

Back in 2002, 3G spectrum was sold, and by 2003 we began hearing the term High Speed Downlink Packet Access (HSDPA), which was referred to as 3.5G at the time – although it was then, as 5G is now, in pre-certified trials. HSDPA went through six iterations, taking network data speeds from 1.6 Mbps to close to 100 Mbps (84.4 Mbps). At the time NTT DoCoMo said it was experimenting with 4G, which did not come to be standardized until 2008, a gap of five years.

In the meantime, variants of HSDPA (along with its uplink counterpart HSUPA) took the baton onwards up to 84.4 Mbps, but this was a period when MNOs were transitioning from a virtual connection to a packet-based system. Effectively the base station took a break from UMTS for about 10 milliseconds and sent out a totally different signal that carried data on what was called the Enhanced Dedicated Physical Data Channel. We also introduced IMS, which was technically a transition fudge to take us towards an IP-packetized world, effectively converting in and out of that format at the MNO network edge – something pretty unnatural. Ask any operator how their IMS installation went, and you will receive a scowl in reply.

3G was not so much about creating more capacity for data applications, as accommodating them for the first time, and embracing OFDM as a form of IP friendly modulation which would kill off WiMAX. LTE was about acknowledging and completing that transition. Both represent wasted opportunities to carry more and more data, and now in 5G, all of that has come home to roost and the community is now desperately trying to catch up. Which is why 5G has to be so ambitious, and why it is destined to fail – at least it will fail in its key aim – to keep cellular valued more highly than any other technology.

So where are we going with 5G, and why do we call it a science project? Both LTE-A and 5G – or New Radio (NR), as it has recently been dubbed – rely on two simple ideas. The first is that we introduce a lot more base-stations, so that fewer people share each one’s bandwidth. The concept is simple enough: if you put a base station in a room with 10 people, each one can have 10% of its bandwidth all the time. If you have two base-stations and each one is only in reach of five users, then each user can have 20%. Today’s macro base-stations can reach hundreds of concurrent users, so the 300 Mbps speeds which are common in LTE networks are being shared across hundreds of people, often resulting in a few Mbps per person.
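The sharing arithmetic above is trivial but worth making concrete; the cell capacities and user counts in this sketch are illustrative assumptions, not measured figures:

```python
def per_user_mbps(cell_capacity_mbps, users_per_cell):
    """Naive even split of a cell's shared capacity across its active users."""
    return cell_capacity_mbps / users_per_cell

# One base-station, 10 users: each gets 10% of its bandwidth.
assert per_user_mbps(100, 10) == 10.0
# Two base-stations, 5 users each: each user's share doubles to 20%.
assert per_user_mbps(100, 5) == 20.0
# A 300 Mbps LTE macro cell shared by 200 concurrent users
# leaves each person only 1.5 Mbps.
assert per_user_mbps(300, 200) == 1.5
```

Real schedulers are nothing like this even split, of course, but the densification argument rests on exactly this ratio: fewer users per cell means more capacity each.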

The second key concept is that with smaller base-stations, operating in conditions which are almost line of sight, we can use bandwidth that was previously seen as “not much use” and reserved for line-of-sight applications such as wireless backhaul or microwave. This is why new spectrum is being discussed above the 6 GHz line, and why the US wants to see 37 GHz to 39 GHz made available. This type of spectrum, in the OFDM configurations which are planned, will bounce and get around the odd obstacle, but will not go through walls. In fact, it will not even go through some types of glass.

Experiments conducted in preparation for 5G, to see how this type of spectrum behaves, concluded that the metal-coated glass used in all modern office buildings would create a 40 dB signal drop, with one ten-thousandth of the signal strength getting through. So these signals can get through normal glass, but not the shiny stuff you see in skyscrapers, and certainly not concrete or drywall. So if you have a base-station outside, it won’t reach inside, and vice versa. That means every home will one day have to have its own base-station, or get poor signals. It might mean every room needs one – except we have already got WiFi. Full 5G will mean each consumer home buying its own base station. How are MNOs going to convince consumers to do that?
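The 40 dB drop and the one-ten-thousandth figure are two statements of the same fact, since a loss in decibels converts back to a linear power fraction as 10^(−dB/10); a quick sanity check:

```python
import math

def db_to_power_fraction(db_loss):
    """Fraction of radio power surviving a loss expressed in decibels."""
    return 10 ** (-db_loss / 10)

# A 40 dB drop through metal-coated glass leaves one ten-thousandth
# of the signal power.
assert math.isclose(db_to_power_fraction(40), 1 / 10_000)
```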

This immediately creates a lot of complexity at the network level. First off, what power levels should be used with spectrum that attenuates in the air or at the first obstacle it meets? And if you really have a lot of small cells, what is to stop them all interfering with one another when there are no obstacles?

The interference issue has had a lot of thought put into it over the course of several 3GPP standards. A core concept in small cells is deliberately managing interference dynamically, using protocols called eICIC and CoMP. The older ICIC is about mitigating interference between a large macrocell base-station and a small cell. This earlier version existed in Release 8 and dynamically lowered the power of sub-channels to prevent interference; the Release 10 version, eICIC, coordinates the blanking of subframes in the time domain in the macro cell, with the resulting downside that macro cell capacity is diminished. It’s a bit like that 10 ms break we used to have for HSUPA: the macro cell stops speaking so the small cells can hear themselves.

CoMP is a bit like carrier aggregation, but instead of the same message being carried across two different frequencies, a single message is repeated on the same frequency so that, like a single frequency network, two or more base-stations use the least effort to get a quality signal to a single device. These techniques are largely untried at scale, and yet the entire 5G network rests on them working perfectly first time.

Both require significantly more processing power to make these adjustments, so LTE and LTE-A are sacrificing cloud CPU cycles to save bandwidth – and they also use more and more backhaul bandwidth on coordination signals. Once some central intelligence has decided how best to use the frequency, it has to send all base-stations and devices the calculations to do it. Some of the best minds in the world are working on interference schemes, but those schemes, in turn, throw up their own problems.

In 5G, radio planning becomes an issue. There is one Qualcomm paper devoted entirely to proving that the Friis free-space path loss equations – used to calculate the loss of power in radio through empty air – still work in radio planning on 5G spectrum. Diffraction effects make this neither obvious nor entirely accurate: it more or less works, but there are issues.
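To illustrate why planners care, the Friis free-space path loss in dB is 20·log10(4πdf/c), so loss grows with frequency as well as distance. The 100 m distance and the frequency pairing below are our own illustrative choices, not figures from the Qualcomm paper:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Friis free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Over the same 100 m of empty air, a 39 GHz signal loses about 25 dB
# more than a 2.1 GHz LTE signal -- before any obstacle is considered.
delta = fspl_db(100, 39e9) - fspl_db(100, 2.1e9)
print(round(delta, 1))  # 25.4
```

That 25 dB gap is purely the free-space term; the diffraction and penetration problems discussed above come on top of it.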

So people who have spent the past 30 years knowing how to plan radio networks may have to retune their basic equations and toolkits when planning above the 6 GHz line – and yet one mistake will mean a massive public relations disaster for the technology.

But re-designing reliable radio planning is just one of the issues at hand. Others include cheaply designing handsets which are inordinately more complex, offering more and more onboard radios and antennas so that devices can speak across all the spectrum allocated to an entire HetNet. Initially 5G devices will be far more expensive, which will present operators with the problem of how to motivate customers to move up to NR devices, and may force operators to re-introduce handset subsidies, even in markets where they have been eliminated. And what of battery life? If we built one of these devices right now, it would last around an hour – but we have four more years to change this. So it might be two.

Offering incentives for customers to move to a network where the devices cost the operator more is somewhat nonsensical in its own right, and handing out devices with significantly shorter battery life is going to be tougher this time around. This happened with both 3G and 4G, but consumers are now savvy and suspicious, and anyway the differences back then were not so stark.

Some vendors, notably Nokia, have suggested that a HetNet would have three layers: spectrum under 6 GHz, from there up to 30 GHz, and from there up to 100 GHz. Each of these would require separate radios and antennas – 8 x 8 MIMO antennas for one level, 8 x 8 again for the next, and finally 2 x 2 for the 100 GHz level. That’s 18 antennas and radios which must not interfere with each other, ADDED to existing designs which already support LTE, 3G, WiFi, Bluetooth and GSM spectrum, and so have about seven radios in them anyway. And radios that can sweep across 30 GHz of spectrum and work with any part of it are untried, and expensive.

There would also need to be MU-MIMO logic built into each chip. There is no guarantee that these chips would work initially, they may have huge power drains, and we won’t be able to design them properly until we have the networks. They will be exceptionally expensive until they ship in volumes of hundreds of millions – and then they will need to fall to almost nothing in price over a two to three-year period. That’s not much reward for the handset vendors, who will drag their heels and press to go back to subsidies or high unit pricing, or both.

Another major issue is harmonizing spectrum from one country to another. The World Radiocommunication Conference in 2019 has an item on its agenda to discuss 6 GHz up to 100 GHz, with a view to harmonizing its usage across the world. It is unlikely that anyone will wait that long, so there is every chance that many regulators will take a flyer on the preparatory work prior to that conference. Almost certainly the US in particular (but perhaps also Korea and/or Japan) will decide what spectrum it is going to use, take the lead ahead of that conference, and then try to bully everyone else. But the spectrum the US wants to use is not available in most other countries, which means economies of scale may never reach handset makers. That alone could break 5G, and recent moves by the FCC suggest this is destined, disastrously, to happen.

Certainly we cannot wait, agree conclusions in 2019, ratify them a year later, and THEN decide to build equipment – and still expect 5G to arrive by 2020. There will have to be some gambles, guesswork and gentle political persuasion, starting now. One misstep and whoever makes it is dead. Don’t let that be the idea you bet your company on.

But the biggest issue is where we go with backhaul and fronthaul. To iron that out: backhaul is of course very similar to fronthaul, and there are recommendations to merge the two. The control instructions from the cloud, central switch or consolidating CPU need constant control communication with the Remote Radio Heads, on top of the need for genuine user data backhaul.

Additional control information will also come from a massively increased number of handovers. If you live in a world where there is a new cell for every urban canyon, square or mall segment, then your central switching apparatus is constantly involved in handovers, and that also generates more and more data movement, which eats bandwidth and CPU cycles.

How much backhaul might we typically need? Assume that a 2.1 GHz LTE network has a cell radius of anything from 2 kilometers down to 500 meters in an urban environment – and remember it could be much larger. To cover that same area with cells of between 50 meters and 100 meters would need around 30 to 50 small cells.
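That 30-to-50 figure is essentially an area-ratio calculation. The sketch below uses an idealized circular-coverage model with radii we have chosen for illustration; pure geometry for 50–100 m cells inside a 500 m macro cell gives 25 to 100 cells, bracketing the working estimate above, and real plans also have to add overlap and terrain margins:

```python
import math

def small_cells_per_macro(macro_radius_m, small_radius_m):
    """Idealized count of small cells covering one macro cell's area,
    from the ratio of circular coverage areas (overlap and terrain ignored)."""
    return math.ceil((macro_radius_m / small_radius_m) ** 2)

# A 500 m urban macro cell replaced by 100 m and 50 m small cells:
print(small_cells_per_macro(500, 100))  # 25
print(small_cells_per_macro(500, 50))   # 100
```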

This would provide tons of capacity, spread across 2 GHz of millimeter wave spectrum and using channels of 100 MHz at a time, as many advocates have suggested – offering minimum speeds of 100 Mbps for any given user even at the cell edge. These would total tens of Gbps for any cell site, with many users getting a 1 Gbps experience personally.

But this design will lead to between 30 and 50 new backhaul points being needed for every one there is today, with data speeds for backhaul rising fast. If you want 30 Gbps to be possible across a single area, backhaul must be incrementally grown to reach that level by the time you need it; and to build a network at all, it will likely require 1 Gbps at every point when first installed – i.e. a similar level to the backhaul a macro site needs today.

We hear LTE speed landmarks all the time – 300 Mbps, 450 Mbps and then 600 Mbps on a single LTE device, world records for the development of LTE – so we are hardly going to accept less than 1 Gbps when we first implement 5G backhaul, and we will need a development path to speeds which are a lot faster.

So that’s 30 sites for each base-station installed today, each with independent backhaul that scales up from 1 Gbps on day one – but costs must not go up. Given that backhaul traditionally takes up something like 25% of a carrier’s Opex, this cannot be installed at today’s prices: a 30-fold rise would take backhaul alone to around 750% of today’s total Opex, more than seven times current operational costs. So it is vital to re-invent backhaul and make it far cheaper – but there are no standards in backhaul. Perhaps Broadcom will come to the rescue with its new Magnacom WAM protocol by then, upping useful backhaul data delivery by a few factors. Maybe not.
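The Opex arithmetic checks out directly; the 25% share and 30x multiplier are the figures quoted above:

```python
def backhaul_opex_multiple(backhaul_share_of_opex, site_multiplier):
    """Backhaul cost as a multiple of TOTAL current Opex, assuming
    per-site backhaul prices stay flat while site counts multiply."""
    return backhaul_share_of_opex * site_multiplier

# 25% of Opex today, 30x the backhaul points at today's prices:
# backhaul alone would reach 7.5x (750%) of today's total Opex.
assert backhaul_opex_multiple(0.25, 30) == 7.5
```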

Naturally a lot of that backhaul cost relates to total data carried and the total number of fiber or other lines leased, plus one-off equipment costs. But cutting it to something like one-thirtieth of today’s price is impossible in the near term; it can likely only be achieved over a much longer timescale than by 2020, or even 2030, with operators buying just enough to say they have 5G, but not enough that 5G is anywhere much, while they wait for its price to fall further. The wait could be ten years. It is more likely to be 15. Don’t be the company that has a 15-year wait until your products are bought in volume. (Better to consult our RAN Research service and ensure there is enough business to go around.)

Backhaul today is split roughly 60% wireless and 40% fiber, but taking each macro cell backhaul site and finding 30 more backhaul connections of the same speed is likely to inspire technologists to rethink the underlying technology. There is absolutely no consensus on the best way to achieve that kind of backhaul improvement, and every backhaul vendor today is suggesting something different.

One company will tell you that the way forward is point-to-multipoint wireless systems which share backhaul across multiple sites; others will say that fiber is coming down in price fast enough to do this point to point; and others still talk about using Software Defined Networking in the backhaul to provide a flexible, updatable system which can carry what is sometimes known as crosshaul – using the same connections to manage fronthaul and backhaul traffic, prioritized accordingly.

5G will require new algorithms for balancing network loads as the performance of this new network will operate nothing like any network before. Hence the mass ramping of both optimization tools and analytics to tell you what on earth is going on within your network. In our RAN Research Service, we have talked to hundreds of operators, and talk to many of them on a regular basis, and they have told us about their short term (fully committed) spending plans and their longer term spending expectations (planned) and they have outlined what they think their spending plans will be during this transition period.

Of course very few of them know for sure. Realistically vendors lead the way on R&D and initial expectation for operators, but once an operator has a fully scaled trial, they know far more than vendors about what they can afford and what they can survive with.

Today there are around 600 million fixed broadband lines, and around 400 million of these have WiFi on them – eventually all of them will. As few as 40 million of these are currently roamable by any consumer, but this will rise rapidly over the next five to six years, until 80% of WiFi is roamable in some way.

Our RAN Research small cells forecast says that between now and 2020 all types of small cell shipments, not counting residential in-home shipments, will reach around 4.6 million. Some 77% of these today are all-in-one small cells, but slowly we see these being overtaken by virtualized small cells and distributed radio systems with the virtualized approach overtaking all other forms of installations come 2020 and 5G.

Even with the forecast compound growth rate of 107% (something that has almost never been sustained when selling to conservative, cost-conscious operators) it will only lead to an installed base of 13 million non-domestic small cells by 2020. Plan for a higher number at your peril.
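For transparency, compounding annual shipments at a 107% CAGR and summing them (ignoring retirements) reproduces an installed base of that order. The first-year shipment volume below is a figure we have assumed to make the series land near 13 million; it is not a RAN Research number:

```python
def installed_base_m(first_year_shipments_m, cagr, years):
    """Cumulative installed base (millions) from annual shipments growing
    at a compound rate, with no units retired over the period."""
    total, shipments = 0.0, first_year_shipments_m
    for _ in range(years):
        total += shipments
        shipments *= 1 + cagr
    return total

# An assumed ~0.38M units in year one, compounding at 107% over five
# years, lands close to the 13M installed-base figure.
print(round(installed_base_m(0.38, 1.07, 5), 1))  # 13.1
```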

That calculation comes from what operators tell us or publicly state they plan to spend, but given that they all give a guideline to their Capex spend – up or down – we have also tested that against cash supplies. Investors immediately de-value a stock if the management try to spend too rapidly, over-chasing customers with cash. So we can forecast that spend fairly confidently. It is true that small cell spending still has not taken off much at present, but even this aggressive growth is not enough to swamp global MNO networks and far more spend by dollar will go on Macro cells, for some considerable time.

And of course none of these cells will be 5G at first anyway – that will start during 2020, and then all small cell purchasing can slowly cross over to 5G. In the same way that operators have been quick to build out LTE but slow to take up small cells, they will explore 5G while putting off mass backhaul spending and avoiding the new spectrum choices – 6 GHz to 30 GHz and on up to 100 GHz – for as long as possible, with all the consequential backhaul cost. The further back they push it, the better for them financially, but their networks won’t perform as well, they will lose customers, and they will draw the ire of financial markets.

Like all operators – MNOs will want the revenues first and then pay the costs, and in 5G that’s going to be really tough. That’s what vendor financing is for, and Ericsson and Huawei will have to stretch that type of funding further and further.

We freely acknowledge all of the advantages of even LTE-Advanced networks, never mind 5G networks, over WiFi – all the intelligence that goes into achieving the best results with the least spectrum, and the considerable power limits imposed on WiFi. But this still leaves us in 2020 with about 13 million LTE-A small cells installed, while WiFi First proponents – those who use WiFi to carry upwards of 70% of their data – will be relying on some 500 million homespots and maybe 10 million hotspots. It is an unfair competition, with home gateways enjoying a ten-year advantage over small cells. That advantage may not last forever, but it will last until some operators go broke or are forced into a sale.

While the WiFi advantage does not represent the capacity to entirely replace the MNOs, it is sufficient to set a price ceiling for existing communications services, based on smartphone usage. We saw one report last month which suggested that US cable operators would grab 20% of MNO customers in the US by 2021 using WiFi First models. With service revenues around the world tending to fall for given populations of cellular subscribers, this is likely to make both Verizon and AT&T far less profitable in the near term. The resulting debt rating realignment would likely make the rapid build-out of 5G impossible to pay for, even though Verizon would have the extra revenue of its MVNO agreements with US cable to help it out. We believe that similar spending restrictions will apply in almost all countries of the world.

Also in June, another US analyst group, MoffettNathanson, reasoned that Dish would have trouble unloading its spectrum because none of the MNOs in the US can afford to buy it. They all have enough in the sub-6 GHz range, and are saving their pennies for 37 GHz and above once the 600 MHz auction is out of the way.

How long will 5G take? That rests on how long it takes tier 1 players to replenish their cash supplies after all this auction activity – and that depends. If MNOs continue to go after existing markets – smartphone owners, and the market which defines their needs, UHD video delivery – then they are likely NEVER to replenish their cash reserves, as these generate less cash per customer every year. ARPU numbers, with some exceptions, are already headed down.

Instead MNOs must seek out radical new markets. 5G will only justify itself if it enables radically new behaviors and business cases, many of which will involve technologies, such as robotics, which have never before overlapped significantly with the cellular network. In other words, it must become the partner network of all the other new and sexy technologies on the verge of mass take-up. For the enterprise it must not only carry IoT data, but do it more cheaply and at a hitherto undreamt-of scale. It must crush the pretenders in low power wide area networking, such as Sigfox and LoRa. It must make private LPWAN deployments nonsensically expensive compared to doing things at scale with 5G. So NB-IoT is as important as 5G, if not more so – and yet it measures its loads in kilobits per second, easily achieved with today’s technology and not at all reliant on 5G.

The overall effect must be that M2M BECOMES IoT and more. In such a way the enterprise revenues of cellular will grow organically and quietly, fight the competitive erosion of voice and personal data pricing among consumers, and uncover markets like vehicle-to-everything (V2X) communication – once again, at scale. It doesn’t help that 5G hardly addresses this, as its design focuses on UHD video delivery, not M2M excellence. It seems to be the wrong tool, designed at the wrong time for the wrong job.

The need to build out the smaller cells at the top end of the data delivery curve (100 GHz) and the bottom end of the data fragility curve (line of sight) will have to be aligned with specific corporate, government and utility projects that will take up vast quantities of Gigabit-class communications. Failure to do that will mean 5G spending halts – not so much making the system stillborn as giving it an unimaginably long adolescence.

But this is a chicken-and-egg problem which will unravel over a 10 to 15-year period. 5G will not look like 5G at first – not until it is built out to small cells across all spectra. By 2030 optimistically; 2035 is more likely. And by that time none of the companies we have talked about will be stock market darlings any longer.

So it is just as well that 5G components will sit happily inside a sub-6 GHz LTE and LTE-A network, just as easily as in a pure 5G network. No root and branch upgrade will be needed this time. 5G will not involve another big bang upgrade (economically and logistically unthinkable for most operators for another decade). Instead, it will be phased in gradually, alongside LTE, and on the MNOs’ own timetable. Which is why we won’t really see it as consumers until, say, 2025-2030 or later.

A key issue will be what prices the mass uptake of WiFi allows operators to charge. The “disease” of consumers thinking that all WiFi is free will not be cured by paid data becoming cheaper – it has to be perceived as being free. So in order to retain their existing core revenues while they work to transition both to 5G and to an entirely new growth model, MNOs will have to adjust to WiFi pricing, which means continual erosion within their pricing models.

The combination of a lowering price ceiling, a disenchanted customer base and the increasing capability of WiFi will squeeze operators between constant capex spend and new marketing initiatives to find new classes of customers. This is a toxic cocktail that will finish off many MNOs, pushing them to consolidation before 5G revenues take hold. And any equipment vendor not taking account of every step of this journey will evaporate. So unless we get ALL of this right, tech businesses will only become victims of 5G rather than beneficiaries.
