5G… and beyond: Three standardization challenges for the future of communication

Generations of mobile technologies come around at the rate of about one every ten years. Ten years is also roughly the amount of time it takes to create them. No sooner has a generation been offered to end consumers than researchers are working on the next one. It is therefore hardly surprising that we are already seeing technologies which could appear in the context of a 5G+, or even a potential 6G. That is, as long as they manage to convince the standardization bodies, which decide on the selected technologies, before their final choices are made by 2019.

Researchers at IMT Atlantique are working on new, cutting-edge technologies for transmission and signal coding. Three of these technologies are presented here. They offer a glimpse of both the technical challenges of improving telecommunications and the standardization stakes ahead of 5G's commercial roll-out.


Turbo codes: flexible up to a point

Turbo codes were invented at IMT Atlantique (formerly Télécom Bretagne) by Claude Berrou in 1991. They are an international standard in error-correcting codes; in particular, they are used in 4G. Their advantage lies in their flexibility. “With one turbo code, we can code any size of message” highlights Catherine Douillard, a digital communications researcher at IMT Atlantique. Like all error-correcting codes, they correct more errors as transmission quality improves. However, there is a threshold beyond which their correction rate no longer improves despite a better signal.
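The basic principle of error correction can be illustrated with a deliberately simple stand-in, a (3,1) repetition code with majority-vote decoding; turbo codes themselves are far more sophisticated, and the function names and probabilities below are illustrative choices, not taken from the article:

```python
import random

# Toy (3,1) repetition code: each bit is sent three times and decoded
# by majority vote. A simple stand-in for illustration only; real turbo
# codes achieve far better correction at far lower redundancy.

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

def transmit(coded, flip_prob, rng):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in coded]

rng = random.Random(42)
message = [rng.randint(0, 1) for _ in range(10_000)]
for p in (0.10, 0.01):  # better channel quality -> fewer residual errors
    received = transmit(encode(message), p, rng)
    errors = sum(a != b for a, b in zip(message, decode(received)))
    print(f"flip probability {p}: {errors} residual errors")
```

As the channel improves, residual errors fall rapidly; the threshold effect mentioned above means that, for real codes, this improvement eventually flattens out.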

“We have recently found a way to solve this problem, which has led to a patent with Orange” explains Catherine Douillard. Turbo codes could therefore have remained the go-to error-correcting codes in telecommunications. However, the standardization phases for 5G have already begun. They are divided into three phases, corresponding to the three types of use the new generation is expected to serve: higher data rates, ultra-reliable communications, and machine-to-machine communications. For the first, other error-correcting codes have been selected: LDPC codes, based on the work of Robert Gallager at MIT in 1960. The control channel will be protected by polar codes. For the other two uses, standardization committees are due to meet in 2018. Turbo codes, polar codes and LDPC codes will once again be in competition.

Beyond the technological battle over 5G, the three families of codes are also being examined closely for longer-term scenarios. The European H2020 project Epic brings together manufacturers and researchers, including IMT Atlantique, around one issue: increasing the throughput of error-correcting codes. Turbo codes, LDPC codes and polar codes are being examined, reworked and updated in parallel. The goal is to make them compatible with decoding signals at rates of around a terabit per second. To achieve this, they are implemented directly in the hardware of mobile terminals and antennas (see the insert at the end of the article).


FBMC: a new waveform to replace OFDM?

If 5G is to accommodate communication between connected objects, it will have to make room on the frequency bands for machines to talk to each other. “We will have to make holes at very specific frequencies in the existing spectrum to insert communication by the Internet of Things” says Catherine Douillard. But the currently standardized waveform, OFDM (Orthogonal Frequency-Division Multiplexing), does not allow this: its interference level is too high. In other words, the hole in the frequency band would not be “clean”, and would suffer interference from adjacent frequencies. Another waveform is therefore being studied: FBMC (Filter Bank Multi-Carrier). “With this, we can take out a frequency here and there to insert a communication system without disruption” the researcher sums up.
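The leakage problem can be made concrete with a small calculation (a general signal-processing fact, not a figure from the article): an OFDM subcarrier is a rectangular pulse in time, so its spectrum is a sinc function whose first sidelobe sits only about 13 dB below the main lobe, which is why carving a hole in the band still disturbs neighboring frequencies.

```python
import math

# Power spectrum of a rectangular (OFDM-style) pulse, expressed in dB
# relative to the main lobe. The spectrum is |sinc(x)|^2, where x is
# the frequency offset in units of the subcarrier spacing.

def sinc_power_db(x):
    """Relative power of a rectangular pulse's spectrum at offset x."""
    if x == 0:
        return 0.0  # main-lobe peak, the 0 dB reference
    s = math.sin(math.pi * x) / (math.pi * x)
    return 20 * math.log10(abs(s))

# The first sidelobe peak of the sinc occurs near x ≈ 1.43.
print(round(sinc_power_db(1.4303), 1))  # ≈ -13.3 dB below the main lobe
```

A filtered waveform such as FBMC shapes each subcarrier with a prototype filter whose sidelobes fall off much faster, which is what keeps a carved-out slot usable.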

FBMC also provides a better quality of service when mobile terminals move quickly within a cell. “The faster a mobile terminal moves, the stronger the Doppler effect” explains Catherine Douillard, “and OFDM is not very resistant to this effect”. And yet 5G is supposed to provide good communication at speeds of up to 400 kilometers per hour, such as aboard a TGV train, on the classic 4G frequency bands. The advantage of FBMC is even more significant at millimeter frequencies, since the Doppler effect grows with the carrier frequency.
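As a rough order of magnitude (the carrier frequencies below are illustrative choices, not values from the article), the maximum Doppler shift is f_d = (v / c) · f_c, so at a given speed it scales directly with the carrier frequency:

```python
# Back-of-the-envelope Doppler shift: f_d = (v / c) * f_c.
# The 2 GHz and 28 GHz carriers below are illustrative examples of a
# classic cellular band and a millimeter band, not article figures.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(speed_kmh, carrier_hz):
    """Maximum Doppler shift for a terminal moving at speed_kmh."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v / C * carrier_hz

print(round(doppler_shift_hz(400, 2e9)))   # ~741 Hz on a 2 GHz band
print(round(doppler_shift_hz(400, 28e9)))  # ~10.4 kHz on a millimeter band
```

At TGV speeds, the shift on a millimeter band is roughly fourteen times larger than on a classic band, which is why waveform robustness to Doppler matters more there.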

OFDM is already used for 4G, and for the time being the 5G standardization bodies are keeping it as the default waveform. But we probably haven’t heard the last of FBMC. It is more complex to set up, but researchers are working to simplify its implementation. Again, the next phases of standardization could be decisive.


NOMA: desaturating the long-used frequency bands

The frequency bands currently used for communications are becoming ever more saturated. Millimeter frequencies could certainly alleviate the problem, but they are not the only avenue being explored. “We are working on increasing the capacity of systems to transmit more data on the same bandwidth” explains Catherine Douillard. NOMA (Non-Orthogonal Multiple Access) places several users on the same frequency band. Interference is avoided by allocating each user a different power level, a technique known as power-domain multiplexing.

“The technique works well when we associate two users with different channel qualities on one frequency” the researcher explains. In concrete terms, a user close to an antenna can use NOMA to share the same frequency with a user further away. However, two users at the same distance from the antenna, and therefore with roughly the same quality of reception, could not share it. This technique could thus help resolve the cellular saturation problem that 5G aims to address.
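The power split can be sketched with two BPSK users, under illustrative power levels chosen here (more power to the far user); the function names and values below are assumptions for the sketch, not the article's scheme. The far user simply decodes its strong signal, while the near user first decodes and subtracts the far user's signal (successive interference cancellation) before decoding its own:

```python
import math

# Minimal sketch of power-domain NOMA with BPSK symbols, noise-free.
# Illustrative power allocation: 0.8 to the far user, 0.2 to the near one.

P_FAR, P_NEAR = 0.8, 0.2

def superpose(bit_near, bit_far):
    """Base station: add the two users' BPSK symbols with unequal power."""
    s_near = math.sqrt(P_NEAR) * (1 if bit_near else -1)
    s_far = math.sqrt(P_FAR) * (1 if bit_far else -1)
    return s_near + s_far

def decode_near(y):
    """Near user: decode the stronger (far) symbol first, subtract it
    (successive interference cancellation), then decode its own bit."""
    bit_far = 1 if y > 0 else 0
    residual = y - math.sqrt(P_FAR) * (1 if bit_far else -1)
    return 1 if residual > 0 else 0

# Noise-free sanity check over all four bit combinations.
assert all(decode_near(superpose(bn, bf)) == bn
           for bn in (0, 1) for bf in (0, 1))
```

This also shows why two users at the same distance cannot share the band: with similar channel qualities there is no sensible power split, and the subtraction step becomes unreliable.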

This article is part of our dossier 5G: the new generation of mobile is already a reality

Algorithms implemented directly in hardware

Algorithms are not necessarily lines of programming code. They can also be implemented directly in integrated circuits, using transistors that act as logic gates. “The advantage is that these algorithms take up far less space this way than when they have to be executed on processors”, specifies Michel Jezequel, head of the electronics department at IMT Atlantique. “They are also faster and consume less energy”, he continues. To push turbo codes to rates of around a terabit per second, for example, there is no choice but hardware implementation: the equivalent software would not be able to process the data fast enough.
