5G specs not as final as they seemed – are 3GPP processes still fit for purpose?

The number of features that have needed to be standardized for 5G, and the number of submissions for each one, have been 10 times greater than for 4G, or more, say the companies involved in the 3GPP process. The intensity of the development was illustrated this week when the 3GPP released a set of change requests for its already ‘finalized’ 5G New Radio standards. These changes show how complex this new network is, and how there will never be a ‘final’ set of specs to address all the requirements.

But they also lend weight to criticisms of the 3GPP’s decision, last year, to fast track a subset of standards to get 5G to market more quickly – an approach which Nokia and others said was too hasty and distracted from work on the core, with the risk of bugs and gaps being found later. The new changes should not delay first deployments because they relate to software layers which do not need to be implemented in 5G NR NSA. But they will spread uncertainty, raising questions about whether the standards processes still work in the 5G era, or whether new structures such as open source would be more appropriate.

Many vendors are talking about a more flexible, granular process, which would adopt technologies like microservices to enable a stream of small changes to be issued, rather than a major release every couple of years. That change took place long ago in the computing world, and as that converges with telecoms, we can expect that, along with virtualization and cloud computing, the telco platform will embrace more flexible ways to develop and share new features.

The rising impact of open and open source processes, not just on telco cloud platforms but the mobile network itself, will help to hasten this shift of attitudes, urged on by initiatives such as Facebook’s Telecom Infra Project (TIP), which are seeking to bring the norms of the cloud world to operators. But for now, companies have to live with a standards process which increasingly looks far too slow and rigid for the rapidly evolving 5G platform, and the diverse use cases it is designed to support.

The 3GPP said its new change requests are not backwards compatible, which means operators and vendors will have to agree whether the final standard includes these additions, or whether to stick with the Release 15 specifications agreed in June (or a hybrid of the two).

This could cause new uncertainty for operators, and a degree of wasted effort for vendors which are testing chips, devices and infrastructure for commercial products to be released next year.

“We have not announced which version our products will be based on. These changes are relevant to both base stations and devices, but there is no need for any new chipsets,” Lorenzo Casaccia, head of 3GPP standards work for Qualcomm, told EETimes.

The eight changes are all labelled ASN.1, which means they sit at the software layer, and at least five of them came from the 3GPP RAN2 group, which is mainly focused on software functions, not the radio physical layer. So while these changes appear to be more than simple bug fixes, they should be supportable with software or firmware updates, not a major redesign.
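The backwards compatibility question hinges on how ASN.1, the notation used to define RRC signalling messages, handles change. ASN.1 provides an extension marker (`...`) after which new fields can be added in later versions without breaking older decoders; alterations to fields defined before the marker, by contrast, are not backwards compatible. A minimal, hypothetical sketch illustrates the distinction (the type and field names here are invented for illustration and are not taken from the actual RRC specification, TS 38.331):

```asn1
-- Hypothetical fragment; identifiers invented, not from TS 38.331
ExampleConfig ::= SEQUENCE {
    bandwidthPartId   INTEGER (0..4),              -- editing existing fields
    harqMode          ENUMERATED {sync, async},    -- is NOT backwards compatible
    ...,                                           -- extension marker (ITU-T X.680)
    newTimer          INTEGER (0..255) OPTIONAL    -- additions after the marker
                                                   -- can be skipped by older decoders
}
```

A change request that only appends optional fields after the marker can usually be absorbed in a software update; one that redefines the original fields forces implementers to pick a version, which is the choice the 3GPP has now put to operators and vendors.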

One of the issues raised by such developments is whether the Release 15 standards have been rushed out the door. The advantages of a rapidly changing, microservices-style process are clear, but there are also benefits to a robust standards process which takes the time to ensure that all requirements are met, and bugs fixed.

When the 3GPP gave in to pressure from some operators, led by AT&T, to fast track the non-standalone variant of 5G New Radio (5G NR NSA), critics of last year’s decision voiced concerns that this would distract from efforts to get the core network fully specified, and could delay availability of a total platform.

Mike Murphy, chief technologist for Nokia in North America, told EETimes that, in its rush to market, the 3GPP “tried to push the spec too quickly, targeting December last year, and as a side-effect a lot of change requests are coming through. People thought all the non-backwards compatible ones would be finished in June.”

When the 3GPP decided to do Non-Standalone first, some claimed the splitting of the NR work would delay essential aspects which should have been finalized in tandem with the radio. In particular, Nokia and others raised fears that the work on the packet core and the RAN was becoming too separate, because NR NSA does not require the 5G core. Enrique Blanco, CTO of Telefonica, argued when the split was announced in March 2017 that it was a backwards step, preventing 5G NR technology from evolving to support emerging use cases.

Later it emerged that some Release 16 work had been delayed because of the resources needed to get 5G NR NSA completed in time. At a plenary 3GPP meeting in Japan in November 2017, several study items were postponed. Balazs Bertenyi of Nokia, chairman of 3GPP RAN, said at the time: “Given the challenges we have to finish Release 15 on time, we are going to put the study items on hold.” The delayed items included 5G-Unlicensed (which may not make it into the standards until Release 17); non-terrestrial network (channel modeling); an enhanced V2X evaluation methodology; integrated access and backhaul; and non-orthogonal multiple access (NOMA).

Cohere Technologies was another critic. It had submitted its OTFS (Orthogonal Time Frequency Space) waveform to 3GPP RAN1, supported by AT&T, China Mobile, Deutsche Telekom, Telefonica and Telstra. Cohere VP Anton Monk said of the fast tracking decision: “Some big vendors wanted to get something done fast and slap a 5G name on it” and that meant the first standards added little to what was also being developed for LTE-Advanced Pro, rather than considering whole new approaches like Cohere’s. Monk described the first wave of Release 15 as “LTE with Massive MIMO and beamforming — nothing really new except for including Huawei’s polar codes in the control channel and using LDPC (low density parity check) everywhere else.”

The other key issue raised by these new Release 15 updates is whether traditional standards processes like the 3GPP’s are still fit for purpose in a world where requirements and behaviors change so rapidly, and most operators need maximum flexibility to address any use cases 5G may throw at them. By contrast with the fluidity of the markets in which MNOs find themselves, the processes seem slow and rigid.

The Wi-Fi Alliance has been particularly effective at reducing the risk for its members of deploying equipment before standards are finalized, even introducing certification for pre-standard specifications. The 3GPP has no such staged approach, so pre-standard implementations are expensive and risky, and therefore rare. NTT Docomo spent years bringing its homegrown version of UMTS into line with the full standards; Verizon has invested a great deal of time and money in its pre-standard 5G fixed wireless technology, though the signs are that aligning it with the final standard will be less arduous.

Most operators, though, are stuck with waiting for the 3GPP’s wheels to grind before they can deploy. Sometimes specifications appear with gaps or bugs, as in this case, or where some capabilities lag the market requirement (5G-Unlicensed, or full ultra-low latency support, for instance, will wait for Releases 16 and 17). This raises understandable interest in adopting the norms of the open source community.

But there are risks as well as rewards in the open process, and most players are looking towards a hybrid approach where the speed and broad-based innovation of open source can be tempered by the clear direction and strong quality control of a formal standards process. Indeed, the Linux Foundation itself has published a white paper outlining how different open source initiatives, and standards bodies, could work together to avoid duplication or fragmentation.

In the sceptical corner, Shahar Steiff, an assistant VP at PCCW, said recently that open source “only provides half of the things we need – the code but not the information model or standards. Yes, it is faster than proprietary code, but with a standard I don’t care if it is open source or proprietary code.” And he added that, while open source was ‘free’ upfront, it cost a great deal to implement effectively in an operator’s systems.

Even AT&T, whose ECOMP is now the foundation of the open source initiative ONAP (Open Network Automation Platform), has some doubts. Rupesh Chokshi, an assistant VP at the telco, told LightReading that open source will not deliver the same “quality, performance and reliability” as proprietary technologies, or those defined by conventional standards bodies, and this will be particularly sensitive in the area of security.

But others see open source processes, when they are made robust by broad industry participation and a disciplined governance, as the way around the limitations of the IEEE and 3GPP. Sprint’s VP of technology Ron Marquardt wrote in a blog post last year: “Open source is a model that works. Open source development allows for very rapid innovation, with a vibrant community providing a high volume of contributions at a relentless pace. The open source model streamlines processes and removes confusion on specifications and their interpretation because the actual implementation is what matters, not the verbiage of a written document … The lines between open source prototypes and normative standards are blurring, and Sprint will continue to contribute to these advancements in our industry.”

Foundation calls for cooperation with standards bodies

The Linux Foundation, which hosts several important carrier-oriented open source projects, has been calling for telecoms standards bodies to work together with the open initiatives to accelerate progress and avoid damaging splits or fragmentation.

Multivendor interoperability and automation across multiple NFV/SDN efforts will require close coordination, the Foundation wrote in a white paper last year, and it continues to push for recognition, on both sides, of what each does best.

The white paper, entitled ‘Harmonizing Open Source and Standards in the Telecom World’, points out legal and intellectual property challenges and ways to work together; it also maps the various open source groups and existing standards bodies onto a unified architecture, pointing out areas where they overlap or duplicate effort.

The Foundation’s head of networking, Arpit Joshipura, said: “There’s a place for standards and there’s a place for open source, and the two of them can be the best of friends.”

He added: “As a neutral party, we would like to facilitate those conversations, but some of that may take place in the standards groups themselves. The first step is project by project, standard by standard. We will look at introduction goals, what are complementary things.

“We agree on the end results but the paths are different on how we get there and when the paths are different, it’s important to get terminology aligned and processes aligned and start the dialog for the actual architecture and the integration. Open API does not mean the same thing when I say it as when a vendor says it or when another open source project says it.”

The major open source networking initiatives hosted by the Linux Foundation include OpenDaylight, OPNFV, ONAP, Open vSwitch, OpenSwitch, IO Visor, ON.Lab, CORD and ONOS.