Mobile network operators are adopting the virtualized radio access network (vRAN) model for its potential benefits, including reduced total cost of ownership, improved performance and scalability, and future-proofing networks for 5G upgrades, noted SNS Telecom in a summary of its recent report on the vRAN market through 2030.
SNS Telecom defined vRAN as an implementation in which some or all baseband functions are run on commodity hardware as virtualized network functions (VNFs), separated from the remote radio unit. Despite all the hype, it’s early days still for vRAN, with “most investments focused on virtualized small cells for targeted greenfield deployments and pilot engagements for macrocell coverage.”
That will change as operators begin to realize the benefits of RAN virtualization, SNS Telecom predicted; by the end of 2020, the market for vRAN deployments will be worth approximately $2.6 billion, boosted by the race toward 5G.
The role of NFV
Network functions virtualization (NFV) is often touted as being a crucial component of deploying vRANs, but how to implement this technology in the real world is turning out to be a lot more complicated than initially thought. Business models and interoperability remain universal challenges that haven’t yet been solved, although the specifics of those and other obstacles vary from one operator to another, said Light Reading’s Ray Le Maistre in a recent article about NFV developments over the past few years.
“Every communications network operator knows that network functions virtualization (NFV) is part of its future, so an NFV implementation strategy is needed,” Le Maistre said in the article. “The major problem facing operators, though, is that each one has a unique network and different challenges.” To make NFV work, Le Maistre suggested, the industry needs more third-party testing, less fragmented efforts to develop compatible management and network orchestration (MANO), and further development of NFVi products.
“What operators really need is the ability to be able to select the NFVi and MANO that best suits their current situation, future plans and resources and then be confident that they can either deploy third-party (or in-house) VNFs with the confidence that all the piece parts will interoperate and enable them to focus on delivering profitable applications and services to their customers,” Le Maistre summarized.
It’s possible to get to that point, he concluded, if concrete action is taken soon to move past current NFV challenges, especially around “MANO harmonization and the development of a single API that can bridge the NFVi and MANO platforms.”
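The kind of bridge Le Maistre describes can be pictured as a thin abstraction layer between MANO and NFVi. The sketch below is purely illustrative (the class and method names are invented, not drawn from any ETSI specification or vendor API): a MANO workflow is written against one abstract interface, so any conforming NFVi backend becomes swappable.

```python
from abc import ABC, abstractmethod


class NfviDriver(ABC):
    """Hypothetical uniform interface a MANO platform could code against,
    so any conforming NFVi backend is interchangeable."""

    @abstractmethod
    def allocate_compute(self, vcpus: int, memory_gb: int) -> str:
        """Reserve resources and return an instance identifier."""

    @abstractmethod
    def deploy_vnf(self, instance_id: str, image: str) -> bool:
        """Deploy a VNF image onto a previously allocated instance."""


class MockNfviDriver(NfviDriver):
    """Stand-in backend; a real driver would call the NFVi's own API."""

    def __init__(self):
        self.instances = {}

    def allocate_compute(self, vcpus, memory_gb):
        instance_id = f"inst-{len(self.instances) + 1}"
        self.instances[instance_id] = {"vcpus": vcpus, "memory_gb": memory_gb}
        return instance_id

    def deploy_vnf(self, instance_id, image):
        if instance_id not in self.instances:
            return False
        self.instances[instance_id]["image"] = image
        return True


def orchestrate(driver: NfviDriver, image: str) -> str:
    """A MANO workflow written only against the abstract interface."""
    instance_id = driver.allocate_compute(vcpus=4, memory_gb=8)
    if not driver.deploy_vnf(instance_id, image):
        raise RuntimeError("VNF deployment failed")
    return instance_id
```

The point of the sketch is the indirection: `orchestrate` never touches a concrete NFVi, which is roughly the interoperability property operators are asking for.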
Opening up to open source
Let’s not forget about the role open source is playing in telecom’s move toward virtualization. Light Reading’s Editor-in-Chief, Craig Matsumoto, fairly characterized open source for telecom as having been a bumpy road so far, requiring early adopters to get their hands dirty and do some groundbreaking DIY work.
For example, Telefonica eschewed contracting a large integrator or vendor and instead did its own open source architecture development, Matsumoto said, citing the operator’s VP of networks innovation, Patrick Lopez, speaking at the recent NFV World Congress event. AT&T, likewise, created its own MANO platform (ECOMP), which it then released to the open source community for further development.
“Operators themselves have triggered the creation of multiple open source MANO projects, with some overlap,” Matsumoto noted. Despite all that activity, or maybe because of it, interoperability remains an issue.
Virtualization enables cloud
In a broader context, virtualization has a strong relationship with ‘cloudification,’ although the two are not the same thing, noted Heavy Reading analyst James Crawshaw in a recent article.
Virtualization, he clarified, means more use of software on a given amount of physical infrastructure; it can occur at network, compute, or storage resource level. Cloud computing, on the other hand, delivers shared computing resources on demand through public (internet) or private networks.
“Cloud computing makes use of virtualization to enable the elasticity and achieve economies of scale,” Crawshaw said in the article.
He elaborated that, in the context of NFV, it’s not accurate to say that all equipment vendors need to do is port applications from ASICs and FPGAs to virtual machines running on x86 processors. That’s because, where physical network functions are stateful applications designed to scale up, cloud-native applications are stateless, in that there is “a clear separation between application processing and the associated data,” and are designed to scale out.
Therefore, “Unless we re-architect the application to be stateless we gain few of the benefits of cloud computing.”
Telecom carriers wanting to achieve cost savings by mimicking the operation models of web-based providers see network virtualization as a means to do that, said Dan Meyer, Senior Editor at SDxCentral, in a recent article. For example, Verizon’s virtualization plans involving SDN, NFV, and cloud are heavily focused on driving down costs and increasing market opportunity.
In line with that, Verizon recently “unveiled plans for an open source white box solution for its universal customer premise equipment portfolio targeted at enterprise customers looking to save investment on separate hardware appliances for virtual network functions (VNF) such as software-defined wide-area networks (SD-WAN), security, routing, WAN optimization, and other network function that can be virtualized,” Meyer reported.
Similarly, Meyer noted, Sprint recently mentioned its use of virtualization to drive down CAPEX, and CenturyLink even put a number on this benefit, saying it is on track to “see at least $200 million in annual capex reduction tied to the build-out of its virtualized network, with the full build-out still scheduled for 2019.”
We would add that it may take some time for operators to realize significant cost savings through virtualization, although cost reduction certainly is a reasonable business case for adopting new technology. The ability to compete on quality of experience (QoE) and greater agility are more likely to be the shorter-term benefits of moving toward software-based networking.