To embed our video on your website, copy and paste the code below:
<iframe src="https://www.youtube.com/embed/vAPoVskdFPQ?modestbranding=1&rel=0" width="970" height="546" frameborder="0" scrolling="auto" allowfullscreen></iframe>
Tony Poulos, TelecomTV (00:07):
Hi, I'm Tony Poulos from TelecomTV, and I'm at MWC 2026 in Barcelona. Today, I'm going to find out a lot more about AI-ready networks and unlocking new workloads. And to help me through all of this, I have four thought leaders from the industry: none other than Kyriakos Exadaktylos, who is the Head of Network Architecture Specifications and Energy Performance at Vodafone. You have a very long title, so it must be a very important job. That one is. Cristina Rodriguez, who's VP and GM of Network and Edge at Intel Corporation. Cristina, welcome. Great to have you on board. We have Daniel Borrás, who is Head of Marketing and Business Development at Samsung. Welcome. Thank you. And last but certainly not least is David Trigg, he is VP Telecom Sales and GTM at Dell Technologies. Now let's kick off. Kyriakos, since you're the closest, I'm going to ask you the first question.
(01:02):
With the anticipated network advancements of 5G Advanced, how critical is it for CSPs to adopt cloud-native architecture to support that evolution? And, secondly, when do you believe is the right time to make the transition?
Kyriakos Exadaktylos, Vodafone (01:17):
Very good. Good morning, everyone. So 5G Advanced is here now; we don't have to wait for 6G. It comes to provide us with multi-service, multi-purpose networks. And on top of 5G standalone, we can have new capabilities like network slicing, RedCap and mission-critical networks. For all of that, you need an architecture that is ready and future-proof to support these new capabilities, plus AI and automation. So there is a paradigm shift from hardware-driven networks to software-driven evolution and automation. And for all of that, you need openness: open interfaces, cloud-native architecture and Open RAN.
Tony Poulos, TelecomTV (02:01):
Well, we're hearing a lot about that at this particular event as well. But Daniel, I'm going to flick over to you now. When we talk about networks designed to be open, as Kyriakos just said, and integrated, disaggregated and programmable, wow. How does this approach impact the way they are built, operated, and then upgraded?
Daniel Borrás, Samsung Networks Europe (02:21):
In practice, it means moving towards an end-to-end software-driven network across the radio access, the core and the transport. Basically, it means you don't have monolithic systems: the network is built from modular software components that are connected by open interfaces and run on open hardware platforms. With this, you are moving towards a purely software-driven architecture that is future-proof and open.
Tony Poulos, TelecomTV (02:57):
Now, David, for some CSPs it's still early in their journey on this. What are the first two or three steps that you recommend to modernise their foundation without disrupting current services? That's the critical part: without disrupting their current services.
David Trigg, Dell Technologies (03:13):
Well, that is the critical part, and it has been a big challenge because, in fact, to your earlier question about how this impacts them: it impacts them in all ways, everything around how they design, build, run, operate and monetise their network. The technology and the technology integration are really hard, so you've got to start somewhere. But just as hard are all the people changes and the process changes, because you've got to continue to operate the legacy, traditional network while you're also learning how to run and operate the new one. What we've seen within telcos is that the biggest problem they have is that they're afraid to start. So you've got to start somewhere. And I know that if you start with a single stack or a single component in your network, that can still feel like a silo, or a different kind of silo, but it starts to give you the foundation, the skills and the processes to then be able to expand beyond it.
(04:09):
But you've got to be very purposeful about how you design and operate that, because if you do it incorrectly, it can lock you into just a different style of stack. So you've got to make sure that the design can translate across your network as you start to put in additional workloads or different capabilities. I always say: just start somewhere. We see most telcos start somewhere in the OSS/BSS, as it's a little more like a traditional data-centre workload that's been going through this transformation forever, but then it extends further into the network, into the RAN. The good news is that the telcos that have done this, and there are only a handful in the world, are already seeing the operational benefits. They say it was really painful, really hard at first, but now it's proven. The operational benefits are proven.
(04:54):
So we're starting to see that, which is a great thing.
Tony Poulos, TelecomTV (04:56):
Well, the next question's open to all of you, so jump in. Looking at the ecosystem collaboration between Dell, Intel, Vodafone and Samsung, how does each party contribute, and what types of challenges are best solved together rather than individually? Cristina, do you want to start with that one?
Cristina Rodriguez, Intel (05:15):
Yes. Yes. I love that question, because I think what is important at this point is that the technology, in every different aspect, is ready. If you look at the four of us: Intel has the silicon underpinning the architecture, ready for the job that needs to be done. We have all talked about it, you all talk about it: operators need to go to cloud native, to a software-defined network with open interfaces, and start deploying software that carries the whole wireless stack, which is going to allow us to move innovation at the speed of software. That's very important right now: being very flexible, very scalable, and able to adopt all the new technologies. But then you look at us: how do we make that possible? We have Intel, with the right technology in the semiconductors, in our CPUs and SoCs, underpinning that architecture.
(06:17):
Well, before Samsung, you look at Dell, and they have the servers put together exactly right, built for the network. Then you look at Samsung: they have the whole wireless stack running on those servers. And then you look at Kyriakos and Vodafone, being the leaders in the industry, deploying for real: cloud native, open interfaces, Open RAN, with fantastic KPIs.
Kyriakos Exadaktylos, Vodafone (06:51):
Open that up.
Cristina Rodriguez, Intel (06:51):
Real Open RAN, without compromising any KPIs. In fact, you have talked about how good that network is. So this is where the ecosystem comes together. This is what we have done working together.
Tony Poulos, TelecomTV (07:08):
I think Cristina answered for all of you there. Yes. Everyone has a part to play.
Kyriakos Exadaktylos, Vodafone (07:13):
Exactly. And when we started with Open RAN, one of the most important things we knew from day one was that there is one thing different in Open RAN compared with traditional RAN that you have to get right, and you can only get it right with collaboration. And that's the system integration work,
(07:30):
Right? So, the software release packaging: because we have Dell, we have Wind River, we have Samsung, how do you do the software release packaging, and how do you align the software releases? So we set a target this year. We said everybody, all parties, has to commit to a single server configuration. We got the Intel Granite Rapids and we said we start with 40 cores, 64 cores, 72 cores. How are we going to put that into the network so that every site has a single server, no more than one server? Because that will allow us to have low TCO, the best energy consumption and better performance for our customers.
Tony Poulos, TelecomTV (08:10):
Well, I was going to come to you on that exact thing in a moment. How do Open RAN deployments compare to traditional RAN? We just touched on that. And what has Vodafone's experience revealed about the main barriers, whether technical, commercial, organisational or regulatory, that hinder more widespread Open RAN adoption from a performance and cost perspective? It's a big question. But additionally, were there any unexpected findings as you rolled out in this area, either positive or otherwise, that stood out during this journey?
Kyriakos Exadaktylos, Vodafone (08:43):
Yes. So if I take the main pillars, like performance: we are now deploying in Germany a full commercial network, real Open RAN, in an urban environment. And we've seen that our KPIs, both 4G and 5G, are now equal to or better than the previous conventional RAN, which is a remarkable thing for the industry. That was not easy, and it was not like that from the start, but we have now achieved it. And we want to be even better; we don't want to stop there, because when we introduce 5G Advanced, it's going to be even better than what you had before, not equal, right? Yep. So on performance, things are moving really, really well. And we will go to the first city now, before the summer, okay?
(09:26):
Yes. The second thing you mentioned is cost. Yes, of course, you have to get the right cost. So what we did was a global tender: we awarded strategic suppliers and there were, of course, economies of scale. You need volumes in order to create the right TCO and the right price, and we achieved very competitive pricing for Open RAN. Now, when it comes to cost optimisation, we look at things like server configuration and multi-band radio from Samsung, so you can have several bands in one single radio, which allows you to be far more efficient. But what will make you even more efficient is the automation part.
(10:08):
The thing with SMO and Open RAN networks is that they will allow you to deploy at scale in a much more operationally efficient way. And this automation is the new thing that will allow you to deploy applications for energy saving and performance; that's the new architecture. We now have to take the benefits from the new architecture.
Tony Poulos, TelecomTV (10:29):
Oh, for Cristina this time: how is the open ecosystem of vendors, integrators and operators working together to improve interoperability and simplify integration? I mean, you've got to get together and agree on what you're going to do, right?
Cristina Rodriguez, Intel (10:43):
And we have always said that. It's the collaboration in the industry and the collaboration between the partners and the ecosystem; it's super important. And this is important: it helps us that we're basing this architecture on an open, common platform, based on a general-purpose processor, our Xeon products, and it's a platform that everybody knows. There are millions of software engineers out there who understand that platform. We have been in this industry for quite a while and we have learned a lot. We have been very intentional in our roadmap for a few generations, and we have been at this for several years. With every generation we have doubled the performance per watt of our chips, doubling the performance within the same power envelope or lower. And as Kyriakos was saying, now we have come to Granite Rapids. This is our latest generation, the CN6 generation, going all the way to 72 cores.
(11:54):
But again, we have that platform working with Dell. We have worked with Dell for many years; we are one of their biggest partners in the world. We collaborate on questions like: what is the best design? How can we have the best power consumption? How can we have multiple SKUs that satisfy what Kyriakos is trying to do? Same thing with Samsung: we work very closely together. How do we optimise the software? How do we optimise memory? This is collaboration. We talk every week; our teams talk every day. We're working very closely. And understanding Kyriakos's requirements, of course, what he's trying to do in the network, is very important. So the fundamental part is that close collaboration between us.
David Trigg, Dell Technologies (12:42):
And I think one of the most important things is that we're very lucky to have industry-leading customers like Vodafone who, quite frankly, are pushing us, right? Because without that, sometimes people say, "You guys aren't working well together." Having somebody willing to go out there and be industry-leading kind of forces us, because what we have to deliver hasn't changed, right? We have to deliver a high-quality service to a customer, to their end customer, but how we're doing it changes ... everything else changes, right? So we're very fortunate to have an industry-leading customer pushing us, and then we're figuring it out and learning and learning and learning. It's been a great collaboration.
Tony Poulos, TelecomTV (13:23):
So I wanted to find out who was the driver. Who is the driver? Is it the customer? Is it the end user? You just answered that very well.
Daniel Borrás, Samsung Networks Europe (13:31):
It's very important to have central coordination, like an orchestra conductor, to make sure everyone knows exactly what they need to deliver, because it's a very complex ecosystem, and in projects moving at very fast speed, like Germany, we have to introduce the solutions very quickly. So this is extremely important.
Cristina Rodriguez, Intel (13:47):
And you were asking who's driving it? The industry. At some point, I think all of us realised that we needed to do things differently: that something purpose-built, handcrafted specifically for the network, was not going to scale. It wasn't flexible enough. It couldn't scale. It wasn't future-proof. We were not talking about AI years ago, but we knew the point would come when we needed something more flexible, more open. And I tell you this: now I can't see AI being deployed to its full potential without a software-defined, open network. The industry realised that, and I think it all got to a point of, okay, we need to make this happen. But I would say Vodafone has been at the forefront of the leadership of the industry.
Tony Poulos, TelecomTV (14:49):
You're not only the leader of the group, but now they're saying that you're maybe the-
Kyriakos Exadaktylos, Vodafone (14:55):
Leaders of the industry. No question.
Tony Poulos, TelecomTV (14:58):
Well, look, I'm impressed. But I'm going to ask you now: as networks evolve, how do you see the shift from connectivity-only playing out? We always think of networks as just being the connectivity part, but we're moving from connectivity-only to a platform for services, ideally leading to edge applications, I suppose. Where do you see that going, and how critical is the new environment to doing that?
Kyriakos Exadaktylos, Vodafone (15:22):
Yes. As you say, things are moving from basic connectivity to value added. With 5G Advanced, we can now have multi-service, multi-purpose networks on which we can deliver new service enablers, right? And that requires cloud-native, software-driven networks with network APIs where you can expose certain capabilities. So you have quality of service, latency, ultra-reliability, all of these now being exposed and consumed by applications. And this is where the edge applications you mentioned come in. The AI, as Cristina mentioned, starts now from the vRAN. We need AI in the cell site. Why? Because we need to improve performance; that's the starting point, right? So you need to put it on the base stations, on the radio units, to improve channel estimation, link adaptation and all of that. But when you go to the edge, AI will start introducing value for monetisation.
(16:20):
And with monetisation, you can start by exploring certain verified capabilities, like AI-driven performance optimisation, energy efficiency or multi-agent orchestration; all these capabilities you can start monetising at the edge, and that's the new thing. Now, you will definitely need new architecture there that is much more efficient at low cost, but this is what will also allow us to have new revenue streams.
Tony Poulos, TelecomTV (16:53):
Look, talking about edge applications, Cristina: which edge applications, for example XR, industrial automation, private 5G, smart cities, all the things we keep hearing about, are gaining traction today, and why?
Cristina Rodriguez, Intel (17:07):
Absolutely. Absolutely. And I'm going to pick up and build on what Kyriakos was saying, because this is how progress happens, right? Years ago, we looked at what hyperscalers were doing and we said, "Hey, there's economy of scale in doing these kinds of deployments on a server, general-purpose-processor kind of architecture." So we took that and focused on the core of the network first; now it's practically fully virtualised. Then we said, "Hey, this makes sense for the RAN as well," and we started to deploy on the RAN and make that virtualisation of the RAN happen. But then that very same architecture now comes to the edge. This architecture has the capability, and we're bringing this to the table, to also do AI inference. We're at a moment where the edge is important: we want to process the data where it is generated.
(18:08):
We have the AI capability now to do quite a bit of inference right there at the edge, because we have that AI built into our CPUs. So you have a server that can do AI without any additional component. And now you have this box that you can put anywhere at the edge, in any vertical you mentioned: industrial, retail, ports, you name it, manufacturing floors. Now you have this box that can combine connectivity, network functions, security and wireless. This is what we have done working with Dell: we have a platform that can be delivered on-premises for enterprise. And yes, we're seeing tons of traction here. If you get a chance to visit our booth, we have a couple of edge demos on AI and all kinds of things, again, in all those verticals.
David Trigg, Dell Technologies (19:07):
One of the things I think is critical in this, with the onset of AI, is that the thing that has become really, really important is data, right? And where does data traditionally live, and where is it being generated? At the edge, right? And it's too hard, too difficult and too slow to move the data.
(19:27):
You've got to put the compute, the processing, the power where the data is, and that's going to force us to do this. AI is really going to change the game, and also the demands on the network, because if you think about it, the number of people with mobile phones has been relatively flat, right? That hasn't really changed a lot. But with agentic AI, agents are basically workers who don't take a break and don't sleep. The demand, and the number of users on a network, is going to grow exponentially again. So the demands on the network, where the data lives: this changes everything.
Tony Poulos, TelecomTV (20:05):
Well, you're leading on to the next question. You're ahead of me here, because I was going to say: can you talk about the opportunities where running enterprise workloads at the edge or on-premises can deliver benefits in terms of latency, reliability and data sovereignty? That's going to come into it too.
David Trigg, Dell Technologies (20:22):
Yeah. Yeah. I think data sovereignty has always been a discussion, but given the world environment and the dynamics that we're all living with, it's become critically important. Again, back to my comment about data: where does the data live? How do you protect the data? What's the security? How do you manage some of the complexities of the world that we're living in? You can't do that unless you really think about where that data is going, where it's living, how it's being transported, and how you're using it to better people's lives, to make quicker decisions, to advance businesses. And so it's become critical.
Tony Poulos, TelecomTV (21:01):
I'm going to come to you in a moment, Daniel, don't worry; I'm not leaving you out at all. But Cristina, how do CSPs balance the need for low latency and local processing with the economics of a centralised cloud? And to add to the commentary here, I'm going to ask Dell to roll into this one as well.
Cristina Rodriguez, Intel (21:26):
But that is the beauty of the architecture: having a common platform that goes from the cloud to the core, the RAN and the edge. So it starts there. You want to maximise the capabilities that you have. You need low latency: there are applications that you need to process at the edge because they require low latency, and you can't send the data all the way to the data centre and back. So it's about the architecture of your solution, right? If you need low latency, you do it at the edge, and we have that capability and the software exists to do it. In the cases where you can afford to do more processing at the data centre, you do that. It's that flexibility; having the same common platform and architecture end to end gives you that flexibility.
Tony Poulos, TelecomTV (22:21):
For the critical data. Do you want to add to that, Daniel?
Daniel Borrás, Samsung Networks Europe (22:25):
Yes. Actually, as Kyriakos was mentioning, we have one server per site. This is very important: this is the edge. And those servers, because it's a software-driven architecture, are already ready for AI. So basically we are going to be adding AI features, especially to improve performance, to those servers, and this is going to be possible on the servers that we are currently deploying.
Tony Poulos, TelecomTV (22:48):
Cristina, again, I'm going to ... I'm sorry.
David Trigg, Dell Technologies (22:49):
Sorry, but just one other thing on the flexibility point. Obviously compute, where you put servers, like the stuff we have behind us, is very physical in nature, right? And so it's really challenged us to think about where and how we are deploying these things. We've had to design and ruggedise, so whether you're putting it on a pole, in a manhole, wherever you're putting the compute, you're forced to think about every different aspect: the chipset, how the software runs, the temperature ranges. It really changes the game, because we're used to most compute going into a sterile, raised-floor, temperature-controlled environment, right? And now we're having to deploy it everywhere.
Tony Poulos, TelecomTV (23:29):
Yeah. Well, Cristina, for partners and the ISVs, what's the best way to onboard and scale applications on a telco grade edge platform?
Cristina Rodriguez, Intel (23:39):
Yes. Yeah.
Tony Poulos, TelecomTV (23:40):
It's okay talking about all this, but how do you actually do it?
Cristina Rodriguez, Intel (23:42):
Yes, yes. We're big on ecosystem at Intel, right? We have a very rich ecosystem, and we have been working very closely with Dell on creating this catalogue where all the ISVs can come and port their applications; we help them with that, and then they become part of the catalogue. So every potential customer can go there and see: okay, these are the applications that exist, this is what I need, and choose from a selection. But again, we work very closely with the ecosystem, very closely with the application developers. They know the platform; they have been programming for that platform forever. It's, again, that collaboration.
Tony Poulos, TelecomTV (24:32):
This question's for both Kyriakos and Daniel, but I'm going to ask Daniel first. When we talk about AI in the network, what are the most important use cases today for vRAN and AI-RAN? I know he's going to say a lot on this, but I'm letting you go first.
Daniel Borrás, Samsung Networks Europe (24:44):
That's a good question. Yeah. Actually, there is a lot of progress right now; things are moving extremely fast. But right now we see real value in deployment automation. This is the first thing you do: you deploy the network, so you can deploy it faster and upgrade it faster as well. Also anomaly detection and root-cause analysis: this is very important because you can detect issues earlier and solve them faster. And of course performance optimisation and energy efficiency; that's very important for operators as well.
Tony Poulos, TelecomTV (25:15):
Do you want to add to that, Kyriakos?
Kyriakos Exadaktylos, Vodafone (25:16):
Yes. So with AI, I would say there are three different areas. The first is AI for RAN. This is where we need AI to improve the performance of the network. You go to the radio unit, you introduce AI into all the algorithms you have for beamforming, link adaptation, channel estimation and energy efficiency, and that is what you need to start improving the customer experience and make the network much more efficient, right? That's the first area, and it is top priority at the moment. The second area is AI computing: how we can make the compute that we have now introduced as part of Open RAN deliver efficiency benefits in terms of capacity management. There are very encouraging results at the moment: you can have even a 50 to 100% improvement in capacity thanks to AI computing. Yes. Intel acceleration is an example we are testing together in Malaga, and it's going to be really exciting to see it in the field, right?
(26:19):
Because when you introduce an architecture of the future, at some point, you need to see the real benefit, right?
(26:25):
And the third area is the AI workloads you mentioned. That's where what we call AI on RAN comes in, right? How you can start monetising certain applications at the edge: the industrial applications we mentioned before, applications for automotive, or GPU-as-a-service. For example, you might not need to have your own GPU; you might share the GPU with other operators, expose APIs and monetise. Yeah. So that's the third area. As you can see, AI is transforming the whole architecture evolution. If we go two or three years back, or even five years back, it was Open RAN versus no Open RAN. Now there is no such thing, because with all this evolution of AI and automation, you just say: that's the architecture of the future.
David Trigg, Dell Technologies (27:17):
You're getting into the "why do we have to do this?" question. Yeah. And right now I'm seeing two significant drivers. One is that the speed and acceleration of advancement is unlike anything we've ever seen. I mean, think about AI a year ago versus where it's at now in people's businesses and lives. If telcos are going to keep up with the technology innovation, they have to adopt this, because doing a G every 10 years just doesn't work; it's got to be constant, constant, constant. The other thing is that we're going through a fairly significant business cycle in our industry at the moment. To be able to weather the business cycles, the ups and the downs, and manage all of that, to be able to adjust and be nimble and adapt through it all, the way that networks get built, run, operated and monetised has to change, because otherwise we're not going to be able to keep pace with what's going on.
Tony Poulos, TelecomTV (28:12):
It sounds like it is changing, Kyriakos, and with tangible benefits that you're already seeing from AI in live networks: things like AI-driven self-optimisation, anomaly detection and energy efficiency. How much more benefit are you going to get? You mentioned some of these things already, but AI doesn't just happen; it's got to be developed too. That's another thing. Should we make mistakes before-
Kyriakos Exadaktylos, Vodafone (28:37):
You are right, because you mentioned the right areas. I mean, AI starts with network operations: fault management, incident management, all of this goes straight away into energy efficiency. I always say to my team: look, a 5% improvement in performance might be nothing for the customer, but 5% in energy efficiency is millions for us.
(29:00):
So that's the area of key focus for getting benefits from AI at the moment. But as you say, to create all these agents, you need to transform your internal architecture. You go well above and beyond the standard base station, SMO and automation layer: you need to build a digital layer where you have your digital agents and assistants for the network, with a front end where a single user can say, "Tell me the five most power-consuming sites in your network, why they are consuming, and find ways to improve them." All of this requires, behind it, data quality, data lakes, trusted AI models and exposing all of that to the AI agent layer. That requires huge development, but we are making solid steps there, with partners, of course.
Cristina Rodriguez, Intel (29:59):
I'll tell you, to add to that: you can go to our booth right now. Samsung is demonstrating in our booth AI applied to the radio algorithms and to the infrastructure of the network, and you see double-digit benefits in spectrum efficiency and power management. It's right there, demonstrated, now.
Tony Poulos, TelecomTV (30:22):
And that's millions of dollars we're talking about, right away. Yes. Cristina, at the edge, which AI use cases, such as real-time analytics, computer vision or predictive maintenance, are you seeing move from pilot to production? And could you share some examples with us?
Cristina Rodriguez, Intel (30:39):
Yeah. So if you look at the edge, I just mentioned one along the way. But let's talk about the RAN first, right? From my perspective, there are two categories of use cases that are very real right now. One is the radio algorithms: I was just talking about spectrum efficiency, and Samsung is demonstrating that. In the booth we also have AI applied to link adaptation, showing algorithm benefits, spectral efficiency, channel estimation, all kinds in general. The typical radio algorithms that were already super optimised can now be even better. That's one area. The other area, and you mentioned some of it, is more in the infrastructure of the network: power management. We are demonstrating that; Samsung is demonstrating that. Reduction of power consumption. What AI does is take advantage of the capabilities of the silicon, of the CPU architecture.
(31:45):
For example, we have something called C-states, which allow us to put cores to sleep when they are not being used. But that's not the big deal. The big deal is that we can wake them up very fast. So with AI, you see real benefit: the power consumption drops.
Tony Poulos, TelecomTV (32:05):
And load management is not a big issue.
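The C-states idea lends itself to a small policy sketch: forecast near-term load and park whatever cores the headroom allows, trusting the fast wake-up to absorb mispredictions. The core counts and the 20% margin below are illustrative only; actual parking is carried out by the OS and firmware, not application code:

```python
# Sketch of the decision behind AI-driven power management on a vRAN
# server: keep enough cores for forecast load plus a safety margin, and
# let the rest enter a deep C-state. Numbers are invented for illustration.

import math

def cores_to_park(total_cores, forecast_load, headroom=0.2):
    """forecast_load is the predicted utilisation in [0, 1]."""
    needed = math.ceil(total_cores * forecast_load * (1 + headroom))
    active = min(total_cores, max(1, needed))  # never park every core
    return total_cores - active  # cores eligible for a deep C-state

print(cores_to_park(64, 0.40))  # quiet period -> 33 cores can sleep
print(cores_to_park(64, 0.95))  # busy hour    -> 0 cores can sleep
```

The AI contribution is in making `forecast_load` accurate at fine time scales; the better the forecast, the smaller the headroom can be and the more energy is saved.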
Cristina Rodriguez, Intel (32:07):
Yes. You talk about predictive maintenance; we're seeing that. Automation, you were talking about how important it is. We all want to get to zero-touch automation, and we're on the way there with AI. So that's the RAN. Then if you look at the edge, the more vertical edge, retail for example. We're seeing ports, where there's a lot of video, because now with AI and the capability of chips with so many cores, you can do a lot of video analytics. So you are seeing it used in ports and airports, we see it used in retail, all kinds of use cases.
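Edge video analytics of the sort Cristina mentions starts from primitives like frame differencing, run next to the camera rather than in a data centre. A toy illustration (real deployments run trained detection models on full-size frames; the tiny grayscale grids below are made up):

```python
# Toy sketch of a frame-differencing primitive of the kind that underpins
# edge video analytics (e.g. spotting movement in a port or store camera).
# Frames are tiny made-up grayscale grids, just to show the local-
# processing idea; production systems use trained detection models.

def motion_score(prev_frame, frame, threshold=30):
    """Fraction of pixels whose intensity changed by more than threshold."""
    changed = sum(
        1
        for prev_row, row in zip(prev_frame, frame)
        for a, b in zip(prev_row, row)
        if abs(a - b) > threshold
    )
    total = sum(len(row) for row in frame)
    return changed / total

prev = [[10, 10], [10, 10]]
curr = [[10, 200], [10, 10]]
print(motion_score(prev, curr))  # one of four pixels changed -> 0.25
```

Keeping this loop at the edge is what avoids hauling raw video back to a central cloud, which is the latency and bandwidth argument made throughout the panel.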
Tony Poulos, TelecomTV (32:47):
This is all great, but I'm going to ask Dave the real difficult question. Dave,
Kyriakos Exadaktylos, Vodafone (32:52):
You're lucky.
Tony Poulos, TelecomTV (32:52):
David, how do you address concerns about data privacy, security, and sovereignty, as we touched on earlier, when deploying AI at the edge of the RAN?
David Trigg, Dell Technologies (33:04):
Well, you've got to be very purposeful. And we take the approach, and Cristina touched on this in her previous comments, that you've got to do it at all layers. We think about it as we build and design the infrastructure, everything from the chip to how we build and run a secure supply chain, making sure we think about it from the ground up. Then you get into the software, the deployment, the management; it's got to be built in at every layer of what you do. And the same as you extend into the process and the people, which I keep harping on because that's such a critical component. You can't come in after the fact and say, okay, let's talk about security now. You've got to think about it at every step of the journey, and you've got to address all of those layers.
Tony Poulos, TelecomTV (33:48):
Earlier on, you touched on the effects of the change on the business operation, the management, the people. What new skills and organisational changes will operators need as AI becomes embedded in planning, optimisation, and operations? It's a whole new skillset, isn't it?
David Trigg, Dell Technologies (34:07):
It's a whole new way of operating. You used to have people that would manage a stack; that still exists, but now you're going to have people that manage the infrastructure and people that manage the applications. So it becomes much more horizontal, which, quite frankly, is how the IT side has been operating for a long time. And then you think about how you do testing and integration. How the ecosystem works together is all known and figured out in the IT space, but in this world we're still learning, still figuring out how to operationalise it. So yes, the skillsets are different, how the teams work together is different, their view of the stack is very different. There are significant benefits to doing it; it's not easy, but it's worth going through the journey.
Tony Poulos, TelecomTV (34:53):
Well, on that note, I'd like to thank David, Daniel, Cristina, and we now know the whip behind all of this. Kyriakos, thank you so much for being with us today.
Hi, I'm Tony Poulos from TelecomTV, and I'm at MWC 2026 in Barcelona. Today, I'm going to find out a lot more about AI-ready networks and unlocking new workloads. And to help me through all of this, I have four thought leaders from the industry: none other than Kyriakos Exadaktylos, who is the Head of Network Architecture Specifications and Energy Performance at Vodafone. You have a very long title, so it must be a very important job. That one is. Cristina Rodriguez, who's VP and GM of Network and Edge at Intel Corporation. Cristina, welcome. Great to have you on board. We have Daniel Borrás, who is Head of Marketing and Business Development at Samsung. Welcome. Thank you. And last but certainly not least is David Trigg, he is VP Telecom Sales and GTM at Dell Technologies. Now let's kick off. Kyriakos, since you're the closest, I'm going to ask you the first question.
(01:02):
With the anticipated network advancements of 5G Advanced, how critical is it for CSPs to adopt cloud native architecture to support the evolution, firstly, and when do you believe is the right time to make that transition?
Kyriakos Exadaktylos, Vodafone (01:17):
So very good. Good morning, everyone. So 5G Advanced is here now. We don't have to wait for 6G. It comes to provide us multi-service, multi-purpose networks. And on top of the 5G standalone, we can have new capabilities like network slicing, RedCap, multi-mission critical networks. And for all of that, you need an architecture to be ready and future-proof to support these new capabilities and AI and automation. So there is a paradigm shift from hardware-driven metrics to software-driven evolution and automation. And for all of that, you need openness. You need open interfaces, cloud native architecture, and Open RAN.
Tony Poulos, TelecomTV (02:01):
Well, we're hearing a lot about that at this particular event as well. But Daniel, I'm going to flick over to you now. When we talk about networks designed to be open, as Kyriakos just said, and integrated, disaggregated and programmable, wow. How does this approach impact the way they are built, operated, and then upgraded?
Daniel Borrás, Samsung Networks Europe (02:21):
In practice, it means to move towards an end-to-end software-driven network across the radio access, the core and the transport. And basically what this means is that you don't have monolithic systems. The network is built from modular software components that are connected by open interfaces, and that's run on open hardware platforms. So with this, you are running towards a purely software-driven architecture that is future-proof and open.
Tony Poulos, TelecomTV (02:57):
Now for CSPs, David, it's still early in their journey for some of this. What are the first two or three steps that you recommend to modernise their foundation without disrupting current services? It's the critical part, without disrupting their current services.
David Trigg, Dell Technologies (03:13):
Well, that is the critical part and has been a big challenge because, in fact, to your earlier question, how does this impact them? It impacts them in all ways from everything around how they design, build, run, operate, monetise their network. The technology and the technology integration is really hard. And so you've got to start and start somewhere, but equally as hard are all the people changes, the process changes. And that's also really hard because you've got to continue to operate a legacy traditional while you're also learning how to run and operate the new. So what we've seen within telcos is the biggest problem they have is they're afraid to start. So you've got to start somewhere. And I know sometimes if you start with a single stack or a single component in your network, that still feels like a silo or a different kind of silo, but it starts to give you the foundation, the skills, the processes to be able then to expand and beyond.
(04:09):
But you've got to be very purposeful about how you design and operate that because if you do that incorrectly, it can lock you into just a different style of the stack. So you've got to make sure that that design can transport across your network as you start to put additional workloads or different capabilities. And so I always just say start somewhere. We see most telcos start somewhere in the OSS/BSS as it's a little bit more like a traditional data centre workload that's been going through this transformation forever, but then it extends further into the network, into the RAN. The good news is the telcos that have done this, and there's only a handful in the world, they're already seeing the operational benefits. They say, yeah, it was really painful, really hard at first, but now they're like, nope, it's proven. The operational benefits, it's proven.
(04:54):
So we're starting to see that, which is a great thing.
Tony Poulos, TelecomTV (04:56):
Well, the next question's open to all of you. So jump in. Looking at the ecosystem collaboration between Dell, Intel, Vodafone, and Samsung, how does each party contribute and what types of challenges are best solved together rather than individually? Cristina, do you want to start with that
Cristina Rodriguez, Intel (05:15):
One? Yes. Yes. I love that question because I think what is important at this point is that the technology in every different aspect is ready. If you look at the four of us, we have the Intel have this silicon underpinning the architecture ready for what the job that needs to be done. We have all talked about it, you all talk about it. They need to go to a cloud native, they need to go to a software defined network, open interfaces, and start deploying software that is going to have the whole wireless stack that is going to allow us to move at the speed or to move the innovation at the speed of software. Very important right now, be very flexible, very scalable, and be able to adopt all the new technologies. But then you look at us, how do we make that possible? Okay. We have Intel that have the right technology in the semiconductor, in our CPUs and SoCs underpinning that architecture.
(06:17):
You look at Samsung, they have the software running ... Well, before the Samsung, you look at Dell and they have the servers that exactly put together, built for the network. And then you look at Samsung, it has all the wireless stack running on those servers. And then you look at Kyriakos and Vodafone being the leaders in the industry, deploying for real cloud native, open interfaces, Open RAN with fantastic KPIs.
Kyriakos Exadaktylos, Vodafone (06:51):
Open that
Cristina Rodriguez, Intel (06:51):
Up. Real Open RAN without compromising any KPIs. In fact, you guys have talked about how good that network is. So this is where the ecosystem comes together. This is what we have done working together.
Tony Poulos, TelecomTV (07:08):
I think Cristina answered for all of you there. Yes. Everyone has a part to play.
Kyriakos Exadaktylos, Vodafone (07:13):
Exactly. And when we started with Open RAN, one of the most important things we knew since day one was there is one thing different from the traditional RAN with Open RAN that you have to make it right and you can only make it right with collaboration. And that's the system integration work,
(07:30):
Right? So the software release packaging, because we have Dell, we have Wind River, we have Samsung. So how you do this software release packaging, how you align the software release. So we set as a target this year. We said everybody, all parties have to commit to a single server configuration. So we got the Intel Granite Rapids and we said, we start with 40 core, 64 cores, 72 cores. How we are going to put that into the network that every site has a single server, no more than one single server, because that will allow us to be at low TCO, best energy consumption, and better performance for our customers. Well,
Tony Poulos, TelecomTV (08:10):
I was going to come to you on that question, that exact thing in a moment. How do Open RAN deployments compare to traditional RAN? We just touched on that. And what is Vodafone's experience and what's it revealed about the main barriers, whether technical, commercial, organisational and regulatory that hinder more widespread Open RAN adoption from a performance and cost perspective? It's a big question. But additionally, were there any unexpected findings as you rolled out in this area, either positive or otherwise, that stood out during this journey?
Kyriakos Exadaktylos, Vodafone (08:43):
Yes. So if I take the main pillars like performance, now we are deploying in Germany a full commercial network, real Open RAN in the urban environment. And we've seen that our KPIs now, both 4G and 5G are equal or better than the previous conventional RAN, which is a remarkable thing for the industry. That's not an easy part, but it was not from the arm, but we have now achieved. And what we want, we want to be even better. We don't want to stop there because when we introduce 5G Advanced, you're going to be even better to what you had before, not equal, right? Yep. So on the performance things are moving really, really good. And we will go to the first city now before the summer, okay?
(09:26):
Yes. Second thing you mentioned about cost. Yes. Of course, you have to match the right cost. Then what we did, the global tender, we've awarded strategic suppliers and there was of course economies of scale. You need volumes in order to create the right TCO and the right price. So we achieved a very competitive pricing for Open RAN. And now what it comes in the cost optimisation, we see how we can do things like server configuration, multi-band radio from Samsung. So you can have several bands into one single radio, and that will allow you to be far more efficient. But what will make you even more is the automation part.
Cristina Rodriguez, Intel (10:08):
The
Kyriakos Exadaktylos, Vodafone (10:08):
Thing with SMO, then Open RAN networks is that it will allow you to deploy at scale in a much more operationally efficient way. And this automation thing is the new thing that will allow you to deploy applications for energy saving, performance, and that's the new architecture. We have to take now the benefits from the new architecture.
Tony Poulos, TelecomTV (10:29):
Oh, for Cristina, this time, how is the open ecosystem with vendors, integrators, and operators working together to improve interoperability and simplify integration? I mean, you've got to get together and agree on what you're going to do,
Cristina Rodriguez, Intel (10:43):
Right? And we always have said that. It's the collaboration in the industry and the collaboration between the partners and the ecosystem, super important. It helps us, and this is important, it helps us that we're basing this architecture on an open platform, common platform, based on a general purpose processor, our Xeon products, and it's a platform that everybody knows. We have millions of software engineers out there that understand that platform. We have been in this industry for quite a bit. We have learned a lot. We have been very intentional in our roadmap for a few generations, and we have been again on this for several years. Every generation we have doubled even the performance per watt of our chips, and we double the performance within the same power envelope or lower. And as Kyriakos was saying, now we have come to Granite Rapids. This is our latest generation, CN6 generation, all the way to 72 cores.
(11:54):
But again, we have that platform working with Intel with Dell. We have worked for many years for Dell. We are one of the biggest partners in the world. We collaborate on what is the best design? How can we have the best power consumption? How can we have multiple SKUs that satisfy what Kyriakos was trying to do? Same thing with Samsung. We work very closely together. How do we optimise the software? How do we optimise memory? And this is collaboration. We talk every week. Our teams talk every day. We're working very closely. Understanding Kyriakos's requirements, of course, of what he's trying to do in the network, very important. So the fundamental part is that close collaboration
David Trigg, Dell Technologies (12:42):
Between us. And I think one of the most important things is, and we're very lucky to have industry leading customers like Vodafone that quite frankly is pushing us, right? And because without that, there's sometimes they're like, "You guys aren't working well together." And it kind of forces us, somebody that's willing to go out there and be industry leading because what we have to deliver hasn't changed, right? And we have to deliver high quality service to a customer, to their end customer, but how we're doing it changes ... Everything else changes, right? And so we're very fortunate to have an industry leading customer pushing us and then we're figuring it out and learning and then learning and learning. And it's been a great collaboration.
Tony Poulos, TelecomTV (13:23):
So I wanted to find out who was the driver. Who is the driver? Is it the customer? Is it the end user? You just answered that very well.
Daniel Borrás, Samsung Networks Europe (13:31):
It's very important to have a central coordination, like an orchestra conductor, to make everyone know exactly what they need to deliver, because it's a very complex ecosystem and projects and very fast speed like Germany, where we have to introduce the solutions very quickly. So this is extremely important.
Cristina Rodriguez, Intel (13:47):
And you were saying who's driving it? The industry at some point, I think all of us realised at some point that we needed to do things differently, that a purpose built, handcrafted or specifically done for the network was not going to scale. It wasn't flexible enough. It wasn't at scale. It wasn't future proof. We're talking about ... And even at the moment that we are ... And this wasn't ... We were not talking about AI years ago, but we knew that the point was going to come where we needed something that was more flexible, more open. And now I tell you this, I can't see AI being deployed fully to the full potential without a software defined open network. So the industry realised that. I think all of that got to a point that, okay, we need to make this happen, but I would say Vodafone has been at the forefront of the leadership of the industry.
Tony Poulos, TelecomTV (14:49):
You're not only the leader of the group, but now they're saying that you're maybe the-
Kyriakos Exadaktylos, Vodafone (14:55):
Leaders of the industry. No question. Access to make the noise.
Tony Poulos, TelecomTV (14:58):
Well, look, I'm impressed. But I'm going to ask you now, as networks do evolve, how do you see the shift from connectivity only? Because we always think of networks as just being the connectivity part, but we're moving from a connectivity only to a platform for services playing out, ideally leading to edge applications with the answer, I suppose. Where do you see that going and how critical is the new environment to do that?
Kyriakos Exadaktylos, Vodafone (15:22):
Yes. As you say, things are moving from basic connectivity to value added. So with 5G Advanced now, we can have multi-services, multi-purpose networks that we can deliver new service enablers on top, right? And that requires cloud native network software driven and with network APIs where you can expose certain capabilities. So you have quality of service, latency, ultra reliability, all of these becoming now exposed and consumed from applications. And this is where then you mentioned about edge, edge applications coming. The AI, as Cristina mentioned, it starts now from the vRAN. We need AI into the cell site. Why? Because we need to improve performance. That's the starting point, right? So you need that to put on the base stations, on the radio units to improve channel estimation, link adaptation and all of that. But when you go on the edge, AI will start introducing value for monetisation.
(16:20):
And monetisation, you can start from exploring certain verified capabilities like AI driven performance or optimisation, energy efficiency, multi-agent orchestration or all these capabilities you can start monetising on the edge, and that's the new thing. Now, you might need definitely new architecture there that is much more efficient at the low cost, but this is what will allow us to have also new revenue streams.
Tony Poulos, TelecomTV (16:53):
Look, talking about edge applications, Cristina, which edge applications, for example, XR, industrial automation, private 5G, smart cities, all the things we keep hearing about, are they gaining traction today and why?
Cristina Rodriguez, Intel (17:07):
Absolutely. Absolutely. And I'm going to pick up or continue, build up on what Kyriakos was saying, because this is how progress happens, right? Years ago, we look at what hyperscalers were doing and we said, "Hey, there's economy of scale by doing this kind of deployments on a server general purpose processor kind of architecture." So we took that and we focused on the core of the network first. Now it's practically completely highly virtualised. And then we said, "Hey, this makes sense also for the RAN." And we started to deploy on the RAN and make that happen, that virtualisation of the RAN. But then that very same architecture now comes to edge. Now this architecture has the capability and we're bringing this into the table to also do AI inference. We're in a moment where edge is important, we want to process the data where it is generated.
(18:08):
We have the AI capability now to do quite a bit of inference right there on the edge because we have that built-in AI within our CPUs. So you have a server that can do AI without any additional component. And now you have this box that you can put in anywhere in the edge, anywhere, any vertical that you were saying, industrial, retail, ports, you name it, right? Manufacturing floors. And now you have this box that can combine connectivity, network functions, security, wireless. And this is what we have done working with Dell. We have that platform that can be delivered on-premises for enterprise. And yes, tons of traction we have here, if you get a chance to visit our booth, we have an edge of a couple of demos on AI and all kind of things, again, in all those verticals.
David Trigg, Dell Technologies (19:07):
One of the things I think is critical in this with the onset of AI, the thing that's become really, really, really important is data, right? And where does data live traditionally and where is it being generated? It is at the edge, right? And it's too hard, too difficult, and too slow to move the data.
(19:27):
You've got to put the compute, you've got to put the processing, you've got to put the power where the data is, and it's going to force us to do this. And AI is really going to change the game and also the demands of the network, because if you think about it, the number of people with mobile phones has been relatively flat, right? And so that hasn't really changed a lot. But as Agentic AI and as agents, those are basically workers who, A, don't take a break, don't sleep. The demand and the number of users that are going to be on a network is going to grow exponentially again. And so the demands of the network, where the data lives, this changes everything. Well,
Tony Poulos, TelecomTV (20:05):
You're leading onto the next question. I was actually going to ... There's something ahead of me here because I was going to say, can you talk about the opportunities where the development of running enterprise workloads at the edge or on-premises can deliver benefits in terms of latency, reliability and data sovereignty, that's going to come in too.
David Trigg, Dell Technologies (20:22):
Yeah. Yeah. I think data sovereignty, it's always been a discussion, but given the world environment that we're all living in and the dynamics that we're all living, it's become critically important to making sure, again, back to my comment about data, where does the data live? How do you protect the data? What's the security? How do you manage some of the complexities of the world that we're living in? And you can't do that unless you really think about where is that data going? Where's it living? How is it being transported? And how are you using it to better people's lives, to make quicker decisions, to advance businesses? And so it's become critical.
Tony Poulos, TelecomTV (21:01):
I'm going to come to you in a moment, Daniel. Don't worry. I'm not leaving you out at all, but Cristina, how do CSPs balance the need for low latency and local processing with the economics of a centralised cloud? With Dell, to add to the commentary here as well, I'm going to ask Dell to roll into this one as well. But the question is, how do CSPs balance the need for low latency and local processing with the economics of a centralised cloud?
Cristina Rodriguez, Intel (21:26):
But that is the beauty of the architecture and having a common platform that goes from the cloud, the core, the RAN, the edge. So starts there. You want to maximise, you want to maximise the capabilities that you have. You need low latency. There are applications that you need to process at the edge because they require low latency. You can't send the data all the way to the data centre and back. So it's an architecture of your solution, right? So if you need low latency, you do it at the edge. And we have that capability and the software exists to do that. If there are the cases where you can afford to do more processing at the data centre, then you do that. But it's that flexibility and having the same architecture, let's say common platform and architecture end to end gives you that-
Tony Poulos, TelecomTV (22:21):
Flexibility. The critical data. Do you want to add to that, Daniel?
Daniel Borrás, Samsung Networks Europe (22:25):
Yes. Actually, as Kyriakos was mentioning, so we have one server per site. This is very important. This is the edge. And actually those servers, because it's a software driven architecture, they are already ready for AI. So basically we are going to be adding AI features to improve performance, especially to those servers. And this is going to be possible on the servers that we are currently deploying.
David Trigg, Dell Technologies (22:48):
Cristina,
Tony Poulos, TelecomTV (22:48):
Again, I'm going to ... I'm sorry.
David Trigg, Dell Technologies (22:49):
Sorry, but just one other thing on the flexibility point. Obviously compute computers where you put servers, like the stuff we have behind us, it's a very physical nature, right? And so it's really challenged us to really think about where and how are we deploying these things. So we've had to design, ruggedise. So whether you're putting it on a pole, in a manhole, wherever you're putting the compute, you're really forced to think about every different aspect, including the chipset, how's the software run, the temperature ranges. It really changes the game because we're used to most of the compute going into a sterile, raised floor, temperature controlled environment, right? And now we're having to deploy it everywhere.
Tony Poulos, TelecomTV (23:29):
Yeah. Well, Cristina, for partners and the ISVs, what's the best way to onboard and scale applications on a telco grade edge platform?
Cristina Rodriguez, Intel (23:39):
Yes. Yeah. It's
Tony Poulos, TelecomTV (23:40):
Okay talking about all this, but how do you actually do
Cristina Rodriguez, Intel (23:42):
It? Yes, yes. And Intel is, we're big on ecosystem, right? We have a very rich ecosystem and we have been working very closely with Dell in creating this catalogue where all the ISVs can come, they can port their applications, we help them with that, and then they become part of that catalogue. So every potential customer can go there and see, okay, this is the application that exists, this is what I need, this is what I need and choose from a selection. But again, we work very closely with ecosystem. We work very closely with the application. They know the platform. They have been programming for that platform forever. It's again, that collaboration.
Tony Poulos, TelecomTV (24:32):
This question's for both Kyriakos and Daniel, but I'm going to ask Daniel first. When we talk about AI in the network, what are the most important use cases today for vRAN and AI RAN? I know he's going to say a lot on this, but I'm letting
Daniel Borrás, Samsung Networks Europe (24:44):
You
Tony Poulos, TelecomTV (24:45):
Go-
Daniel Borrás, Samsung Networks Europe (24:45):
That's a good question. Yeah. Actually, I mean, there is a lot of progress right now. Things are moving extremely fast, but we see right now real value in deployment automation. So this is the first thing you do. You deploy the network, so you can deploy it faster and upgrade it faster as well. Also, anomaly detection and root cause analysis. This is very important because you can detect issues earlier and solve them faster. And of course, performance optimisation and energy efficiency. That's very important as well for operators.
Tony Poulos, TelecomTV (25:15):
Do you want to add to that, Kyriakos?
Kyriakos Exadaktylos, Vodafone (25:16):
Yes. So AI, I would say there are three kind of different steps or areas. So the first is AI4RAN. This is where we need AI to improve performance of the network. So you go on the radio unit, you introduce AI, you go for all the algorithms you have for beamforming, link adaptation, channel estimation, and energy efficiency is what you need to start improving the customer experience or make the network efficiency much better, right? So that's the first area and this is top priority at the moment. The second area is AI computing. So how we can make the compute that we have now introduced as part of Open RAN to deliver benefits of efficiency in terms of capacity management. So there are very encouraging results at the moment that can deliver, you can go even have 50 to 100% improvement on capacity thanks to AI computing. Yes. So with Intel Acceleration is an example we are testing together in Malaga and it's something that is going to be really exciting to see it on the field, right?
(26:19):
Because when you introduce an architecture of the future, at some point, you need to see the real benefit, right?
(26:25):
And the third area, you mentioned about AI workloads. That's where it comes AI on RAN, what we call, right? So how you can start monetising on the edge, certain applications for industrial applications that we mentioned before, or applications for automotive and GPU as a service. For example, you might not need to have your own GPU. You might share the GPU with other operators, expose APIs and monetise. Yeah. So that's the third area. So as you can see, AI is transforming the whole architecture evolution. And if we go like two, three years back or even five years back, it was like Open RAN versus no Open RAN. Now you see like there is not such a thing because all this evolution of AI and automation next you say that's the architecture of the future.
David Trigg, Dell Technologies (27:17):
You're getting into the, why do we have to do this? Yeah. And right now I'm seeing two significant drivers. One is the speed and acceleration of advancement is unlike anything we've ever ... I mean, think about AI a year ago versus where it's at now in people's businesses and people's lives. And if telcos are going to be able to keep up with the technology innovation, they have to adopt this because doing a G every 10 years just doesn't work. I mean, it's got to be constant, constant, constant. The other thing is, we're going through a fairly significant business cycle in our industry at the point and be able to weather the business cycles and the ups and the downs and manage all of that. The ability to be able to adjust and be nimble and adapt through all of that, the way that networks get built, run, operated, monetised, it has to change because otherwise we're not going to be able to keep pace with what's going on.
Tony Poulos, TelecomTV (28:12):
It sounds like it is changing Kyriakos. And with tangible benefits, tangible benefits that you're already seeing from AI in live networks, then these things like being AI driven, self-optimisation, anomaly detection and energy efficiency, how much more are you going to get benefit from? You mentioned some of the things already, but AI doesn't just happen. It's got to be developed too. That's another thing. Should we make mistakes before-
Kyriakos Exadaktylos, Vodafone (28:37):
You are right, and you mentioned the right areas. I mean, AI starts from network operations: fault management, incident management, all of this goes straight away, and energy efficiency. I always say to my team, look, a 5% improvement on performance might be nothing for the customer, but 5% on energy efficiency is millions for us.
(29:00):
So that's the area of key focus to get the benefits of AI at the moment. But as you say, to create all these agents that you need, you need to transform your internal architecture. So go well above what is the standard base station, SMO and automation layer: you need to add a digital layer where you have your digital agents and assistants of the network, with a front end where a single user can say, "Tell me the five most consuming sites of your network and why they are consuming, and find ways to improve them." All of this requires, behind it, data quality, data lakes, trusted AI models, and exposing all that to the AI agent layer. That requires huge development, but we are making solid steps there, with partners, of course.
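The kind of question Kyriakos describes, "tell me the five most consuming sites and why", ultimately reduces to an agent running a ranking query over site telemetry in the data lake. A toy sketch of that underlying query is below; the site names, numbers, and fields are invented purely for illustration, not Vodafone data.

```python
from dataclasses import dataclass

@dataclass
class SiteTelemetry:
    """Daily telemetry for one cell site (fields invented for illustration)."""
    site_id: str
    energy_kwh: float   # energy consumed per day
    traffic_gb: float   # traffic carried per day

# Hypothetical rows an AI assistant's data lake might hold.
SITES = [
    SiteTelemetry("site-A", 410.0, 120.0),
    SiteTelemetry("site-B", 390.0, 310.0),
    SiteTelemetry("site-C", 520.0, 95.0),
    SiteTelemetry("site-D", 180.0, 160.0),
    SiteTelemetry("site-E", 450.0, 400.0),
    SiteTelemetry("site-F", 240.0, 60.0),
]

def top_consumers(sites, n=5):
    """The 'five most consuming sites' part of the question."""
    return sorted(sites, key=lambda s: s.energy_kwh, reverse=True)[:n]

def energy_intensity(site):
    """kWh per GB: a crude 'why are they consuming' signal.
    High values flag sites drawing lots of power for little traffic."""
    return site.energy_kwh / site.traffic_gb

for site in top_consumers(SITES):
    print(f"{site.site_id}: {site.energy_kwh} kWh, {energy_intensity(site):.2f} kWh/GB")
```

In a real deployment the agent layer would translate the natural-language question into a query like this against trusted, quality-checked data, which is exactly why the data-lake and model-trust groundwork he mentions has to come first.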
Cristina Rodriguez, Intel (29:59):
To add to that, you can go to our booth right now. Samsung is demonstrating in our booth AI applied to the radio algorithms and to the infrastructure of the network, and you see double-digit benefit in spectrum efficiency and power management. That's demonstrated, right there, right now.
Tony Poulos, TelecomTV (30:22):
And that's millions of dollars we're talking about, right away. Yes. Cristina, at the edge, which AI use cases, such as real-time analytics, computer vision or predictive maintenance, are you seeing move from pilot to production? And could you share some examples for us?
Cristina Rodriguez, Intel (30:39):
Yeah. So if you look at the edge, I just mentioned one, by the way. But let's talk about the RAN first, right? There are two categories, from my perspective, of cases that are very real right now. One is the radio algorithms. I was just talking about spectrum efficiency, and Samsung is demonstrating that. In the booth we also have AI applied to link adaptation, to spectral efficiency, to channel estimation: the typical radio algorithms that were already super optimised, and now they can be even better. So that's one area. The other area, and you mentioned some of it, is more in the infrastructure of the network: power management. We are demonstrating that; Samsung is demonstrating that. Power management, meaning a reduction in power consumption. What AI does is take advantage of the capabilities of the silicon, of the CPU architecture.
(31:45):
For example, we have something called C-states, which allow us to put cores to sleep when they are not being used. But that's not the big deal. The big deal is that we can wake them up very fast. So with AI, you see real benefit: the power consumption drops.
Tony Poulos, TelecomTV (32:05):
And load management is not a big issue.
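Cristina's point about fast wake-up can be made concrete with a back-of-the-envelope model. The wattages below are invented for illustration, not Intel figures; the point is only that deep sleep states pay off when exit latency is low enough that cores need not stay awake "just in case" traffic arrives.

```python
# Per-core power figures are assumed values for illustration only.
ACTIVE_W = 5.0   # core power when busy serving RAN traffic
SLEEP_W = 0.5    # core power in a deep C-state

def avg_core_power(duty_cycle, wakeup_margin=0.0):
    """Average per-core power.

    duty_cycle: fraction of time the core must actually be awake for traffic.
    wakeup_margin: extra awake time spent 'just in case' when C-state
    exit latency is too high to trust sleeping through idle gaps.
    """
    awake = min(1.0, duty_cycle + wakeup_margin)
    return awake * ACTIVE_W + (1.0 - awake) * SLEEP_W

# Fast wake-up: the core sleeps through most of the idle time.
print(round(avg_core_power(0.3), 2))        # 1.85 W

# Slow wake-up forces conservative, mostly-awake operation.
print(round(avg_core_power(0.3, 0.5), 2))   # 4.1 W
```

With an AI model predicting the traffic duty cycle, the scheduler can cut the wake-up margin aggressively, which is where the demonstrated power savings come from in this sketch.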
Cristina Rodriguez, Intel (32:07):
Yes. You talk about predictive maintenance: we're seeing that. Automation, and you were talking about how important that is: we all want to get to zero-touch automation, and we're on the way there with AI. So that's the RAN. And then if you look at the edge, the more vertical edge, take retail. We're seeing ports, where there's a lot of video, because now with AI and the capability of the chips, with so many cores, you can do a lot of video analytics. So you are seeing it used in ports and airports, we see it used in retail, all kinds of use cases.
Tony Poulos, TelecomTV (32:47):
This is all great, but I'm going to ask Dave the really difficult question. Dave,
Kyriakos Exadaktylos, Vodafone (32:52):
You're lucky.
Tony Poulos, TelecomTV (32:52):
David, how do you address concerns about data privacy, security, and sovereignty, as we touched on earlier, when deploying AI at the edge of the RAN?
David Trigg, Dell Technologies (33:04):
Well, you've got to be very purposeful. And we take the approach, and Cristina touched on this in her previous comments, that you've got to do it at all layers. So we think about it as we build and design the infrastructure: everything from the chip to how we build and design a secure supply chain, making sure that we think about it from the ground up. Then you get into the software, the deployment, the management. It's got to be built in at every layer of what you do. And then obviously you extend into the process and the people, which I keep harping on, because that's such a critical component. So you can't come in after the fact and say, okay, well, let's talk about security now. You've got to think about it at every step of the journey, and you've got to address all of those.
Tony Poulos, TelecomTV (33:48):
Earlier on, you touched on the effects of the change in the business operation, the management, the people. What new skills and organisational changes will operators need as AI becomes embedded in planning, optimisation, and operations? It's a whole new skillset, isn't
David Trigg, Dell Technologies (34:07):
It? It's a whole new way of operating. You used to have people that would manage a stack. That still exists, but now you're going to have people that manage the infrastructure and people that manage the applications. So it becomes much more horizontal, which, quite frankly, is how the IT side has been operating for a long time. And then you think about how you do testing and integration. All of that, how the ecosystem works together, is figured out in the IT space, but in this world we're still learning, still figuring it out: how do we operationalise? So yes, the skillsets are different, how the teams work together is different, their view of the stack is very different. There are significant benefits to doing it. It's not easy, but it's worth going through the journey.
Tony Poulos, TelecomTV (34:53):
Well, on that note, I'd like to thank David, Daniel, Cristina, and we now know the whip behind all of this. Kyriakos, thank you so much for being with us today.
Please note that video transcripts are provided for reference only – content may vary from the published video or contain inaccuracies.
Panel Discussion
At MWC26, industry leaders from Vodafone, Intel, Dell, and Samsung discuss the evolution towards AI-ready, cloud-native and Open RAN networks.
Featuring:
- Cristina Rodriguez, VP and GM, Network and Edge, Intel Corporation
- Daniel Borrás, Head of Marketing and Business Strategy, Samsung Networks Europe
- David Trigg, VP Telecom Sales and GTM, Dell Technologies
- Kyriakos Exadaktylos, Head of Network Architecture & Open RAN, Vodafone
Recorded March 2026