Supporting AI and the emergence of AI factories

To embed our video on your website copy and paste the code below:

<iframe src="https://www.youtube.com/embed/g-xReZ6PNLw?modestbranding=1&rel=0" width="970" height="546" frameborder="0" scrolling="auto" allowfullscreen></iframe>
Guy Daniels, TelecomTV (00:23):
Hello, you're watching the Next-Gen Telco Infra Summit, part of our year-round DSP Leaders coverage. I'm Guy Daniels, and today's discussion looks at supporting AI and the emergence of AI factories. Well, how can telcos capitalize on the surge in demand for AI by leveraging their infrastructure? And how will the emergence of so-called AI factories, essentially GPU-centric data centers, impact telcos? Well, let's try and find out, and joining me now on the program are Diego Lopez, senior technology expert at Telefonica and an ETSI fellow; Francis Haysom, principal analyst with Appledore Research; and Susan James, one of our DSP Leaders councillors. Hello everyone. Really good to see you all again. Thanks so much for taking part in this discussion, and let me start right away by asking how you all think the rapid rise in AI usage impacts the demands on telecoms networks and infrastructure. What challenges can large-scale AI workloads cause? Francis, let me come across for your views first, if I may.

Francis Haysom, Appledore Research (01:43):
I think the important thing to say is that it's going to have an effect in a lot of different ways. There are very different demands on different parts of the network and different parts of the infrastructure, and I think it's important to make sure that they are clearly separated. I've just come back from a conference where one of the presenters was using one of these new intelligent glasses devices. One of the key things he said, and I think this is the important thing from a consumer perspective and an enterprise perspective, is the impact on the access network. XR glasses, for example, are likely to cause something like a hundred times more upstream versus downstream traffic, which is fundamentally different from what most access networks are set up for. It's very bursty traffic, not continual traffic, and it's quite latency-centric, which is completely different from access networks that are currently being tuned for video streaming.

(02:41):
The second area I think is important is that a lot of the networking discussion at the moment is really about the networking within the data center and the ability to support large language models and deep learning more generally. So again, there are very different ways in which the winners among data centers will be delivering their networking within the data center. From a telco perspective, increasingly you may well see distributed models, and so some aspects of the inter-node communication will need to be supported by the wider networks. There will be a number of issues to do with dealing with sovereign data: where you source data, how you limit data and how you place data, which will make a big impact on the infrastructure and where you need to be locating data. And there's a more general point, which is that AI is fundamentally about data. Lots of it, increasingly coming from a variety of sources all over the place. So that will have a huge impact, both in terms of just coping with the volume of data coming from the edge to the center, or even between parts of the edge, and also in the means by which you collate and compress that data or process it at the edge, which again makes a big difference in terms of network loads. So lots of opportunity for telcos there to make a difference.

Guy Daniels, TelecomTV (04:16):
Great. Thanks very much, Francis. And we'll cover some of the opportunities for telcos in just a moment, but I'd like to go across to Diego. Francis mentioned AI is very much about data, and huge volumes of data here. What challenges might you see these large-scale AI workloads causing?

Diego Lopez, Telefonica & ETSI (04:36):
I think the important thing is to consider that right now what we are witnessing is that the AI models are highly centralized. They're very much dependent on centralized service. I'm not that sure whether the big AI providers are doing it in a single data center or several data centers, but the kind of traffic is quite horizontal across them: the data they're sharing and the data they're accessing is happening somehow behind the scenes. What could be a breakthrough for operations would be if the idea of privacy-aware, distributed, federated, data-centric AI takes off and becomes commonplace, because that would imply a much more diverse set of traffic patterns, and very, very intensive in data. And I agree with Francis that we have been working on optimizing our networks for the kind of steady traffic patterns associated with streaming and general multimedia access. And here we're talking about something that, one way or another, is intelligence, and will somehow mimic how science tells us that our brains work: just big sparkles here and there that appear and disappear. In the case that we go in that direction, and a really distributed AI model consolidates, we have to rethink seriously how we plan for this kind of traffic pattern.

Guy Daniels, TelecomTV (06:37):
Thanks very much, Diego. It's going to be very interesting to see this move from more centralized to decentralized over the months ahead. Susan, let's come across to you. What are your thoughts here?

Susan James, DSP Councillor (06:47):
Yeah, I just wanted to tap into what both Diego and Francis were saying there. We talk about how the networks will change, and that implies we need to invest more in those networks. And I think the really big challenge is: how is it going to be paid for? You might be willing to pay for the glasses, and you want them to perform really well, but are you really willing to pay more for your connectivity when you're increasing demand significantly on that connectivity? I'm not sure that translates into "yes, I'm willing to pay an extra amount for my broadband connectivity or for my mobile data". And I think that is where the big challenge is in this: where does the money come from for financing these networks? The applications for these early use cases that are deployed, that are going to require lots of data, are very early. But I think the challenge will be how that is going to translate into revenues for telcos.

Guy Daniels, TelecomTV (07:48):
Thanks very much, Susan. Well, let me ask a related question then. What can telcos do to capitalize on this demand? Diego, can I come across to you first? How can they leverage their infrastructure, and especially edge architecture, to benefit commercially?

Diego Lopez, Telefonica & ETSI (08:03):
I think that we have to go a little bit beyond the idea that we are here just to move data from one point to another, and start thinking that what we have to do is facilitate the exchange of knowledge: of data that is well structured according to the way in which it is going to be consumed and produced. We have been working on these concepts, talking about data as a product, something that you can package in the appropriate sizes and the appropriate styles to be consumed, and announce from the different sources. And when thinking about a source of data, we always think a source of data is a sensor, or something that is monitoring something. In an environment in which what you have is a distributed AI, a source of data can be another AI, or a combination of AIs; a consumer of data can be an AI, or can even be an actuator that is applying to the real world the decision of an AI.

(09:16):
So there is this combination of producers and consumers of data, of elaborated data, which we have started to refer to as knowledge. That probably is a very big word, but in the end, what we're talking about is that it's not about the raw data; it's about what you know from the data, and how you have shaped or reshaped the data to have more effect. That is something very important, and it implies defining or inventing new services associated with this. So you can describe: I'm a source of data, I'm going to describe which data I'm going to produce, how I'm going to make it available, and under which conditions. I'm a consumer of data, and I will tell what I'm willing to consume and under which conditions, including the economic relationships: how much I'm willing to pay, or how much I'm willing to ask you to pay me for those data and the knowledge associated with them. And then there is everything that has to do with how you validate the sources of data, how you establish freshness and accuracy, and for sure how you facilitate discovery. I think there is a range of services there, taking into account that we're not only talking about the telcos as data sources or data consumers; there will also be verticals on these knowledge infrastructures. The idea of facilitating this, and doing it within the network as a public service of the public network, I think is a very interesting opportunity, and this is something we are starting to make some experiments with that are quite promising.

Guy Daniels, TelecomTV (11:12):
That's good to hear. There are certainly some very interesting new service opportunities coming up here. We'll go to Francis in a moment, but Susan, I want to come back to you and get your views about how you see telcos actually leveraging their infrastructure to capitalize.

Susan James, DSP Councillor (11:26):
So I think the first thing we need to distinguish is that the AI factories are really there for developing the models and then training those models. When they're running, they're actually not likely to be running in those centralized AI factories, because you don't need the same level of compute, the GPUs, the massive training models. So once you get into deployment, then I think it becomes more interesting. Then you're talking about how businesses or consumers are interacting with those models. So I think there is an opportunity there, and I think that comes from working more towards the providers, whether they be the big cloud providers that are actually providing a lot of the tools, or the big data centers that are actually being built to support these models going forward. I think that ecosystem needs to work, because it will be those companies that produce those applications that want to provide their customers with a great service, and that will be delivered over the networks, whether it be over fiber or over mobile networks.

(12:38):
So I think we need to look at that ecosystem play and where those workloads need to be deployed, and I think the telcos have a lot of assets there. Their networks over the years have shrunk enormously in terms of the size of the physical infrastructure they need, so they do have physical assets around that have spare capacity, great connectivity and great power. So there are a lot of assets out there from a physical perspective that could then support these workloads. And when they look at what their business is, I think they need to take a broader look at it and say: okay, I'm not just about providing connectivity to consumers and businesses, maybe I'm also providing infrastructure for the providers of other applications as well. So I think there's a lot more to be thought through in the process of working out what telcos can add here.

Guy Daniels, TelecomTV (13:44):
Yeah, sure. Thank you, Susan. As we keep saying, it's: what is your business? What is it you are providing? Francis, let's come across to you and get your thoughts.

Francis Haysom, Appledore Research (13:53):
I think that's a very important point: what is the business problem we're trying to solve? My worry with this question particularly is that it feels a bit like telco edge 2.0: we now have AI as the killer application for the existing edge infrastructure, which can somehow magically be used. One of the things we identified five years back in our telco edge cloud opportunity paper was that a lot of what enterprises or partners need from a telco edge is not about the technology; it's not even about the infrastructure. It's about softer things like your ability to scale, your ability to be more reliable, your ability to deliver with less support cost to a customer. Because the alternative in all of these areas, whether it's the existing edge or new AI loads, is that the enterprise does it themselves in their own data center, or the load is offloaded to the major hyperscalers or other data centers that are providing specialist capability for AI. So I think the important thing here is that telcos need to start with: what is the commercial problem I'm solving, and what is the commercial model for me solving it that makes people want to buy it from me?

Guy Daniels, TelecomTV (15:22):
Yeah, absolutely. Thank you, Francis. And Francis, if I can stay with you for our next question, because you nicely brought up telco edge 2.0. Given the current situation we're in, is it an opportunity for telcos to rethink not just the edge but the placement of their data centers and metro locations, to more cost-effectively support all the processing and connectivity requirements that we may see more of as society ups its use of AI?

Francis Haysom, Appledore Research (15:52):
Very simply, yes, it is an opportunity for telcos to rethink their models and where they're placing data centers, but I will again come back to the key point here: what is the commercial problem you as the telco are trying to solve, and what are the alternatives for somebody else doing it? Usually there are a lot of other options to deliver the same capability, or to disaggregate it and for others to put the same thing together. So I think there are definitely things architecture can solve there, but only if you start with the commercial model. And Susan made a very key point: who's going to invest in this network, who's going to create that connectivity? To some extent the telcos need to turn it around, which is to say: okay, for me to create this opportunity, I'm going to have to invest, massively invest, in, I don't know, ten times the upstream capability, for example, to deliver it. But I need a business model that supports that, be it from the glasses manufacturers or whoever else is using it. What is the synergy? Where is the money flowing that allows me to invest and likely get a return on my investment, whether in the edge, wherever it is, or in terms of increased connectivity or increased capacity in different directions?

Guy Daniels, TelecomTV (17:23):
Sure. Great. Thanks very much, Francis. And let's come across to Diego. Diego, let's get your views on whether or not this is an opportunity to rethink architectures and placements of data centers.

Diego Lopez, Telefonica & ETSI (17:35):
I think that, well, it's not only about the kind of services or the commercial opportunities. It's also that telco operators are expecting to be heavy users of AI technologies. So it is not only about what we are serving: even for the, let's say, traditional services, it will be a natural way forward. The use of tighter, more effective closed loops for many daily operations would imply a much better operation of the network, which would hopefully allow us to reduce costs and find a better organization of how we deploy and operate our infrastructure. That would translate, for sure, into changes in the architecture, not necessarily because we need to attend to a different demand, though that could be the case, but simply because we will operate the usual mechanisms differently. I mean, I don't believe that just because you have AI, you are going to stop calling your mom during the weekends, or things like that.

(18:56):
So this normal, business-as-usual traffic will also be influenced by the use of AI internally. That is one thing. The other thing is that we can expect the number of end devices that have some smartness inside to grow. And that would imply a more efficient use of the network, including something that right now is a hot topic: how much energy you spend, and how much energy you require to run those services. From our side, that's part of the operations; and from the side of the users, we expect that they will end up having smart devices that will be able to somehow interact with the network and learn the way in which they can do this more efficiently in terms of energy, or even in terms of cost.

Guy Daniels, TelecomTV (19:59):
Yeah, good points Diego, thank you very much indeed. And Susan, what are your thoughts here?

Susan James, DSP Councillor (20:05):
So I think it's important to understand that the AI factories themselves are going to be relatively new data centers. And the reason for that is the GPUs that they depend on. They require GPUs because AI workloads in the development and training phase, which is what we put in AI factories, require about 10 times more compute than your average data center.

(20:32):
When you have 10 times more compute in a GPU, you have much more power density. When you have more power density, you have much more heat. So you need to provide a lot more power, and you need to dissipate a lot more heat. So the requirements on those data centers are much more than existing data centers can typically cope with, and you need to build new data centers to be able to do that. And typically you're going to want to put them in cold places, so that you can take advantage of free cooling, or you can use heat exchange.

(21:05):
You want relatively green energy, you want a surplus of energy, and there are not very many places where you can do that right now, and you want to have great connectivity. So the requirements for these AI factories are going to dictate to a large extent where they're going to sit, and depending on whether you're in Europe or elsewhere, there are going to be certain areas that are better fitted for those types of workloads. But of course, when you're deploying them, they take up a lot of space and a lot of compute, and the demand for data centers is not going to go down in the foreseeable future. So for the data centers that are out there, the prices for that prime real estate are going to go up.

(21:54):
So one of the things I see driving workloads to the edge is not so much a pull to the edge as what I call a push to the edge: if your workload doesn't need to be in those high-value data centers, you are going to push it out further, into a tier-two data center or somewhere else that's not as expensive. We initially thought of those edge data centers as being the highly prized ones, with low latency and all these sorts of things. That may come, but I think the push out of those high-value centralized data centers is actually going to be one of the drivers pushing those workloads out. And then I think it's going to be about the overall efficiency that you get with scale. Those new big data centers have a power usage effectiveness of around one point something: the really efficient ones are at 1.04, your average data center is up around 1.7, and those edge data centers simply don't have the scale. So you're not going to get the same efficiency of power utilization going directly into the IT; you're going to have a higher overhead for cooling and for running all of the ancillary things around the data center. So I think there are other factors that are going to influence where the big workloads are going to go, and they're going to distribute others further out into the network.
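Susan's efficiency figures are power usage effectiveness (PUE) values: total facility energy divided by the energy that actually reaches the IT equipment. A quick sketch (illustrative numbers only, using the 1.04 and 1.7 values quoted above) shows how different the overhead really is:

```python
def overhead_kwh(it_load_kwh: float, pue: float) -> float:
    """Energy spent on cooling and other ancillaries for a given IT load.

    PUE = total facility energy / IT energy, so the non-IT overhead
    is (PUE - 1) times the IT load.
    """
    return it_load_kwh * (pue - 1.0)

# Per 1,000 kWh delivered to the IT equipment:
for pue in (1.04, 1.7):
    print(f"PUE {pue}: {overhead_kwh(1000, pue):.0f} kWh of overhead")
# PUE 1.04: 40 kWh of overhead
# PUE 1.7: 700 kWh of overhead
```

At hyperscale PUE the overhead is a rounding error; at 1.7 it is 70% on top of every unit of useful compute, which is the scale disadvantage Susan describes for edge sites.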

Guy Daniels, TelecomTV (23:21):
Yeah, thanks Susan. It's really interesting to consider this distribution of workloads and the factors involved. And Francis, you'd like to come in on this point as well, I believe.

Francis Haysom, Appledore Research (23:31):
Yeah, I'd like to build on what Susan was saying, and it may sound counter to it. I think we just need to recognize that an AI factory is a data center. It may be a new data center, it may have different computing, different storage needs, different network systems, but a lot of what will make it a successful AI factory is the operational efficiency of it: both the operational efficiency in terms of energy in versus energy consumed, and also the whole aspect of operational efficiency where, as Susan mentioned, scale is a key thing.

(24:14):
One of the things that I think will make a big difference to the success of an AI factory, if we're specifically talking about telcos here, is how you gain that operational scale in your data centers and in your edge data centers. It may not match the scale of the centralized, large-scale systems, but that ability to scale, to be able to operationally automate, will also be a key part of making AI factories a success at the edge, or, more importantly, of who is the successful one at providing those types of factories at the edge.

Guy Daniels, TelecomTV (24:57):
Great. Thank you very much Francis and Diego. We'll come across to you for another comment.

Diego Lopez, Telefonica & ETSI (25:04):
Yeah, well, I would like to insist on the importance of the evolution towards a real, completely distributed AI. There are several drivers for it. One is the need for keeping locality of data. I'm confident, because I guess it's in the interest of us as network providers and of society at large, that people can realize that data is the new oil these days, and that retaining your data, retaining the value of the data that you have, and trading with it, why not, is something important. That implies that, for sure, those AI factories, highly centralized facilities that can run complex models and crunch enormous amounts of data, are important, and they're going to be there. But for them to work in an environment in which people and organizations are becoming more aware of the value of the data they generate and hold, the agility and the operational efficiency that Francis was mentioning will also be connected with the capacity of operating in the network-cloud continuum, in a way that allows local consumption of data, or consumption of aggregated data, in a way that will guarantee the rights and the wills of the data providers as well as of the big models.

Guy Daniels, TelecomTV (27:03):
Yeah, thank you very much, Diego, and thanks to all of you for your comments on that question. And I think there is room here for another question, in the interests of clarity, because there has been a lot of talk about AI factories, and I really want to talk about the impact on telcos here. But can we first of all be absolutely clear about what we mean by AI factories? Can we define what these facilities are, and perhaps how they are different from public cloud computing service providers or platform-as-a-service providers? We've heard some aspects of this from all of you already, but Susan, perhaps I could come to you to try and encapsulate what AI factories actually represent.

Susan James, DSP Councillor (27:48):
I think we touched on it before. When you're training an AI model, or when you're developing an AI application, it's going to require GPUs. These processors obviously have a lot higher capabilities than your typical CPU, and as I said before, they use a lot more power and they need to dissipate a lot more heat. And of course you have huge data that you need to train these models on, so your storage requirements are enormous. And to be able to do that processing fast, your networking needs to be fantastic. So the way you design your data centers is different. You're going to have a lot more compute density, and you're going to have a lot higher demands on your networking and storage. So you're going to need to design a data center that has much more power coming into it, that has a much higher capacity from a networking perspective, and you're going to need to be able to cool it efficiently.

(28:56):
So those requirements, and the tools that you need for developing this, all need to be there. I think this is not something that you're going to put into an existing data center; they're going to be custom built for these types of applications, and you see them being built in various locations. There are a number that have been announced in the US, and a number that have been announced in Europe in recent years. Even the EU has put huge funding in place for building high-compute-capability data centers. So we're going to see these ones that are very targeted at specific, specialized workloads. So we have that as the development phase, and then once you've developed, you start moving the models out into deployment and runtime. And again, then they can run in other places. But these GPUs are hugely expensive. I mean, there's a reason NVIDIA is now one of the most valuable companies in the world.

(29:54):
They are very expensive, and very few companies can afford to actually buy these GPUs or to have access to them all the time. So they're going to be a shared infrastructure as well. I think you will see all the cloud providers being able to provide these capabilities going forward, because that's a logical extension of their business, but it's going to be different from the runtime environment you would typically deploy your typical cloud workload on. So that's how I see these AI factories being built: you're going to use them while you're developing your applications, and once you've trained your models, you're going to deploy them and get off the factories as quickly as possible.

Guy Daniels, TelecomTV (30:31):
That's really helpful. Thanks very much, Susan. Francis, we'll come across to you as well for your comments on AI factories.

Francis Haysom, Appledore Research (30:39):
Yes, a very short comment. You specifically mentioned the PaaS providers and the public cloud providers: from an AI factory point of view, they are already the key drivers in this area. So in some senses these will be very different data centers, but they are still fundamentally data centers, and, raising the point I made earlier, it's all about operational scale in this game: operational scale from the point of view of energy usage, and operational scale from the point of view of operational processes. So it is the PaaS players, it is the public cloud providers, that are doing this.

Guy Daniels, TelecomTV (31:25):
Great. Thanks for that clarification, Francis. That's also really useful to remember. And Susan, let's pop back to you.

Susan James, DSP Councillor (31:32):
There was one other thing I wanted to add there, which is that there is new EU legislation that's come in that starts to talk about heat recapture. So for data centers over a certain power consumption, you now need to start measuring your heat dissipation, and over a period of time you're going to actually need to start doing heat recapture. We talked about the PUE: all of that energy that's going into the data center, you're going to need to start capturing and being able to reuse. And I think that's another reason we're going to see more and more of these data centers being newly built, because trying to retrofit existing data centers to do heat recapture is going to be extremely challenging. So that's one thing I'm really optimistic about from an environmental perspective going forward: as much as we are pouring more and more energy into these data centers, we're going to need to start working much harder on actually recapturing and repurposing that energy, either for other things or back into the data center.

Guy Daniels, TelecomTV (32:33):
Thank you very much, Susan. So we now know what AI factories are; how can they affect telcos? How can telcos benefit from, or take advantage of, the emergence of this new type of data center, especially in terms of server and compute infrastructure? Diego, any thoughts about how we can perhaps use AI factories? We're not building them ourselves necessarily, but how can we adopt them?

Diego Lopez, Telefonica & ETSI (33:05):
Well, when it comes to AI factories, and I mean the big, huge data centers with current GPU capacities and the like, it's true what Francis was mentioning: the companies that are best positioned for that are the current hyperscalers. They have the footprint, they have the experience of operating this kind of infrastructure, et cetera. Frankly, for telcos, entering that race at this stage is going to be complex. What we can try to do, as I said, is play the game of connectivity, of distributed access: the distributed integration of local capacities with these highly centralized ones is important. And latency, I think it was Susan who was mentioning its relevance, because the data flows are very latency-dependent. This is about the time in which you are taking the decision, especially when you're using AI for something that is not an interaction with humans.

(34:17):
We humans can wait for a few seconds in a conversation with a machine; for a person that is more or less acceptable. But if you are putting the AI in control of something that requires immediate attention, that latency becomes very, very important. What I believe is that, just as in the past we witnessed the infancy of many other things, we are now witnessing the infancy of AI applications. They are very impressive: when you look at some requests and the responses from the current models, et cetera, it is very impressive. But spending the huge amounts of energy and resources that we're spending just to simplify the homework of teenagers is something that, personally, I don't think is going to be sustainable, in many different senses.

(35:33):
Not only in terms of energy consumption, which is not very sustainable. And I do believe that we will need this new paradigm in which you can use smaller, distributed processing units, and that will come, among other things, with enhancements in the algorithms and in the physical support for those algorithms. We are witnessing the late infancy, probably, of these technologies, and they have to become more mature, able to run on hardware mechanisms that consume less energy and raw power. This is something that we have witnessed in the past, for example, with microprocessors and with programming languages, et cetera. I'm confident that will be the case, and at the moment that you can start to distribute things and put the capacity closest to the place in which it needs to be applied, then we'll have that opportunity for a hybrid infrastructure. I'm not saying that telcos are going to own everything that is sustaining this coming distributed AI, but for sure connectivity itself, and a new model of connectivity as we were discussing at the beginning, will be essential for this.

Guy Daniels, TelecomTV (37:03):
Yeah, very interesting, Diego. And as you say, we're in the infancy of this technology. Susan, let me come across to you for your thoughts.

Susan James, DSP Councillor (37:10):
So I think when you look at where the revenues are going to come from, a lot of it is going to be coming from businesses. How does AI help you make better decisions? And as Diego pointed out, humans don't need that instantaneously; they don't need those low-latency applications. But when you're deploying factories, when you are making decisions that require precise timing with low latency, that is where I think the opportunity is: the telcos build networks fantastically well. And when you look at the challenges facing most businesses, particularly small to medium businesses, this is not something that they do well. Their number one challenge is network reliability. So where can the telcos play a role? If AI is going to transform the actual factories where things are built or manufactured, it's the networks that have to work perfectly for those AI models or implementations to be able to make a decision and have it take effect instantaneously, so that you can stop the milling or fix the vibration in a milling tip, or something, to make those decisions better.

(38:33):
That's where I think the telcos play an absolutely crucial role, because it is just far too hard for enterprises to A, select the network, B, build a network, and C, manage that network reliably. That is one area where the telcos can play a significant role. And I think instead of selling directly to the enterprises, it is going to be an ecosystem of players working together, whether it be with the applications or with the AI companies, that says: if you need this to be a success and you have a dependency on a network, how do we partner up and do this in a better way, so that you can deliver an end-to-end product and we can provide what we do well to your customers? So I think there is an opportunity there, rather than trying to sell connectivity for connectivity's sake. That is so hard, and we just haven't managed to do it well in the last 20 years.

Guy Daniels, TelecomTV (39:36):
Susan, I think that is a great place for us to end today's discussion, although I'm sure we will continue this debate during our live Q&A show later. For now, though, to all three of you, thank you so much for taking part in this discussion today. And if you are watching this on day two of our Next-Gen Telco Infra Summit, then please send us your questions. I'm sure you've heard some interesting views there and you're anxious to ask questions. Well, we'll try to answer as many of them as we can in our live Q&A show, which starts at 4:00 PM UK time. And the full schedule of programs and speakers can be found, as usual, on the TelecomTV website, which is where you will also find the Q&A form and our poll question. For now, thank you so much for watching, and goodbye.

Please note that video transcripts are provided for reference only – content may vary from the published video or contain inaccuracies.

Panel Discussion

As AI use surges, it presents both challenges and opportunities for telcos. How can they capitalise on this demand by leveraging their infrastructure, particularly edge architecture, to benefit commercially? And how might the emergence of AI factories – essentially GPU-centric datacentres – affect telcos, especially in terms of server and compute infrastructure, and are there synergies and benefits from co-locating 5G workloads?

Recorded November 2024

Diego R. Lopez

Senior Technology Expert, Telefónica, and ETSI Fellow

Francis Haysom

Principal Analyst, Appledore Research

Susan James

DSP Leaders Councillor