Patrick Kelly, Appledore Research (00:20):
Gen AI is mostly a sandbox opportunity, which I think is fine for reskilling service provider teams. I think the biggest challenge today, right now, that could derail AI investments is building a credible business case that has some hard metrics around it. If you look at the infrastructure and chipsets, there are hundreds of billions of dollars being invested there. I think operators need to focus on strong business cases to drive revenue and productivity improvements.
Ronnie Vasishta, NVIDIA (00:56):
I'd say one of the first challenges we're seeing with telcos and AI is just how much experience they have within their organizations for AI, and so we're trying to augment that with our ecosystem as well as our developers. Also, of course, there are legacy systems. When you talk about radio access networks, for instance, you need to move to more software-defined and AI-accelerated compute capabilities. So I would say experience and legacy systems are perhaps the biggest challenges we're seeing.
Alex Choi, AI RAN Alliance & SoftBank (01:39):
One challenge is the availability of training data sets in the telecom space: acquiring and preparing these data sets can be a complex, time-consuming process, particularly when dealing with diverse and distributed network environments. Another significant challenge is the skills gap within the workforce; many operators find it very difficult to bridge this gap. Lastly, the integration of MLOps, AIOps, even foundation model ops and LLMOps, bringing all of these ops systems into the existing telecom IT infrastructure, is going to be extremely challenging. Telco systems are often legacy-heavy and highly customized, making it difficult to seamlessly incorporate AI-driven operations.
Phil Cutrone, HPE (02:38):
The first challenge is really getting planning and strategy aligned to three key areas. The first is cost reduction: what is the next evolution of cost opportunities to continue to monetize the network? The second is feature enhancements: our customers and consumers are going to expect continued improvements in feature sets, and I think each of the operators will have to keep up, so they'll need a plan, maybe that's call trends coding or call translation. And then finally, generating new revenue streams. Each one of those needs a plan. The second challenge, and a fairly big one, is sustainability. AI, if you've been reading, takes a lot of power to run and cool the systems, so I think sustainability and power consumption is going to be the second one. The third is investment. Just like with anything else, operators will have to make some type of capital investment or partner with other companies that jointly invest. Especially when we're thinking about inferencing at the edge, there's capital infrastructure that will need to be deployed, and HPE has been involved in trying to make this a lot easier by creating a very large ecosystem of partners: ISV partners, hardware partners, especially on the acceleration side for GPUs, and even the traditional NS that are doing great work in all of their applications. So we've built this great ecosystem, and we're now even offering predefined, pre-validated systems to get operators up and running quickly.
Please note that video transcripts are provided for reference only – content may vary from the published video or contain inaccuracies.
4 in 4: AI for Telco - Episode 4
During the final episode of our 4 in 4 'AI for Telco' series, executives from SoftBank, Hewlett Packard Enterprise, NVIDIA and Appledore Research discuss some of the challenges telcos might face when deploying AI.
Featuring:
- Alex Jinsung Choi, Principal Fellow, Research Institute of Advanced Technology, SoftBank Corp., Chair of the AI-RAN Alliance
- Patrick Kelly, Founder, Partner, and Principal Analyst, Appledore Research
- Phil Cutrone, SVP & GM, Service Providers, Telco, OEM, Hewlett Packard Enterprise
- Ronnie Vasishta, Senior Vice President, Telecom, NVIDIA
Recorded October 2024