Red Hat offers cloud smarts to AI-native telcos

Ray Le Maistre, TelecomTV (00:12):
Artificial intelligence is not new, as we all know, but it certainly has a new lease of life, and that's impacting developments and strategies across all industry verticals. In the telecom sector, there's a major push amongst the network operator community to become AI native as soon as possible. Well, to find out more about this trend, I'm talking today with Hanen Garcia, global telco solutions manager at Red Hat. Hanen, good to see you again. Always a pleasure to chat with you. So tell me, how is Red Hat supporting telecom companies in their journey towards becoming AI native, and can you provide some examples of specific use cases?

Hanen Garcia, Red Hat (00:56):
Well, Ray, that's a question I get quite frequently now, when the AI topic comes up in every discussion we're having with customers, and I will give you one word: scale. At Red Hat we're supporting telecommunications companies in becoming what we can call AI native by enabling them to scale the AI and ML strategies they're implementing, and that goes with managing the data efficiently and automating all the processes around AI. It is one thing when you have a small team of AI/ML data scientists creating one service; it is another when you have tens or hundreds of services using AI technologies, especially in telcos, where you have all these different silos across the organization. AI is a very complex technology, and handling all its aspects, from data governance to model training and lifecycle management, is not an easy task.

(02:09):
At Red Hat, what we are trying to do is make that complexity easier for the customers so that they can focus on the actual task, the actual use case and the actual implementation they want to use AI for: to drive new revenue streams with cognitive services, to improve customer satisfaction and reduce churn, to improve operational efficiency and reduce costs. We hear a lot about that and many other things. To give you another example: detecting anomalies on the signalling plane to avoid call drops. We have work going on with a customer where we're trying to see if we can build models to help them predict anomalies that will cause call drops, which in some cases end up in heavy fines for the operators, as they have to respect SLAs.
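As an illustration of the kind of anomaly detection Garcia describes, the sketch below trains an unsupervised model on signalling-plane KPIs and flags unusual windows that might precede call drops. It is a minimal, hypothetical example using scikit-learn's IsolationForest on made-up KPI names (setup_failure_rate, paging_latency_ms, handover_failures) and synthetic data; it is not the operator's or Red Hat's actual implementation.

```python
# Minimal sketch: unsupervised anomaly detection on signalling-plane KPIs.
# Feature names and data are synthetic placeholders, not a real pipeline.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic per-minute KPI windows standing in for real signalling metrics.
kpis = pd.DataFrame({
    "setup_failure_rate": rng.normal(0.02, 0.005, 1440),
    "paging_latency_ms": rng.normal(40, 5, 1440),
    "handover_failures": rng.poisson(3, 1440),
})

# Train on a "known good" day, then score newly observed windows.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(kpis)

new_window = pd.DataFrame(
    [{"setup_failure_rate": 0.09, "paging_latency_ms": 85.0, "handover_failures": 14}]
)
score = model.decision_function(new_window)[0]  # lower = more anomalous
flagged = model.predict(new_window)[0] == -1    # -1 means anomaly

print(f"anomaly score={score:.3f}, flagged={flagged}")
```

In practice, flagged windows would feed an alerting or automated-remediation workflow rather than a print statement.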

Ray Le Maistre, TelecomTV (03:14):
Yeah, no, this is all important stuff and there's so much work being done to advance a lot of these very useful use cases for the operators. Now as AI continues to transform telecom operations, what role does Red Hat's open source approach play in enabling innovation and scalability within the AI native ecosystem?

Hanen Garcia, Red Hat (03:39):
Well, when it comes to AI, we are heavily contributing to open source, as you are probably aware. Together with IBM, we have actually open sourced the Granite family of models to help enterprises build AI solutions on a safe foundation. And this is very important, I want to highlight that, because we're seeing many customers that have started using models out there without knowing exactly what the sources were or how the models were trained, and that increases the risk around what the results could be for specific cases they haven't tested before launching their services, right? So this is where we have put our effort, contributing in the community as well: creating those models that provide that safe foundation, understanding what data the models are trained on, and providing transparency when it comes to using AI models.

(05:02):
And we have also created the InstructLab open source community, where those open source models can be expanded to include knowledge and skills from specific areas of the business. It could be the radio access network or the core network, or a specific business area like roaming services, or simply answering customers' questions about a specific service they want to buy from the operator. And with InstructLab we are also using what we call synthetic data generation to make tuning a model a task that most people could carry out. We gave a great demonstration at Red Hat Summit of how all this technology works, and it's quite impressive how somebody in an organization can use these processes and these models to actually create a service, or expand a service they're already offering to their customers. So the InstructLab open source community is all about giving control and transparency over what data we use to train the models and how the models are trained, and everybody can contribute to that. As well, we have enterprise-grade products like RHEL AI and OpenShift AI that put all that open source innovation in the hands of customers, and together with our ecosystem, of course, we have partners with whom we are designing and building AI solutions specific to the telco industry.
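To make the "expand a service with domain knowledge" idea concrete, here is a minimal sketch of how an application might query a locally served open source model (for example, one tuned for a roaming-services assistant) over an OpenAI-compatible API. The endpoint URL, API key placeholder and model name are assumptions for illustration only, not a documented Red Hat or InstructLab configuration.

```python
# Minimal sketch: querying a locally served, fine-tuned open source model
# over an OpenAI-compatible API. Endpoint URL and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local serving endpoint
    api_key="not-needed-for-local",       # placeholder; local servers often ignore it
)

response = client.chat.completions.create(
    model="granite-7b-roaming-tuned",  # hypothetical tuned model name
    messages=[
        {"role": "system", "content": "You answer questions about roaming services."},
        {"role": "user", "content": "Is data roaming included in my business plan?"},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)
```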

Ray Le Maistre, TelecomTV (06:48):
Now you mentioned earlier on the complexity associated with AI deployments. How is Red Hat helping telecom operators to address the challenges of AI integration such as data management, real-time processing and ensuring compliance in highly regulated environments?

Hanen Garcia, Red Hat (07:08):
Well, that's a very good question, Ray. You mentioned data management, and data is central to AI, right? Everything in AI and ML, every approach, every solution, will be driven by the data. So we have been working with our ecosystem to make sure we have the right solutions to help our customers with data ingestion and data processing itself. Of course, it's not just that, but also the integration with all the systems surrounding the infrastructure, and looking into how we can capture the data and how we can transport it to the systems that will be in charge of actually processing it and training the models with that data. And it's very important: most operators nowadays already have a data management strategy in place, and we are thinking with them about how to incorporate that strategy into their AI strategy as well.
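As a simple illustration of the data-preparation step Garcia refers to, the sketch below cleans a batch of raw network KPI records and writes them out in a columnar format for a downstream training job. The file paths and column names are hypothetical, and a real operator pipeline would typically use streaming and distributed tooling rather than a single pandas script.

```python
# Minimal sketch: batch cleaning of raw network KPI records before training.
# File paths and column names are hypothetical.
import pandas as pd

raw = pd.read_csv("raw_kpis.csv", parse_dates=["timestamp"])

clean = (
    raw.dropna(subset=["cell_id", "timestamp"])        # drop unusable rows
       .drop_duplicates(subset=["cell_id", "timestamp"])
       .assign(drop_rate=lambda df: df["dropped_calls"] / df["total_calls"].clip(lower=1))
)

# Columnar output that a training job elsewhere in the pipeline can pick up.
clean.to_parquet("clean_kpis.parquet", index=False)
print(f"wrote {len(clean)} cleaned rows")
```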

(08:13):
You mentioned another important element, which is processing and real-time processing. So we are working with partners like Nvidia, Intel, Arm and AMD that are creating the technology used to actually process that data, to train the models, to tune the models, and sometimes also to serve the models when we are looking at inference scenarios. This work we're doing is very important, so that all the capabilities of that hardware are available on our platform and the people creating the models, tuning the models and doing the data processing can leverage all those hardware capabilities. And this is similar to what we did before with NFV, when we were looking at data plane acceleration scenarios. For AI it's a little bit newer in one sense, and in another sense it's something we have been doing for a long time as well.

(09:19):
So it's new in the sense that new hardware capabilities are coming out every single day. And on the other side, it's something we have been doing for a long time, because we have all this experience from the work we have done with telcos before. GPU acceleration or DPU acceleration is not new to us; we have been working with vendors on that for a long time, and we are now bringing those capabilities into the AI layer. That has been our task lately. You mentioned as well another topic, compliance and regulation. This is very specific to telecom operators, which are heavily regulated when it comes to data: subscriber data, and not just subscriber data but subscriber usage as well. And this is something we have been working on for a long time.
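For the hardware acceleration point, here is a minimal PyTorch sketch of the general pattern Garcia describes: the platform exposes the accelerator, and the training or inference code simply targets whatever device is available. This is a generic illustration, not Red Hat's tooling.

```python
# Minimal sketch: use a GPU when the platform exposes one, otherwise fall back to CPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)   # toy model moved to the accelerator
batch = torch.randn(32, 128, device=device)   # toy input created on the same device

with torch.no_grad():
    logits = model(batch)

print(f"ran on {device}, output shape {tuple(logits.shape)}")
```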

(10:16):
We are making sure that our platform provides all the security requirements they might have with regard to specific regulations, but also that we can isolate and work well across the different environments. Operators will be using infrastructure on their premises for AI and ML data, but they will also use services from hyperscalers, and with Red Hat, for example with OpenShift and our hybrid cloud approach, this is something we are also bringing into AI: making sure that operators can leverage those capabilities in the different environments for the tasks they want to do and the solutions they want to build, keeping every environment in compliance with internal rules and regulations.

Ray Le Maistre, TelecomTV (11:12):
Now of course, every company is trying to get a piece of the action as the AI market grows. So what unique advantages does Red Hat's hybrid cloud infrastructure offer to telecom companies that are looking to enhance their AI capabilities?

Hanen Garcia, Red Hat (11:30):
Well, you gave away the answer. I think one of the key elements, in fact, is our hybrid cloud approach. I mentioned before that having a platform, or a framework, that allows them to use the capabilities wherever they are is something that is key, and this is something we're bringing with OpenShift, and OpenShift AI in this case. But there is also everything that comes around the models, as I mentioned before. We now provide capabilities in the form of open source models and the InstructLab community, where you have the ability to train and tune a model according to what you need, starting from an open source model where you have control and transparency. With OpenShift AI, you have all the capabilities to manage the lifecycle, from data ingestion and data processing to model training and model serving.

(12:45):
All of those capabilities are there to provide our customers with a trusted foundation, a trusted platform for their AI solutions, and to provide them choice as well, as we work with many of the AI-native partners in the ecosystem; I mentioned a few of them. So we are trying to give telco organizations the scale to bring those AI-based solutions to their customers in a safe and trusted manner. And at the same time, they can leverage everything they have and everything that's available to them to achieve that target as well. So we are bringing a lot of capabilities into the platform, and we're already putting those capabilities in the hands of customers when it comes to creating AI services.

Ray Le Maistre, TelecomTV (13:49):
Okay, great. Well, fantastic insights there, Hanen. Thanks very much for joining us today, and I look forward to chatting with you again soon.

Hanen Garcia, Red Hat (13:59):
Thank you very much for having me Ray.

Please note that video transcripts are provided for reference only – content may vary from the published video or contain inaccuracies.

Hanen Garcia, Global Telco Solutions Manager, Red Hat

Telcos with AI-native aspirations face multiple challenges in making the most of the myriad artificial intelligence opportunities, and Red Hat, with its deep hybrid cloud and open-source experience, is primed to help with data management, compliance, scalability and much more, explains Hanen Garcia, global telco solutions manager at Red Hat.

Recorded September 2024
