SK Telecom preps telco LLM launch in June

Eric Davis, head of AI Tech Collaboration at SK Telecom, presents the Telco LLM model during a press briefing held on 30 April.

  • South Korean telco has been developing large language models (LLMs) specific to telecom operator use cases
  • It has implemented its multi-LLM strategy using its own technology as well as LLM platforms from partners Anthropic and OpenAI
  • SK Telecom has also developed an ‘intelligence platform’ for telco generative AI (genAI) application development

SK Telecom believes it will have its first telco-specific large language model (LLM) ready for action by June, and has developed what it calls an ‘intelligence platform’ that can be used for the development of generative AI (genAI) applications for the telecom sector. 

The South Korean operator has been working on what it calls its “multi-LLM strategy” since last summer to develop generative AI systems that are fine-tuned for specific purposes and can be used by different teams within a telco. The operator’s chief AI global officer, Suk Geun Chung, spoke with TelecomTV at Mobile World Congress (MWC) 2024 in Barcelona about the development and other AI matters. 

The operator, which has been positioning itself as an AI company since 2022, has been training its LLMs with data relevant to the South Korean telecom sector, including terminology, rate plans, contracts, subsidies, offers and AI ethics. For this process it has been using its own LLM, dubbed A.X, as well as OpenAI’s GPT-4 platform and the Claude LLM from Anthropic (in which SK Telecom has invested). 

To train the LLMs, SKT has been collecting telecom data and selecting and cleaning both structured and unstructured data, which is then put through a reinforcement learning from human feedback (RLHF) cycle and evaluated in a final benchmarking process, it noted in an announcement (in Korean). 
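For illustration only, the sketch below outlines the kind of curate, fine-tune, RLHF and benchmark loop described above. The helper names, data fields and model interface are assumptions for the sake of the example, not SKT’s published tooling.

```python
# Hypothetical sketch of the curate -> fine-tune -> RLHF -> benchmark loop.
# Helper names, data fields and the model interface are illustrative only.

RELEVANT_DOMAINS = {"terminology", "rate_plans", "contracts", "subsidies", "care_logs"}

def curate(records):
    """Select telecom-relevant records and drop empty or duplicate text."""
    seen, curated = set(), []
    for rec in records:
        text = (rec.get("text") or "").strip()
        if not text or text in seen or rec.get("domain") not in RELEVANT_DOMAINS:
            continue
        seen.add(text)
        curated.append({"prompt": rec.get("prompt", ""), "response": text})
    return curated

def training_cycle(base_model, records, benchmark):
    """Fine-tune on curated telco data, apply RLHF, then run a final benchmark."""
    data = curate(records)
    model = base_model.fine_tune(data)                  # supervised fine-tuning on telco data
    model = model.rlhf(feedback_source="human_raters")  # reinforcement learning from human feedback
    score = benchmark.evaluate(model)                   # final benchmarking pass
    return model, score
```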

Eric Davis (pictured above), who is head of AI tech collaboration at SK Telecom, noted that the AI needs of telcos can’t truly be met with general-purpose LLMs and that “fine-tuning and model evaluation that adjusts to telecommunication data and domain know-how” is needed. “SKT’s unique multi-LLM strategy is to create various telco LLMs through (benchmarking) and allowing them to be selected and used according to the situation,” he explained. 
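A minimal sketch of how such situation-based model selection could look in code is shown below, assuming hypothetical model names, task types and a generic client interface; SKT has not published its implementation.

```python
# Illustrative sketch of situation-based model selection in a multi-LLM setup.
# The model identifiers, task types and client interface are assumptions.

TELCO_MODELS = {
    "customer_care": "telco-llm-care",   # tuned for consultation and rate-plan queries
    "network_ops":   "telco-llm-infra",  # tuned for infrastructure and monitoring questions
    "general":       "telco-llm-base",   # fallback for everything else
}

def route_request(task_type: str, prompt: str, clients: dict) -> str:
    """Pick the fine-tuned model for the task and forward the prompt to its client."""
    model_name = TELCO_MODELS.get(task_type, TELCO_MODELS["general"])
    return clients[model_name].complete(prompt)   # hypothetical completion call
```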

The operator believes it will achieve significant efficiency gains in its customer care operations. Currently, it takes about three minutes for a customer service team member (counsellor) to handle one consultation call, plus more than 30 seconds of follow-up processing after the consultation. “However, with the introduction of Telco LLM, the LLM will provide solutions to the counsellor while the counsellor is on the phone with the customer. It is expected that it will be able to greatly shorten the time it takes to process the consultation,” noted SKT.  

SKT also believes network operational efficiencies can be gained from using the Telco LLM. “If an infrastructure operator encounters a problem while monitoring the network, he or she can input a question into the Telco LLM in real time and receive a solution in response,” noted the operator. 

And there are broader applications too. Jeong Min-young, who runs SKT's AI platform, noted: “Telco LLM will increase work efficiency in various areas of telecommunication company operations, not only customer centres and infrastructure, but also customer contact points, such as marketing/distribution networks, and in-house work, such as legal affairs and HR.” 

SKT also unveiled an ‘intelligence platform’ that will enable telcos, and other types of companies with “similar business characteristics”, to “efficiently build and develop generative AI applications”. The operator is already using the platform to enhance its A. (A dot) genAI application, which interacts with its customers. 

The company says the platform is an “enterprise AI development and operation package” that offers a broad range of functions, from multi-LLM support and multimodal capabilities (where a model is not limited to a single input or output mode) to orchestration and retrieval-augmented generation (RAG), which enables genAI models to retrieve information from external sources.
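To illustrate the RAG element specifically, here is a minimal, hypothetical sketch of how retrieval can ground a model’s answer in external documents; the retriever and LLM client interfaces are assumptions, not details of SKT’s platform.

```python
# Minimal retrieval-augmented generation (RAG) sketch. The retriever and LLM
# client interfaces are hypothetical; only the overall pattern is illustrated.

def rag_answer(question: str, retriever, llm) -> str:
    """Retrieve relevant documents, then ask the model to answer from that context."""
    docs = retriever.search(question, top_k=3)            # look up external knowledge
    context = "\n\n".join(doc["text"] for doc in docs)    # concatenate retrieved passages
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.complete(prompt)                           # generation grounded in retrieval
```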

- Ray Le Maistre, Editorial Director, TelecomTV
