SoftBank pitches AI-RAN for robot deployments

  • SoftBank and Yaskawa Electric are collaborating on the social implementation of a new type of AI by exploiting AI-RAN
  • They will carry out integration analysis of different types of data in real time to offer robots capable of multiskilled functionality in multiple roles
  • Data from sensors, cameras and external systems is evaluated through ‘AI-driven decisions’ 
  • Robots will take advanced actions based on situational judgements

Another day, another industry collaboration. This time it’s Japan’s SoftBank Corp. and Yaskawa Electric Corp. agreeing to work together on what they describe as the social implementation of ‘physical AI’ utilising AI-RAN.

The two companies are jointly developing use cases for “office-oriented physical AI robots powered by AI running on MEC [multi-access edge computing]”. They will deploy AI-powered robots to assist in navigating office buildings and other environments where large numbers of people work together. This so-called physical AI will undertake complex tasks in “fluid environments” (but presumably not under water). 

MEC, as a reminder, is an ETSI-defined network architecture designed specifically to enable cloud computing capabilities and an IT service environment at the edge of a communications network.

In essence, the MEC runs applications and related processing tasks closer to the mobile subscriber. The result is that network congestion is reduced and applications perform better. MEC is implemented at cellular base stations or other edge nodes and so permits flexible and rapid deployment of new applications and services. Combining elements of IT and telecoms networking, MEC further allows cellular operators to open their radio access network (RAN) to authorised third parties, such as application developers and content providers.

Initially, SoftBank and Yaskawa Electric have together come up with a use case for an “office-oriented” physical AI robot that integrates with building management systems and utilises AI running on MEC. Conventional robots are built to undertake specific tasks and are incapable of handling multiple tasks simultaneously. However, with AI operating on MEC, it becomes possible to integrate and analyse different types of data in real time, accurately assess what is happening in dynamic situations and thus provide ‘optimal instructions’ to robots. The result? Multiskilled functionality that enables a single robotic unit to take on multiple roles.

The memorandum of understanding (MoU) that SoftBank and Yaskawa Electric have signed commits both parties to partner on “the social implementation of ‘physical AI’ by exploiting SoftBank’s AI-RAN initiative and Yaskawa Electric’s AI robotics”. (SoftBank is, of course, one of the founders of the AI-RAN Alliance.)

In this context, physical AI is a technology that allows robots to analyse and then interpret data from sensors, cameras and external systems through AI, allowing them to perform flexible and complex physical movements based on AI-driven decisions. 

A demonstration of the so-far unnamed robot will take place at this week’s 2025 International Robot Exhibition (iREX 2025) in Tokyo.

Japan’s aging demographic 

With a declining birthrate and an aging population causing labour shortages across Japanese industries, there is a major focus on automation, AI and robotics as, at least, a partial solution to a pressing national problem. What’s more, the solution proposed by the two companies rests on physical AI, essentially a branch of AI that enables machines to perceive, reason and act in the physical world by directly using data from sensors and actuators.

Unlike ordinary digital AI, it bridges the gap between digital and physical environments to allow systems, such as robots, autonomous vehicles and smart devices, to learn, adapt and interact with their surroundings in real time. Together, the two companies will build a domestically developed physical AI infrastructure for deployment within Japan. Time will tell if it will be attractive and have application outside its home country.

Phase 1 of their collaboration has seen SoftBank and Yaskawa Electric jointly developing the above use case for an office-oriented physical AI robot. It goes well beyond conventional automation and digitalisation frameworks by connecting Yaskawa Electric’s high-performance robots with SoftBank’s AI-RAN-based MEC, the AI operating on it, and next-generation building management systems.

It enables robots to perform advanced actions based on situational judgements that take into account conditions within the building, “such as identifying and retrieving a specific smartphone from an office shelf, and to respond flexibly to unforeseen events”. 

To verify the use case, and for demonstration purposes, SoftBank and Yaskawa Electric have constructed a virtual next-generation building management system that works together with AI running on MEC (“MEC AI”), which generates task instructions for robots, and the AI that controls them. MEC AI centrally manages information, such as building facilities, office supply inventories and the operational status of robots, and instructs the robots which tasks to undertake by referring to data from the building management system and other sources, including sensor and camera information. The robot AI co-ordinates with MEC AI to generate specific robotic actions and follow-up.

SoftBank and Yaskawa Electric have different and separate responsibilities within the framework of the MoU. For example, SoftBank provides the MEC environment and has developed its ‘vision-language model’ (VLM), an AI that functions as MEC AI and generates tasks based on sensor data and external information. Meanwhile, Yaskawa Electric provides the robot and has developed its ‘vision-language action’ (VLA), an AI that functions as robot AI and generates robot actions based on instructions from the VLM.
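The division of labour described above can be illustrated with a minimal sketch: a MEC-side component (standing in for SoftBank’s VLM) turns centrally managed building data into a task instruction, and a robot-side component (standing in for Yaskawa Electric’s VLA) expands that instruction into concrete actions. All class and method names here are hypothetical, invented purely for illustration; the companies have not published their interfaces.

```python
from dataclasses import dataclass

@dataclass
class BuildingState:
    # Centrally managed data: supply inventory by location, camera events
    inventory: dict
    camera_events: list

class MecVlm:
    """MEC AI (hypothetical): generates a task from building data."""
    def next_task(self, state: BuildingState) -> dict:
        # A trivial rule stands in for a vision-language model's judgement
        if "smartphone" in state.inventory:
            return {"task": "retrieve", "item": "smartphone",
                    "location": state.inventory["smartphone"]}
        return {"task": "idle"}

class RobotVla:
    """Robot AI (hypothetical): expands a task into low-level actions."""
    def plan(self, task: dict) -> list:
        if task["task"] == "retrieve":
            return [("navigate", task["location"]),
                    ("grasp", task["item"]),
                    ("deliver", "requester")]
        return []

state = BuildingState(inventory={"smartphone": "shelf-3"}, camera_events=[])
task = MecVlm().next_task(state)          # generated on the MEC
actions = RobotVla().plan(task)           # expanded on the robot
print(actions)
```

The point of the split, as the article describes it, is that the heavyweight judgement (the VLM) runs on shared edge infrastructure, while the robot carries only the action-generation model, so a single robotic unit can be retasked for multiple roles.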

– Martyn Warwick, Editor in Chief, TelecomTV
