Demand for Network Edge AI hardware to increase as machine learning matures

By Guy Daniels

May 29, 2018

© Flickr/cc-licence/ManyWonderfulArtists


  • Artificial Intelligence to move out of the cloud and on to the edge
  • Machine Learning will be the initial focus of the move
  • Multiple verticals ripe for adoption of AI
  • Hardware needs to scale to make it cost-effective

Here is further evidence that you can't treat each new technology sector in isolation. We already have the migration of cloud technologies to the network edge and telco central offices, and we are likely to see Edge Computing become a key enabler for 5G. Now we are hearing more about the role of Artificial Intelligence at the network edge, as the realisation dawns that we cannot hard-code every process into remote edge devices and still deliver the necessary low-latency services – we need automation and we need AI.

A new report from ABI Research picks up on the role of machine learning inference in the move to take AI out of the cloud and incorporate it into the edge. It calculates that by 2023 the market will see some 1.2 billion shipments of devices capable of on-device AI inference – up from just 79 million in 2017. It identifies power-efficient chipsets as the main driver of edge AI, and advises vendors to start thinking about new supporting business models, such as end-to-end integration or chipset as a service.

Original Press Release:

Hardware Vendors Will Win Big in Meeting the Demand For Edge AI Hardware

London, United Kingdom - 29 May 2018

Artificial Intelligence (AI) will see a significant shift out of the cloud and on to the edge (on-device, gateway, and on-premise server). This will happen initially for inference (applying trained machine learning models) and later for training. This shift means a huge opportunity for those chipset vendors with power-efficient chipsets and other products that can meet the demand for edge AI computing. The edge's share of AI inference processing will grow from just 6% in 2017 to 43% in 2023, announced ABI Research, a market-foresight advisory firm providing strategic guidance on the most compelling transformative technologies.

“The shift to the edge for AI processing will be driven by cheaper edge hardware, mission-critical applications, a lack of reliable and cost-effective connectivity options, and a desire to avoid expensive cloud implementation. Consumer electronics, automotive, and machine vision vendors will play an initial critical role in driving the market for edge AI hardware. Scaling said hardware to a point where it becomes cost effective will enable a greater number of verticals to begin moving processing out of the cloud and on to the edge,” says Jack Vernon, Industry Analyst at ABI Research.

ABI Research has identified 11 verticals ripe for the adoption of AI – automotive, mobile devices, wearables, smart home, robotics, small unmanned aerial vehicles, smart manufacturing, smart retail, smart video, smart building, and oil and gas – split across a further 58 use cases. By 2023 the market will witness 1.2 billion shipments of devices capable of on-device AI inference – up from 79 million in 2017.
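As a back-of-the-envelope check, the growth from 79 million AI-capable device shipments in 2017 to a forecast 1.2 billion in 2023 implies a compound annual growth rate of roughly 57% over the six-year horizon. The shipment figures below come from the press release; the CAGR itself is derived, not quoted by ABI Research:

```python
# Implied CAGR from ABI Research's shipment forecast.
# Input figures are from the press release; the growth rate is derived here.
shipments_2017 = 79e6    # on-device AI inference shipments, 2017
shipments_2023 = 1.2e9   # forecast shipments, 2023
years = 2023 - 2017      # six-year horizon

cagr = (shipments_2023 / shipments_2017) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 57% per year
```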

Cloud providers will still play a pivotal role, particularly when it comes to AI training. Out of the 3 billion AI device shipments that will take place in 2023, over 2.2 billion will rely on cloud service providers for AI training. Even so, this represents a real-terms decline in cloud providers' market share for AI training, which currently stands at around 99% but will fall to 76% by 2023. Hardware providers should not be too concerned about this shift away from the cloud, as AI training is likely to be supported by the same hardware, only at the edge, on either on-premise servers or gateway systems.
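The cloud-training figures quoted above are mutually consistent: 76% of the 3 billion shipments forecast for 2023 works out to about 2.28 billion devices, which matches the "over 2.2 billion" figure. A quick check of the arithmetic (all input figures are from the release):

```python
# Sanity check on the cloud-training share quoted in the release.
total_shipments_2023 = 3.0e9  # all AI device shipments forecast for 2023
cloud_share_2023 = 0.76       # cloud providers' share of AI training by 2023

cloud_trained = total_shipments_2023 * cloud_share_2023
print(f"Cloud-trained devices in 2023: {cloud_trained / 1e9:.2f} billion")
# about 2.28 billion, i.e. "over 2.2 billion" as stated
```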

The power-efficient chipset is the main driver of edge AI. Mobile vendor Huawei is already introducing on-device AI training for battery power management in its P20 Pro handset, in partnership with Cambricon Technologies. Chip vendors NVIDIA, Intel, and Qualcomm are also pushing to deliver the hardware that will enable automotive OEMs to experiment with on-device AI training to support their efforts in autonomous driving. Training at the edge on-device is beginning to gain momentum in terms of R&D, but it could still take some time for it to be a realistic approach in most segments.

“The massive growth in devices using AI is positive for all players in the ecosystem, but critically those enabling AI at the edge are going to see an increase in demand that the industry has so far overlooked. Vendors can no longer go on ignoring the potential of AI at the edge. As the market momentum continues to swing toward ultra-low latency and more robust analytics, end users must start to incorporate edge AI in their roadmaps. They need to start thinking about new business models like end-to-end integration or chipset as a service,” Vernon concludes.

These findings are from ABI Research’s Artificial Intelligence and Machine Learning market data. This report is part of the company’s AI and Machine Learning research service, which includes research, data, and Executive Foresights.


This content extract was originally sourced from an external website (ABI Research Media Releases) and is the copyright of the external website owner. TelecomTV is not responsible for the content of external websites.
