Supporting AI factories and distributed inference

Panel Discussion

GenAI is scaling faster than any previous workload. From 5 megawatt “GPU blocks” to giga-campuses, AI factories are being built everywhere, yet inference must also move closer to end users to meet latency budgets and comply with a patchwork of data-sovereignty laws. Telecom operators already own the world’s most ubiquitous edge real estate, long-haul fibre and resilient power footprints. This discussion explores how telcos, colocation players and equipment vendors can collaborate to create AI-optimised transport fabrics, edge nodes and cooling-constrained points of presence (POPs) that turn connectivity into a new service opportunity.

Featuring:

  • Andy Linham, Principal Strategy Manager, Vodafone Group
  • Beth Cohen, Product Strategy Consultant, Verizon
  • Kerem Arsal, Senior Principal Analyst, Omdia

Recorded October 2025