How APIs enable Colt’s network on demand

Ray Le Maistre, TelecomTV (00:04):
We're at FutureNet World in London. I'm with Fahim Sabir, he's director of Network on Demand at Colt. Fahim, great to see you again. Thanks very much for taking the time to talk to us. Now, in general, in the industry, network APIs and the exposure of network capabilities have been a really big topic in the past year or so. Where is Colt in its network API journey? I mean, this is something you've been working on for a while, isn't it?

Fahim Sabir, Colt Technology Services Ltd (00:30):
Yeah, I mean, we've been actively involved in this topic. A number of years ago, we actually started the initiative with AT&T and Orange in MEF, and since then we've been going through the whole exercise of defining the APIs, developing the APIs, and now starting to onboard customers onto those APIs. And it's been a real success. I think the work that's been done there has been really, really important, and it seems like it's now starting to get some traction, with a good number of organizations now starting to implement those APIs.

Ray Le Maistre, TelecomTV (01:09):
Right, yeah. But I'm sure it's not an easy process. You can't just go, 'Oh, I'm going to expose some APIs now.' There must be quite a big run-up to this, quite a lot of moving parts, as it were.

Fahim Sabir, Colt Technology Services Ltd (01:23):
Yeah, absolutely. Besides defining the APIs, it's all of the implementation, all of the mapping. In our model, because we've built it on top of our network on demand platform, we've been trying to enable full end-to-end automation. So theoretically somebody can use the APIs and have a live service within five minutes of starting that interaction with us, which is a great achievement. But I guess the biggest challenge that you have with the whole API conversation is more with the partner. When you go to a partner, you can't just tell them, 'Here's a bunch of APIs, please start using them,' because on their side it's an IT project. It has to be budgeted, it has to be planned, business cases have to be written. So the run-up to actually getting an integration live is actually quite significant.

(02:13):
But what we've seen is that standardization has definitely made a lot of that project easier, because if a partner has done a standards-based API before, it definitely shortens the amount of time. It becomes more about, well, let's just point at it, test it, make sure it works and maybe do a couple of tweaks, rather than a full-blown, six-month, hundred-thousand-euro project, which is obviously a lot more difficult to justify, especially depending on the volumes, because in the past all of the integration has been done based upon volume. So I would integrate with a supplier that I buy 10 circuits from a day, and I wouldn't integrate with a supplier where I buy three circuits a year. But now that balance is starting to shift. So yeah, it's a fascinating area.
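For illustration, here is a minimal sketch of what that kind of standards-based ordering flow could look like from a partner's side, assuming a MEF LSO Sonata-style product-order endpoint. The base URL, payload fields and states below are hypothetical and are not Colt's actual API; the point is simply that a standard shape lets a partner point an existing client at a new supplier and test it, rather than build a bespoke integration.

```python
# Illustrative only: a partner-side client for ordering a connectivity service
# through a standards-based (MEF LSO Sonata-style) API. The URL, paths, payload
# fields and states are assumptions for this sketch, not Colt's real interface.
import time
import requests

BASE_URL = "https://api.example-operator.com/mefApi/sonata"  # hypothetical endpoint


def place_order(session: requests.Session, site_id: str, bandwidth_mbps: int) -> str:
    """Submit a product order and return the order ID."""
    payload = {
        "productOrderItem": [{
            "action": "add",
            "product": {
                "productConfiguration": {"bandwidthMbps": bandwidth_mbps},
                "place": [{"id": site_id}],
            },
        }]
    }
    resp = session.post(f"{BASE_URL}/productOrderingManagement/productOrder",
                        json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]


def wait_until_live(session: requests.Session, order_id: str,
                    poll_seconds: int = 15, timeout_seconds: int = 300) -> bool:
    """Poll the order until it completes: the 'live within five minutes' scenario."""
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        resp = session.get(f"{BASE_URL}/productOrderingManagement/productOrder/{order_id}",
                           timeout=30)
        resp.raise_for_status()
        if resp.json().get("state") == "completed":
            return True
        time.sleep(poll_seconds)
    return False
```

Against a fully automated back end, those same two calls work for any supplier implementing the same standard, which is what shrinks the integration project Fahim describes.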

Ray Le Maistre, TelecomTV (03:00):
Okay, so what is the end result for the customers down the line? I mean, does this just make their life a whole lot easier? It must improve the customer experience in many ways?

Fahim Sabir, Colt Technology Services Ltd (03:14):
Yeah, I mean, that's the aim. Obviously there are two sides of this coin. The operational cost is clearly one of the areas where we need to focus; we're in a market where price erosion is a very, very big thing. But also, from an end customer point of view, there is a massive difference in experience, because there are no manual hands that these things have to go through. So they get an answer quickly: if there's a service that they want to buy, they can buy it straight away, and if there's automation within the partner themselves, it can get delivered in near real time. Yeah, I mean, it's great like that.

Ray Le Maistre, TelecomTV (03:52):
So you've mentioned automation there quite a few times, and that's a real big theme at this event, and this is something Colt's been working on for quite a few years, I know, and you started going down the cloud-native route pretty early as well. Where are you in terms of automating your processes, and to what extent have you been using AI applications to do that?

Fahim Sabir, Colt Technology Services Ltd (04:19):
Right, okay, so a couple of questions there. In terms of automation, as you said, we've been going down the automation journey for a number of years. We have introduced our network-as-a-service platform, Colt On Demand. Our SD-WAN offering is also a co-managed offering, so customers can go online and adjust their SD-WAN themselves. And yeah, it's been an interesting activity, to be quite honest with you. Generally speaking, compared to the market, I think we're a little bit ahead. We've gone through the challenges, and we can see others going through the same challenges: challenges like data quality, and trying to deal with different generations of network infrastructure and the such. I think that basically you have to plan for automation in order to make automation easy for yourself. You can't just take existing processes, existing products, et cetera, and try to automate them, because every bit of work that telcos do has been optimized for manual work.

(05:30):
Yeah, and the other side of it is cloud native, which is an interesting expression in itself. I'm not actually sure what cloud native really means; it means different things to different people. But in terms of actual virtualized capability: towards the end of last year, we launched our on-demand virtual router capability. So you can basically stand up an internet connection with a router, which is a Juniper router in the background running on an NFV stack, and it's fully orchestrated end to end. The aim is now to carry on down that path and introduce more and more functions that customers can implement in the core of the network so that we don't have to deliver appliances. We take away the cost of that, and not just that, but the customer gets it the moment that they want it, and they can resize it, bigger or smaller, without having to worry about space and power and all the other fun stuff that customers typically have to deal with when you deliver hardware appliances.
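As a rough illustration of that 'resize without worrying about space and power' point, the sketch below shows how a consumer of an on-demand platform might request a new size for a virtual router with a single orchestrated call. The endpoint, size flavours and fields are invented for the example and are not Colt's actual interface.

```python
# Illustrative only: resizing an on-demand virtual network function through a
# hypothetical REST API. With a hardware appliance this would mean shipping and
# installing a new box; with a virtualized function it is one orchestrated request.
import requests

BASE_URL = "https://ondemand.example-operator.com/api/v1"  # hypothetical

SIZES = {
    "small":  {"vcpus": 2, "ram_gb": 4},
    "medium": {"vcpus": 4, "ram_gb": 8},
    "large":  {"vcpus": 8, "ram_gb": 16},
}


def resize_virtual_router(session: requests.Session, router_id: str, size: str) -> dict:
    """Ask the orchestrator to redeploy the virtual router at a new size."""
    if size not in SIZES:
        raise ValueError(f"unknown size {size!r}; expected one of {sorted(SIZES)}")
    resp = session.patch(
        f"{BASE_URL}/virtual-routers/{router_id}",
        json={"flavour": size, **SIZES[size]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # the orchestrator's updated service record
```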

(06:38):
So yeah, we're moving in that direction. In terms of our software platform, it's fully microservice based, completely deployed on Kubernetes with CI/CD, and basically all of the appropriate techniques and tools from a software development point of view. I think we've established that pretty well, and it's working wonderfully for us. In terms of AI itself, AI is another one of those topics that's always really interesting. Cutting through the fluff, between the AI washing, if you like, and actual AI, whether that's good old AI or generative AI, I think we're pretty early in that journey. We're trying to feel our way around it. We've done a couple of proofs of concept. We did one proof of concept based around our network on demand platform, whereby we would automatically scale the bandwidth of services based upon previous customer behavior.

(07:38):
So if we realized, let's say for example, that a customer maxed out their bandwidth usage at 5:00 PM every day, we would figure out that they were doing that, we would tell them, and it would actually update the bandwidth: increase it at that time and decrease it at, let's say, eight o'clock, where usage tailed off. So we had a bit of a play with that and it worked pretty well. We've also looked at some predictive fault detection in our network: monitoring certain characteristics of the network, like abnormal power usage in core network devices and the such, to try to predict when something was going to break, so we could fix it before it broke rather than after. So those are the two use cases we've experimented with so far. Neither of them is in production yet.
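A minimal sketch of the idea behind that first proof of concept, assuming hourly utilization samples and a simple threshold rule rather than whatever model Colt actually used: find the recurring busy hours from history, then propose stepping bandwidth up as the peak starts and back down once usage tails off.

```python
# Illustrative sketch of usage-driven bandwidth scheduling, not Colt's actual PoC.
# Input: (hour_of_day, utilization) samples over several days, where utilization
# is the fraction of purchased bandwidth in use. Output: when to step up and down.
from collections import defaultdict
from statistics import mean


def build_schedule(samples, peak_threshold=0.9, quiet_threshold=0.5):
    by_hour = defaultdict(list)
    for hour, utilization in samples:
        by_hour[hour].append(utilization)
    avg = {hour: mean(values) for hour, values in by_hour.items()}

    busy_hours = sorted(h for h, u in avg.items() if u >= peak_threshold)
    if not busy_hours:
        return None  # no recurring peak to act on

    scale_up_at = busy_hours[0]  # e.g. 17 if usage maxes out at 5 pm
    # first quiet hour after the busy period, e.g. 20 when usage tails off
    later_quiet = [h for h, u in avg.items() if h > busy_hours[-1] and u <= quiet_threshold]
    scale_down_at = min(later_quiet) if later_quiet else (busy_hours[-1] + 1) % 24
    return {"scale_up_at": scale_up_at, "scale_down_at": scale_down_at}


# Example: a customer who saturates the line from 17:00 and is quiet again by 20:00.
history = [(h, 0.95 if h in (17, 18, 19) else 0.3) for h in range(24)] * 7
print(build_schedule(history))  # {'scale_up_at': 17, 'scale_down_at': 20}
```

In a real deployment the output would feed the same kind of on-demand bandwidth API discussed earlier, so the change is applied automatically rather than by a human.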

(08:31):
Yeah, otherwise, from an organizational point of view, we're in the process of rolling out generative AI capability to employees across Colt, to help with productivity and the such. And in all of the work that we're doing with regards to systems updates, we have an initiative going on at the moment called Colt IO, which is all around a refresh of our core BSS/OSS platforms, and the AI requirement is absolutely core in those activities: the systems help to manage tickets, manage accounts and provide the right sort of advice to the users that are using them, to help make sure that we're doing what we can to delight our customers. That's what it's all about. We've got an industry-leading NPS score, and our mission is not to rest on our laurels in that respect but to continue to push the envelope. We're relatively early in the journey, though. Lots of interesting stuff to look at.

Ray Le Maistre, TelecomTV (09:37):
Definitely. Now, of course, the other side of the picture here, which you were mentioning when we were chatting before this interview, is that it's not just how Colt is using AI. Your enterprise customers are using AI, and that in itself has an impact on you as the service provider and partner.

Fahim Sabir, Colt Technology Services Ltd (09:59):
Yeah, absolutely. I think over the last couple of years we've gone through a couple of generational shifts. Obviously the first one was the move to cloud computing, which basically changed the whole profile of how the network was being used: everything was then starting to go towards the cloud, and internet connectivity became more important as one of the primary mechanisms for accessing SaaS-type services, or even IaaS- and PaaS-type services as well. So we had that step. We had another step, which was the Covid step, whereby the office, the enterprise building, became less important as a center of users. Customers were being distributed, which kind of tied into the emergence of SSE and technologies like that. And I think AI is the next generational shift. So it's kind of interesting; there are a number of questions that need to be asked.

(10:54):
First of all, who is going to own the GPU farms? I mean, that's the key one. Are we going to see the likes of the hyperscale cloud providers that exist in the market today? Is it going to be them? Is it going to be a certain manufacturer that builds GPUs? Are they going to become a cloud provider themselves? Certainly it's something that could happen. So are we going to see a shift in terms of where the traffic is moving in that respect? The expectation is that sensors, from an IoT point of view, are going to become more important, because now you can actually start to do something useful with the data that they produce, processing that data. So you could end up in a situation whereby you've got different types of data, all the way from occupancy data, which is just a numeric feed that hits once every minute or so.

(11:49):
It's just one byte every minute. Or you could go all the way up to full 4K video being pushed towards the cloud, and all of that data needs to be consumed. Will it get processed locally, on the provider edge, in the provider core, or in the cloud? You will likely have different steps where that data actually gets consumed. And you can also think in terms of generation of content as well. If you look at the gen AI type stuff, generation of text, generation of video, generation of imagery, customers and users are going to want to download that stuff back to their machines, so you have that requirement as well. So I think we're going to see quite a lot of change in terms of where the hotspots of traffic move to, because of the whole way that AI is being used to process data.
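To put rough numbers on that contrast, here is a back-of-the-envelope comparison; the 4K figure is an assumed typical streaming bitrate of around 15 Mbit/s, used purely for illustration.

```python
# Back-of-the-envelope comparison of two IoT/AI traffic extremes.
# The 4K bitrate is an assumed ballpark (~15 Mbit/s); real feeds vary widely.
occupancy_bytes_per_min = 1                        # one byte every minute
video_bits_per_sec = 15_000_000                    # assumed 4K stream bitrate
video_bytes_per_min = video_bits_per_sec * 60 / 8  # bits -> bytes, per minute

ratio = video_bytes_per_min / occupancy_bytes_per_min
print(f"occupancy feed: {occupancy_bytes_per_min} B/min")
print(f"4K video feed:  {video_bytes_per_min / 1e6:.1f} MB/min")
print(f"the video feed carries roughly {ratio:,.0f}x more data per minute")
```

The gap of several orders of magnitude is why the question of whether data gets processed locally, at the provider edge, in the core or in the cloud matters so much for where the traffic hotspots end up.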

Ray Le Maistre, TelecomTV (12:46):
Yes. And have you had any customers ask about GPU-as-a-service or something like that? Because that's starting to crop up now.

Fahim Sabir, Colt Technology Services Ltd (12:58):
Not in any conversations that I've been involved in, but there are always conversations happening at Colt, so I'm pretty sure somebody will have. And then, yeah, I guess the question is: where is that GPU? You can have it on the customer's site; I'm careful not to mention uCPE and the such, because that turned out really well. But is it going to be the next generation of uCPE-type technology, edge-type technology or the such? Yeah, you can definitely see it happening, because you need that sort of power to process that sort of quantity of data.

Ray Le Maistre, TelecomTV (13:37):
Yeah. Well, things are moving pretty fast. Just in the past year, the use cases and business models around this have evolved pretty quickly, so I'd expect by the end of 2024 we'll have a completely new set of services, a lot of new ways of working and a lot of different demands from the enterprises as well. So I look forward to catching up later in the year and finding out what's been happening. Colt, of course, is a much bigger company now as well. There's been a major acquisition, so you've got greater scale and more customers to engage with. That must be pretty exciting in itself.

Fahim Sabir, Colt Technology Services Ltd (14:24):
Yeah, at the moment it is pretty much all hands to the pump trying to move forward with the integration. The integration is at multiple levels. There's the cultural integration at the people level, which is regarded in our organization as the most important part: a lot of acquisitions and mergers fail because you've got two separate cultures coming together and not enough work is done to actually bring them together. But then there's also the technical level: integrating the networks, integrating systems, or actually migrating data into Colt's core systems, so that we have one network, one core set of systems, one team coming together. As you can imagine, there is a lot of energy going into that activity, and I think we're making good progress. I think we're on plan on that.

Ray Le Maistre, TelecomTV (15:14):
Well, I think you've got a management team that's very focused on all of those issues and not leaving anything behind, so that seems to be the way things are working. Fahim, great to talk to you. Thanks so much for joining us.

Fahim Sabir, Colt Technology Services Ltd (15:25):
Lovely to see you.

Ray Le Maistre, TelecomTV (15:26):
And let's speak again later in the year and find out how things have evolved.

Fahim Sabir, Colt Technology Services Ltd (15:29):
Yep, let's do that. Thanks very much.

Please note that video transcripts are provided for reference only – content may vary from the published video or contain inaccuracies.

Fahim Sabir, Director, Network On Demand, Colt Technology Services Ltd

Network API exposure is all the rage right now, but Colt has been working with the likes of AT&T and Orange for years on enabling automated data service provisioning through standard APIs, explains Fahim Sabir, director of network on demand at Colt Technology Services. He also discusses virtualisation, cloud-native processes, how the increasing use of AI in enterprises impacts Colt and the company’s expansion following its recent acquisition of Lumen’s EMEA operations.

Recorded April 2024
