BLOG: taking IoT to the edge


  • 'Fog Computing' will answer real capacity and latency problems
  • Why the Fog trend should be the CSP’s friend
  • CSPs can either host edge computing or provide it as part of a service

Fog: it’s a catchy IT name for the process of distributing processing and storage back out towards the edge of the network where, its protagonists argue, it is needed to rebalance the cloud architecture. The current general-purpose cloud IT model, with end devices attached directly to a central data centre, is not universally optimal as applications become more demanding - especially in the Internet of Things (IoT) domain, where latency and sheer data volume are projected to become major issues. Fog computing is part of the answer.

The good news is that the trend should be the CSP’s friend. Fog needs a network edge to play from and that offers access network operators the opportunity to provide compute facilities connected to their communications networks.

That’s the Fog pitch, but how well does it map onto IT and network reality? Is this really just a “Hey, remember me!” strategy for box vendors and CSPs who feel that corporate enthusiasm for pure Cloud is elbowing them out?

No is the answer. There are some real Cloud challenges looming and Fog computing, in one form or another, looks like being part of the solution.

Solutions to round-trip delay and sheer data volume

Metaphorically, we can think of the ‘Cloud’ as high up and thereby able to serve a ‘footprint’ of millions of end devices from horizon to horizon. On this basis Fog describes a thinner layer of resources much closer to the ground (with a narrower purview) but able to serve some applications better by being a short ‘hop’ or two away and therefore more responsive to the end system.

For some applications - especially in IoT - Fog might also find itself performing triage on data straining to get to the cloud, doing some preliminary sifting to analyse and perhaps distil it, throwing out the unnecessary or repetitive.

So the first job for Fog is to solve the round-trip delay problem for some of the data heading from the edge to the middle of the cloud. Until somebody finds a way to beat the speed of light we are stuck with long-distance fibre transmission delay which, the experts say, has to be overcome if the promise of things like driverless cars or remote surgery is to come to fruition. The only way to achieve the necessary latency for these applications is to put the critical data an end device interacts with right at the edge of the network - enter Fog computing.
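As a rough illustration of the figures involved (the distances below are invented for the sketch, not taken from any deployment), light in glass fibre travels at roughly two-thirds of its vacuum speed, which puts a hard floor under round-trip time:

```python
# Light in fibre travels at roughly c / 1.5 (refractive index of glass ~1.5).
SPEED_IN_FIBRE_KM_S = 300_000 / 1.5  # ~200,000 km/s

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round trip, in ms, for a fibre path of the given one-way length.
    Ignores queuing, switching and processing delay, so real latency is higher."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1000

# Illustrative distances: a distant central data centre vs. a nearby fog node.
print(round_trip_ms(1500))  # ~15 ms of unavoidable propagation delay
print(round_trip_ms(30))    # ~0.3 ms - two orders of magnitude better
```

The point of the arithmetic: for a control loop that needs single-digit-millisecond response, no amount of server power in a data centre 1,500 km away can help; only moving the compute closer can.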

Perhaps the biggest long-term challenge to the central ‘Cloud’ as we currently understand it, is IoT. At the moment we’re imagining billions of ‘things’ just popping up at long intervals to send a few bytes of data. But the fact is that not all applications are going to be so undemanding. It’s already possible to see today’s simple, telemetry-style applications being beefed up to return more and more data over time.

Take the humble domestic boiler. It can be rigged up to return stats on its power usage and can even be controlled to maximise efficiency and reduce bills. All well understood as a worthy metering application today. But what if it could return a constant stream of information on the state of the boiler via sensors? That might enable a central system to use big data analysis (feeding in data from all the boilers of the same model) to be able to predict from tell-tale signs (vibration, overheating, lowered pressure) an imminent failure and to replace the boiler before its owner even knows there’s a problem.

But that beefed-up boiler app will generate a vast amount of data - perhaps several state notifications a second - to identify the fatal pattern. Multiply that by thousands of boilers and the data volume could overwhelm both the network and the cloud storage and compute facility. Fog facilities, however, could aggregate a few hundred boilers at a time, distilling the data for each boiler and forwarding only the ‘exceptions’ to the central Cloud, preventing network overload.
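The fog node’s role in that boiler scenario amounts to a simple exception filter. A minimal sketch (field names and thresholds here are illustrative inventions, not from any real product):

```python
# Hypothetical fog-node filter: sift a batch of boiler readings locally and
# forward only the out-of-range 'exceptions' to the central cloud.
# Field names and normal ranges are made up for illustration.
NORMAL_RANGES = {
    "pressure_bar": (1.0, 2.5),
    "temp_c": (40.0, 80.0),
    "vibration_g": (0.0, 0.5),
}

def is_exception(reading: dict) -> bool:
    """A reading is an exception if any monitored field leaves its normal range."""
    for field, (lo, hi) in NORMAL_RANGES.items():
        value = reading.get(field)
        if value is not None and not (lo <= value <= hi):
            return True
    return False

def filter_exceptions(readings: list[dict]) -> list[dict]:
    """Distil a batch of readings down to the exceptions worth forwarding upstream."""
    return [r for r in readings if is_exception(r)]

readings = [
    {"boiler_id": 1, "pressure_bar": 1.8, "temp_c": 60.0, "vibration_g": 0.1},
    {"boiler_id": 2, "pressure_bar": 0.6, "temp_c": 62.0, "vibration_g": 0.1},  # low pressure
    {"boiler_id": 3, "pressure_bar": 1.9, "temp_c": 61.0, "vibration_g": 0.9},  # vibrating
]
print([r["boiler_id"] for r in filter_exceptions(readings)])  # [2, 3]
```

Healthy boilers never leave the local node; the network and central Cloud see only the handful of readings that actually warrant big-data analysis.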

Oscillating gently

It’s not that ‘cloud’ is somehow ‘wrong’ or is going to be replaced. Fog is just one more response to the continually shifting balance of advantage between centralised storage and processing (economies of scale, ability to analyse huge data sets, processing flexibility) and distributed computing and storage (local control, reduced network costs, increased responsiveness).

That oscillation began when the first mainframe computers spawned disruptive minicomputers, decentralising processor power and storage to the departmental level. It’s a process that accelerated again with the advent of the PC. Then it turned around and headed back towards the centre with client-server computing. Now with Cloud we’re close to mainframe-style ‘peak central’ again, so the advent of Fog might be seen as the latest oscillation away from the centre and towards the edge as the underlying technologies and applications change flavour again.

How might CSPs benefit from the latest oscillation?

Many Fog scenarios will involve Customer Premises Equipment (CPE) such as on-premises servers, which might play the primary data collection role for, say, an estate of sensing devices. CSPs are clearly already in the CPE game, and where ‘edge of network’ actually means customer premises they are well placed to play a provisioning role.

But perhaps more importantly, CSPs can use their distributed network facilities to either host edge computing or provide it as part of a service. Options here include old central offices/local exchanges, secure street cabinets or (if a mobile operator) on Radio Access Network (RAN) poles and towers. Any - and probably all - of the above are likely to be pressed into service.

So the good news for CSPs is that though Cloud will certainly deliver businesses and consumers ease of access to always-on applications, compute power and storage along with reduced costs, Edge or Fog compute will be required to deliver the last mile in performance and efficiency.

This blog is the fruit of a discussion between Ian Scales, Managing Editor, TelecomTV and Brent Hodges, Internet of Things (IoT) Planning and Product Strategy and OpenFog Consortium Board Member at Dell.
