Fog Computing


Even though Cloud computing is a great way of processing the data generated by the “things”, it does not meet all of IoT’s needs. One issue that severely affects quality of service (QoS) is network latency: real-time applications suffer from the delay it introduces in the network [1].

For example, when the temperature in a chemical vat is fast approaching a defined threshold, corrective actions must be taken on the fly. In the time it takes for the sensor readings to travel from the edge to the cloud for analysis, the opportunity to avert a disaster can be lost.
However much Cloud computing simplifies everything else, this round-trip delay is a problem for real-time applications. To solve the problems that Cloud computing cannot answer, Cisco Systems introduced a new model – “Fog Computing” [1].
Fog computing, so named because fog is a cloud close to the ground, extends Cloud computing and its services to the edge of the network. Services are hosted at the network edge or even on end devices such as set-top boxes or access points. By doing so, Fog computing reduces service latency and improves QoS, addressing one of the biggest problems of Cloud computing [1].
Fog computing has the advantage of enabling a single, powerful processing device (a Fog node or IoT gateway) to process data received from multiple endpoints and send information exactly where it is needed. It offers lower latency than a solution in which all data is sent to and received from the cloud. Information is transmitted to this device from various sources in the network, processed there, and the results, along with any additional commands, are transmitted back out to the devices concerned [2].
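To make the pattern concrete, here is a minimal sketch, in Python, of a fog node reacting locally to sensor readings, in the spirit of the chemical-vat example above. The threshold value, device identifiers and the `send_command` stand-in are assumptions made purely for illustration.

```python
# Minimal sketch of a fog node reacting locally to sensor readings.
# Names (VAT_TEMP_LIMIT, send_command, the device ids) are illustrative assumptions.

VAT_TEMP_LIMIT = 80.0  # assumed safety threshold, in degrees Celsius


def send_command(device_id: str, command: str) -> None:
    """Stand-in for the local actuator interface (assumed)."""
    print(f"-> {device_id}: {command}")


def handle_reading(device_id: str, temperature: float) -> None:
    # The decision is taken on the fog node itself, so no cloud round trip
    # is needed before corrective action is issued.
    if temperature >= VAT_TEMP_LIMIT:
        send_command(device_id, "OPEN_COOLING_VALVE")


# Example: readings arriving from several endpoints served by the same node.
for device, temp in [("vat-01", 74.2), ("vat-02", 81.5), ("vat-01", 79.9)]:
    handle_reading(device, temp)
```

The point of the sketch is only the placement of the decision: the threshold check runs on the node that receives the readings, not in a remote data centre.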
Fog computing supports mobility, computing resources, communication protocols, interface heterogeneity, cloud integration and distributed data analytics to address the requirements of applications that need low latency with geographical distribution [3].


Fig. 1. Fog Computing

In an overview published by Cisco, we have a brief description of what happens in the Fog nodes and in the Cloud platform:

What happens in the Fog nodes?

  • Receive feeds from IoT devices using any protocol, in real time.
  • Run IoT-enabled applications for real-time control and analytics, with millisecond response time.
  • Provide transient storage, often 1–2 hours.
  • Send periodic data summaries to the cloud.

The cloud platform:

  • Receives and aggregates data summaries from many fog nodes.
  • Performs analysis on the IoT data and data from other sources to gain business insight.
  • Can send new application rules to the fog nodes based on these insights. [4]

Instead of having everything centralised in the cloud, Fog computing deputises nodes to do distributed processing, using their location as a factor.
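As a rough sketch of the fog-node duties in the Cisco overview above – transient storage of roughly 1–2 hours and periodic summaries sent upstream – the snippet below assumes a two-hour retention window, a 15-minute reporting period and a placeholder `send_to_cloud` uplink; none of these names or figures come from Cisco’s material.

```python
# Sketch of fog-node duties: keep readings in transient local storage and
# forward only periodic summaries to the cloud. Retention window, summary
# interval and the uplink function are assumptions for illustration.

import time
from collections import deque
from statistics import mean

RETENTION_SECONDS = 2 * 60 * 60   # transient storage, ~1-2 hours per the overview
SUMMARY_INTERVAL = 15 * 60        # assumed reporting period, in seconds

buffer = deque()                  # (timestamp, value) pairs held locally


def store(value: float, now: float) -> None:
    buffer.append((now, value))
    # Drop anything older than the retention window.
    while buffer and now - buffer[0][0] > RETENTION_SECONDS:
        buffer.popleft()


def summarise(now: float) -> dict:
    recent = [v for t, v in buffer if now - t <= SUMMARY_INTERVAL]
    return {"count": len(recent),
            "mean": mean(recent) if recent else None,
            "max": max(recent) if recent else None}


def send_to_cloud(summary: dict) -> None:
    """Stand-in for the uplink to the cloud platform (assumed)."""
    print("summary ->", summary)


# Example: a few readings arrive, then one periodic summary goes upstream.
now = time.time()
for i, value in enumerate([20.1, 20.4, 21.0]):
    store(value, now + i)
send_to_cloud(summarise(now + 3))
```

Only the small summary dictionary travels to the cloud; the raw readings stay on the node and age out of the buffer, which is what conserves bandwidth in this model.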

Cisco also describes the advantages of using Fog computing:

  •  Minimise latency: Milliseconds matter when you are trying to prevent manufacturing line shutdowns or restore electrical service. Analysing data close to the device that collected the data can make the difference between averting disaster and a cascading system failure.
  •  Conserve network bandwidth: Offshore oil rigs generate 500 GB of data weekly. Commercial jets generate 10 TB for every 30 minutes of flight. It is not practical to transport vast amounts of data from thousands or hundreds of thousands of edge devices to the cloud. Nor is it necessary, because many critical analyses do not require cloud-scale processing and storage.
  •  Address security concerns: IoT data needs to be protected both in transit and at rest. This requires monitoring and automated response across the entire attack continuum, before, during and after.
  •  Operate reliably: IoT data is increasingly used for decisions affecting citizen safety and critical infrastructure. The integrity and availability of the infrastructure and data cannot be in question.
  •  Collect and secure data across a wide geographic area with different environmental conditions: IoT devices can be distributed over hundreds or more square miles. Devices deployed in harsh environments such as roadways, railways, utility field substations and vehicles might need to be rugged. That is not the case for devices in controlled, indoor environments.
  •  Move data to the best place for processing: Which place is best depends partly on how quickly a decision is needed. Extremely time-sensitive decisions should be made closer to the things producing and acting on the data. In contrast, big data analytics on historical data needs the computing and storage resources of the cloud. [4]
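The last point reduces to a placement decision. The toy rule below, a sketch rather than anything prescribed by the overview, routes work to the fog when its deadline is tight and to the cloud otherwise; the 100 ms cut-off is an assumed figure.

```python
# Toy illustration of "move data to the best place for processing": route a
# task to the fog node when its deadline is tight, otherwise to the cloud.
# The 100 ms threshold is an assumption, not a Cisco recommendation.

def choose_processing_tier(deadline_ms: float) -> str:
    return "fog" if deadline_ms <= 100 else "cloud"


print(choose_processing_tier(20))      # real-time control -> "fog"
print(choose_processing_tier(60_000))  # historical analytics -> "cloud"
```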

Fog computing is better suited to some of the demands that the cloud cannot meet. But it cannot totally replace Cloud computing, which remains better suited to the high-end batch processing jobs that are so common in the business world. Instead, the two complement each other, each with its own advantages and disadvantages.

To help spread Fog computing, the OpenFog Consortium was formed in November 2015, based on the premise that an open architecture is essential for the success of a ubiquitous fog computing ecosystem for IoT platforms and applications [5].

References

  1. IoT, from Cloud to Fog Computing, http://blogs.cisco.com/perspectives/iot-from-cloud-to-fog-computing
  2. How does fog computing differ from edge computing?, http://readwrite.com/2016/08/05/fog-computing-different-edge-computing-pl1/
  3. Díaz, M., Martín, C., Rubio, B.: State-of-the-art, challenges, and open issues in the integration of Internet of things and cloud computing, Journal of Network and Computer Applications 67 (2016) 99–117.
  4. Fog Computing and the Internet of Things: Extend the Cloud to Where the Things Are, http://www.cisco.com/c/dam/en_us/solutions/trends/iot/docs/computing-overview.pdf
  5. OpenFog Architecture Overview, www.OpenFogConsortium.org
Ricardo Santos
Integrating the world for over 10 years and enthusiastic about the Internet of Things. I help to spread the word at Polarising about the future that is happening today. Martial artist and History nerd, hoping technology will help us get where we need to go.