What is fog computing, why is it so important for the development of IoT applications, and what are its prospects in relation to 5G networks? Fog Computing, or Fogging, is a distributed architecture designed to ensure the availability of network resources, storage, and computational capacity throughout the entire infrastructure connecting the IoT to the Cloud, optimizing the performance of next-generation applications and strengthening and broadening the benefits of 5G. But what exactly is fog computing, and how does it differ from edge computing and the cloud?
What Is Fog Computing
Fog Computing is a fairly recent technology, conceived as an extension of the Cloud towards the periphery of the corporate network and implemented through virtualized platforms. The Fog infrastructure is complex: within it, wireless and mobile technologies coexist with broadband fiber connections to form an intermediate level of abstraction between IoT devices and the cloud.
With Fogging it is possible to reduce the flow of data crossing IP networks and to run applications locally, feeding them continuously with the streams generated by connected sensors and objects. In practice, instead of uploading all the acquired data to the Cloud and transferring it to a data center for processing, the most relevant records can be processed in a network node near the point where they are produced, significantly improving the responsiveness of analytical applications that would otherwise depend on distant data centers.
Fog nodes are mini data centers equipped with all the necessary computing and storage resources, built from devices such as access points, routers, gateways, and servers managed by users. The same nodes can also send aggregate information to the Cloud, usually at set intervals, and receive alarms and triggers in return to meet specific operational requirements.
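The behavior described above can be sketched in a few lines of Python. This is a minimal illustration, not a real fog framework: the class name `FogNode`, the relevance threshold, and the upload interval are all assumptions made for the example.

```python
import time
from statistics import mean

THRESHOLD = 75.0        # illustrative: only readings above this are "relevant"
UPLOAD_INTERVAL = 60.0  # illustrative: seconds between aggregate uploads

class FogNode:
    """Hypothetical sketch of a fog node: filter locally, upload aggregates."""

    def __init__(self):
        self.buffer = []                      # relevant readings kept locally
        self.last_upload = time.monotonic()

    def ingest(self, reading):
        """Process a sensor reading near where it was produced."""
        if reading > THRESHOLD:               # filter locally, drop the rest
            self.buffer.append(reading)

    def maybe_upload(self, now=None):
        """At set intervals, send only an aggregate summary to the Cloud."""
        now = time.monotonic() if now is None else now
        if now - self.last_upload >= UPLOAD_INTERVAL and self.buffer:
            summary = {"count": len(self.buffer), "mean": mean(self.buffer)}
            self.buffer.clear()
            self.last_upload = now
            return summary                    # in practice: POST to a cloud API
        return None
```

The key design point is that raw readings never cross the IP network: only the periodic summary does, which is exactly how fogging reduces upstream traffic.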
What Is Cloud Computing
Cloud Computing is an architecture for centralized data processing that relies on a central data center, in which memory and computing power are pooled and shared by the different backend servers.
This architecture is very common in corporate IT environments because it allows storage resources and computing power to be managed economically and effectively, but it shows its limitations in application contexts that require real-time data processing. The main problems relate to latency: centralized data processing leads to delays due to long transmission paths and traffic congestion on IP networks.
What Is Edge Computing
In Edge Computing, data processing is not only distributed and decentralized but takes place directly on the devices that generate the data, at the edge of the corporate network. Each IoT device or machine is equipped with a microcontroller, a compact computer contained within a single chip, which allows it to process the main data and communicate with other sensors and smart devices.
This makes it possible to run applications in real time even in contexts where fixed or mobile network coverage is limited, overcoming latency problems. With Multi-access Edge Computing, you can then further maximize the performance of next-generation applications based on positioning, real-time data analysis, or advanced graphics.
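A toy sketch can make the edge idea concrete: the device keeps a tiny rolling window of samples in its own memory and decides locally, with no network round trip at all. The class name `EdgeSensor`, the window size, and the temperature limit are illustrative assumptions, not part of any real device firmware.

```python
from collections import deque

WINDOW = 5      # illustrative: samples kept in the device's small memory
LIMIT = 90.0    # illustrative: degrees Celsius above which the device reacts

class EdgeSensor:
    """Hypothetical on-device (edge) logic: sense, smooth, and act locally."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)   # tiny footprint, MCU-friendly

    def read(self, value):
        """Ingest one raw sample and return a locally computed command."""
        self.samples.append(value)
        smoothed = sum(self.samples) / len(self.samples)  # de-noise on device
        return "SHUTDOWN" if smoothed >= LIMIT else "OK"
```

Because the whole decision loop runs on the chip, it keeps working even when connectivity drops, which is the property the paragraph above describes.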
Before the advent of Mobile Edge Computing, the data of mobile and IoT applications had to travel from the device (a smartphone or a sensor) across the mobile network to the edge devices, and then again over the IP network, to reach the application servers hosted in the company data center.
By placing processing and storage services directly at the edge of the 4G and 5G mobile network, on the other hand, it is possible to support instant decision-making processes more effectively and to offer optimized user experiences and contextualized georeferencing, marketing, and video-analytics services.
Fog, Edge, And Cloud: What Are The Main Differences?
The Cloud ensures flexible and virtually infinite processing and storage capacity and is the ideal choice for Big Data Analytics, the analysis of large amounts of data. Edge Computing and Fog Computing, on the other hand, are the solutions of choice for applications that require latency or response times close to zero.
Two-way communication with a server hosted in the Cloud can take several minutes, while interaction with Fog nodes is resolved in fractions of a second. Cloud Computing requires round-the-clock access to the Internet, while Edge and Fog can also be implemented in environments where connectivity to the IP network is intermittent.
This is why Fog Computing and Edge Computing do not seem destined to supplant the Cloud, but rather to complement it, ensuring lower latency for critical applications.
Fog Computing And 5G
5G use cases and applications combine with some typical features of the Cloud, such as resource scalability and pervasive connectivity. However, the deployment of IoT applications introduces new requirements that current Cloud environments cannot adequately support, such as low latency, support for mobile communications, and access to precise information on the location of users and objects.
To meet these needs, an extended cloud concept has been introduced, which moves processing capacity as close as possible to the data source. 5G promises latency times of less than a millisecond, but with the Cloud alone the benefits of the new mobile communication standard may not be fully exploited.
The evolution of the Cloud towards Fog Computing and Edge Computing coincides precisely with the movement of logical and physical resources towards the periphery of the network, up to the individual endpoints.
The integration of Edge and Fog systems into the 5G infrastructure is now essential to further reduce the latency of this mobile communication technology. With Fog Computing, this problem is overcome and 5G becomes usable in a wide range of IoT applications, from the smart home to smart health, from smart security to intelligent agriculture.
Fog Computing And IoT
Fog Computing is the infrastructure of choice for many innovative applications that require locally distributed data processing: IoT, Artificial Intelligence and Machine Learning, virtual reality, and georeferencing. In short, applications that require low latency, wide transmission bandwidth, position sensitivity, and always-on connectivity.
Take smart manufacturing and Industrial IoT applications, for example. In the smart factory, connected production is based on a "zero defects" logic, which requires the ability to intervene on the production line in real time in the face of any discrepancy or problem, for example by activating the automatic shutdown commands of machinery and systems.
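The "zero defects" check described above can be sketched as a simple in-tolerance test run on a fog node beside the line. The nominal dimension, the tolerance, and the command strings are assumptions made purely for illustration.

```python
NOMINAL_MM = 25.00    # illustrative: nominal spec for the measured part
TOLERANCE_MM = 0.05   # illustrative: allowed deviation before intervening

def check_part(measured_mm):
    """Hypothetical zero-defects check: any discrepancy stops the line locally."""
    if abs(measured_mm - NOMINAL_MM) > TOLERANCE_MM:
        return "STOP_LINE"   # intervene in real time, within the same cycle
    return "PASS"
```

Because the comparison happens next to the machinery rather than in a remote data center, the shutdown command can be issued within the same control cycle as the measurement.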
Other applications that benefit from distributed data processing are smart grids, the new sustainable and intelligent electricity grids that store unused energy and dynamically redistribute it when it is needed by entire neighborhoods and cities. In the self-driving cars of the Internet of Vehicles, traffic, direction, and speed data are processed directly on the vehicle, in real time.
The support of Fog Computing infrastructures is therefore essential to ensure that the automatic braking system, for example, can be activated within milliseconds. The same goes for digital medicine applications, with the opportunity to perform surgery remotely, and for Smart Security, with smart cameras deployed across cities, production sites, and retail outlets that require local processing power to analyze the acquired data in real time and, if necessary, generate alarms.
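The braking example can be made concrete with a small on-board decision sketch. The reaction time, the assumed braking deceleration, and the simple stopping-distance formula (reaction distance plus v²/2a) are textbook-style assumptions for illustration, not the logic of any real vehicle.

```python
REACTION_S = 0.2    # illustrative: system reaction time before brakes bite, s
DECEL_MS2 = 8.0     # illustrative: assumed braking deceleration, m/s²

def should_brake(speed_ms, obstacle_m):
    """Hypothetical on-vehicle check: decide locally, within milliseconds."""
    # Stopping distance = distance covered during reaction + braking distance
    stopping_m = speed_ms * REACTION_S + speed_ms ** 2 / (2 * DECEL_MS2)
    return obstacle_m <= stopping_m
```

At 20 m/s (72 km/h) the sketch yields a 29 m stopping distance, so an obstacle at 25 m triggers braking while one at 40 m does not; the point is that this arithmetic runs on the vehicle itself, never waiting on a round trip to the Cloud.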