What Is Edge Computing And How Is It Used?

Edge Computing

Edge computing is a distributed and decentralized IT architecture built around a series of perimeter data centers. These on-premises data centers process and store critical information locally, while also transmitting received and/or processed data to a central data center or cloud storage repository. By leveraging the market availability of increasingly affordable Small Form Factor (SFF) electronic components and systems, this network topology brings core compute, storage, and networking components closer to the sources that generate the data. The typical case is that of IoT devices and deployments, which often face problems of latency, insufficient bandwidth, and reliability that cannot be managed with a conventional cloud model.

Edge technologies reduce the amount of data to be sent to the cloud by processing critical, latency-sensitive data at the point of origin via a smart device, or by sending it to an intermediate server located nearby. 

Less time-sensitive data can be transmitted to the cloud infrastructure or to the company’s data center for more complex processing such as, for example, big data analysis, training activities that refine Machine Learning (ML) algorithms, long-term storage, or further data analysis.
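To make this split concrete, here is a minimal Python sketch of the pattern described above: an edge gateway acts locally on latency-sensitive readings and batches everything else for later upload. The names and sensor kinds (Reading, act_locally, CLOUD_BATCH, temperature_alarm, and so on) are invented for the example and do not come from any particular edge platform.

from dataclasses import dataclass
from typing import List

# Kinds of readings that must be handled at the point of origin.
LATENCY_SENSITIVE = {"temperature_alarm", "vibration_spike"}

@dataclass
class Reading:
    sensor: str
    kind: str
    value: float

CLOUD_BATCH: List[Reading] = []  # less time-sensitive data, uploaded later in bulk

def act_locally(reading: Reading) -> None:
    # Stand-in for an immediate local action (e.g. stopping a machine).
    print(f"local action on {reading.sensor}: {reading.kind}={reading.value}")

def handle(reading: Reading) -> None:
    # Critical data is processed on the smart device or a nearby server;
    # everything else is queued for the cloud or the corporate data center.
    if reading.kind in LATENCY_SENSITIVE:
        act_locally(reading)
    else:
        CLOUD_BATCH.append(reading)

handle(Reading("press-01", "temperature_alarm", 97.2))  # acted on locally
handle(Reading("press-01", "hourly_average", 61.4))     # deferred to the cloud
print(f"{len(CLOUD_BATCH)} reading(s) queued for the cloud")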

The Need For Perimeter Processing Is Growing

The data that companies depend on to monitor their performance, understand their customers, and make decisions is growing in every direction. According to Gartner, 75% of corporate data will be generated outside of the cloud and core environments by 2025, a significant increase from 10% in 2017.

A Proliferation Of Intelligent And Distributed Endpoints

The Internet of Things needs a new level of connectivity, both to handle millions of data points and to manage a growing number of smart objects. The pervasive diffusion of IoT means that more and more data is generated, processed, and analyzed at the source (the so-called edge), whether it is an employee’s mobile phone, a smart sensor in a factory, a touchscreen device in a shop, or a robot used in a hospital.

The reason data is moving to the edge is the need to use it in real time. Even a few milliseconds of delay in sending data to and receiving it from the cloud can compromise the effectiveness of a system.

A set of intelligent traffic lights, a component on a production line, or a monitoring device in an intensive care unit must have a processing system that makes the data immediately actionable.

The Levels Of Adoption Of Edge Computing

According to analysts at Vanson Bourne, almost 8 out of 10 IT decision-makers (78%) worldwide who already use edge technologies in production claim to be able to use quality data that improves their decision-making and business processes. 

With respect to the level of adoption, 42% of the sample is still in the pilot phase while the percentage of those who intend to start pilot projects over the next year is around 31%. IT decision-makers choose edge computing for several reasons, primarily to improve security, visibility, and customer experience. 

At the sector level, the most common edge use cases concern the tracking and monitoring of individual objects along the supply chain in retail (51%), the use of facial recognition in the hotel and hospitality sector (49%), and improving the experience of healthcare facilities through always-on tools and applications (49%). Over half (55%) of IT decision-makers recognize that the optimization associated with Machine Learning and Artificial Intelligence on their networks is linked to the need for much faster data processing.

6 Edge Computing Challenges To Consider

IT managers are aware of how real-time data analysis closer to the edge produces higher levels of efficiency and insights which, in turn, lead to better business outcomes. 

However, adopting these models requires careful consideration of some elements. These are the main ones:

1. Network Bandwidth

Network bandwidth allocation changes as businesses move processing and data to the perimeter. Companies typically allocate more bandwidth to data centers and less to endpoints. Edge computing, on the other hand, creates the need for more bandwidth across the entire network.
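A back-of-envelope comparison, using purely assumed figures, shows why the planning changes: shipping raw video from a site to the cloud needs far more upstream bandwidth than shipping only the results of local analysis, while the site itself must now be provisioned to absorb the raw streams.

# Illustrative figures only; real bitrates depend on the cameras and codecs used.
cameras = 50
raw_stream_mbps = 4.0    # assumed bitrate of one raw camera stream
metadata_mbps = 0.05     # assumed rate of events/metadata after edge analysis

raw_total = cameras * raw_stream_mbps    # everything sent to the cloud
edge_total = cameras * metadata_mbps     # only results sent to the cloud

print(f"upstream without edge processing: {raw_total:.1f} Mbps")
print(f"upstream with edge processing:    {edge_total:.1f} Mbps")
# The WAN link shrinks, but each site must now provide enough local
# bandwidth and compute to handle the full raw streams on premises.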

2. Distributed Computing

Beyond computing power, the location of edge resources must be considered as an additional aspect. Established computational models leave a key role in processing to the network, so the edge infrastructure must be sized appropriately. Computing distributed in a perimeter micro data center, for example, can be just as resource-intensive as in a centralized data center.

3. Latency

In edge computing, computation happens closer to where the data is collected, so the latency of the application is reduced, and with it the latency of the decision-making process. Fewer round trips between the edge and the center mean faster responses and more timely actions which, thanks to 5G, will approach real time.
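A simple latency budget, again with assumed values, makes the point: the WAN round trip to a distant cloud region usually dominates the time needed for a single decision, even when the edge hardware is slower per request.

cloud_rtt_ms = 60      # assumed WAN round trip to a remote cloud region
cloud_compute_ms = 5   # assumed processing time on powerful cloud hardware
edge_compute_ms = 8    # assumed processing time on modest edge hardware

cloud_decision_ms = cloud_rtt_ms + cloud_compute_ms  # network dominates
edge_decision_ms = edge_compute_ms                   # no WAN round trip

print(f"decision via cloud:   {cloud_decision_ms} ms")
print(f"decision at the edge: {edge_decision_ms} ms")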

4. Control And Management

Within an organization, the edge location can be flexible. It can include a private cloud or a public cloud, but management and control must follow the same procedures and protocols, regardless of where the physical edge data center sits. Ideally, companies will use next-generation orchestration tools to help manage and control applications consistently, regardless of location.
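As a minimal sketch of that idea, the snippet below applies one and the same deployment descriptor to a public cloud region and to two edge sites. The apply() function, the location names, and the spec values are all invented stand-ins for whatever orchestration tool the company actually uses (for example a per-site Kubernetes control plane).

# One descriptor, one procedure, any location; names and values are invented.
APP_SPEC = {"image": "inference-service:1.4", "replicas": 2, "memory_mb": 512}
LOCATIONS = ["cloud-eu-west", "edge-factory-turin", "edge-store-042"]

def apply(location: str, spec: dict) -> None:
    # Placeholder for a call to the orchestrator's API at that location.
    print(f"{location}: deploy {spec['image']} x{spec['replicas']} "
          f"({spec['memory_mb']} MB each)")

for loc in LOCATIONS:
    apply(loc, APP_SPEC)  # identical management flow regardless of location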

5. Scalability

Adding more edge-connected devices increases the overall scale that IT teams have to manage. Edge computing is not just about the number of perimeter servers but also about scaling at the compute, network, storage, management, security, and licensing levels, and so on. Businesses need to understand this when moving applications to the network perimeter. Perimeter data centers are not just extra hardware in a remote location: the impact extends to the entire ecosystem of resources involved.

6. Backup

The edge computing model is typically driven by where the data is generated. Businesses need a global strategy for securing this information, with an understanding of the data independent of its location. Network bandwidth requirements will be just as critical as storage media assessments. In fact, to protect these resources, it may not make sense to rely on network-based backup alone.
