What is edge computing? Why is it important, and what is its future?

Edge computing is what it sounds like: computing that takes place at the edge of corporate networks, with “the edge” being defined as the place where end devices access the rest of the network – things like phones, laptops, industrial robots, and sensors. 

The edge used to be a place where these devices connected so they could deliver data to, and receive instructions and software updates from, a centrally located data center or the cloud.

Now with the explosion of the Internet of Things, that model has shortcomings. IoT devices gather so much data that the sheer volume requires larger and more expensive connections to data centers and the cloud.

The nature of the work performed by these IoT devices is also creating a need for much faster connections between the data center or cloud and the devices. 

For example, if sensors in valves at a petroleum refinery detect dangerously high pressure in the pipes, shutoffs need to be triggered as soon as possible. With analysis of that pressure data taking place at distant processing centers, the automatic shutoff instructions may come too late. 

But with processing power placed local to the end devices, latency is lower and that round-trip time can be significantly reduced, potentially avoiding downtime, damage to property, and even loss of life.
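The refinery scenario above can be sketched as a simple local decision rule. This is a minimal illustration, not a real control system: the pressure limit, units, and function names are hypothetical values chosen for the example.

```python
# Hypothetical local edge logic: decide on a shutoff without a
# round trip to a distant data center or cloud.

PRESSURE_LIMIT_KPA = 800.0  # illustrative safe upper bound, not a real spec


def should_shut_off(pressure_kpa: float) -> bool:
    """Return True when the measured pressure demands an immediate,
    locally triggered shutoff."""
    return pressure_kpa >= PRESSURE_LIMIT_KPA


# A reading below the limit leaves the valve open; a dangerous
# reading triggers the shutoff right at the edge, with no
# network latency in the decision path.
```

Because the check runs on the edge device itself, the decision latency is bounded by local processing time rather than by the network round trip to a remote analysis service.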

Even with the introduction of edge devices that provide local computing and storage, there will still be a need to connect them to data centers, whether they are on-premises or in the cloud. 

For example, temperature and humidity sensors in agricultural fields gather valuable data, but that data doesn’t have to be analyzed or stored in real-time. 

Edge devices can collect, sort, and perform a preliminary analysis of the data, then send it along to where it needs to go, to centralized applications or some form of long-term storage, again either on-prem or in the cloud. 

Because this traffic may not be time-sensitive, slower, less expensive connections, possibly over the internet, can be used. And because the data is presorted, the volume of traffic that needs to be sent at all may be reduced.
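The preliminary analysis described above often amounts to reducing a batch of raw readings to a compact summary before sending it upstream. Here is a minimal sketch of that idea; the field names and the choice of statistics are assumptions for illustration, not part of any particular product.

```python
from statistics import mean


def summarize(readings: list[float]) -> dict:
    """Collapse a batch of raw sensor readings (e.g. an hour of
    temperature samples) into a small summary record that can be
    sent over a slower, cheaper connection."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": mean(readings),
    }


# A batch of many raw samples becomes four numbers: instead of
# shipping every reading to the cloud, the edge device sends
# only the summary, cutting the traffic volume substantially.
```

The design choice here is the essence of edge preprocessing: the full-resolution data stays local (or is discarded), and only the information the centralized application actually needs travels over the long-haul link.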

So the upside of edge computing is faster response times for applications that require them, and slower growth in the number of expensive long-haul connections to processing and storage centers.


The downside can be security. With data being collected and analyzed at the edge, it’s important to include security for the IoT devices that connect to the edge devices and for the edge devices themselves. 

They contain valuable data, but they are also networking elements that, if exploited, could compromise other devices that contain stores of valuable assets. With edge computing becoming more essential, it’s also important to make sure that the edge devices themselves don’t become a single point of failure. 

Network architects need to build in redundancy and provide failover contingencies in order to avoid crippling downtime if a primary node goes down. 
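One common failover pattern matching the advice above is to try edge nodes in priority order and fall back when the primary is unreachable. The sketch below is a simplified illustration under assumed names; real deployments would add health checks, retries, and timeouts.

```python
def send_with_failover(payload: dict, nodes: list) -> str:
    """Attempt delivery to each (name, send_fn) pair in priority
    order; return the name of the node that accepted the payload.
    Raises RuntimeError only if every node fails, so a single
    downed node is not a single point of failure."""
    errors = {}
    for name, send_fn in nodes:
        try:
            send_fn(payload)
            return name
        except ConnectionError as exc:
            # Record the failure and try the next node.
            errors[name] = str(exc)
    raise RuntimeError(f"all edge nodes failed: {errors}")
```

With at least one redundant node configured, traffic keeps flowing when the primary goes down, which is exactly the crippling-downtime scenario the architects are designing against.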

The industry has already gone a long way toward addressing the demands of edge computing, and it is becoming mainstream. 

Its importance is likely to grow even more as the use of real-time applications becomes more prevalent.

So, guys, I'll meet you in my next blog post, and if you have any suggestions for me, tell me in the comment section.


Thank you very much for giving me your valuable time.

GOODBYE. 
