Edge computing is not a new concept, but over the past few years several trends, including IoT and immersive environments, have converged to create an opportunity: organizations can streamline the flow of massive amounts of data by processing it closer to its source, while also gaining real-time analysis of that local data.
While cloud computing is still very much with us, the notion that everything belongs in the cloud is starting to look impractical. Edge computing, meanwhile, has taken a huge leap toward reshaping the network and is increasingly regarded as an undeniable reality.
With edge computing, the data produced by Internet of Things (IoT) devices is processed closer to where it is created, at the edge of the network, instead of being sent across long routes to data centers or clouds. Processing data close to where it is generated allows organizations to analyze their important data in real time. This capability is critical because real-time analysis near the network edge is a pressing need for organizations across industries such as manufacturing, finance, telecommunications, and healthcare.
In the words of Helder Antunes, senior director of corporate strategic innovation at Cisco: "In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that's just not realistic."
What is Edge Computing and How Does It Work?
To elaborate on what edge computing really is, the research firm IDC describes it as a mesh network of micro data centers, each with a footprint of less than 100 square feet, that process or store critical data locally and push all received data to a central data center or cloud storage repository.
Typically, edge computing comes into play when IoT devices are involved. These devices collect massive amounts of data and would otherwise send it all to a central data center or cloud for processing. Edge computing instead enables the data to be processed locally, near the source or edge of the network, so that backhaul traffic to the central site is reduced.
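The pattern described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (the threshold, field names, and reading format are all assumptions, not part of any real deployment): the edge node evaluates each reading locally and forwards only the ones worth backhauling.

```python
# Minimal sketch of edge-side filtering: readings are processed locally,
# and only the ones that matter are forwarded to the central cloud.
ALERT_THRESHOLD = 90.0  # hypothetical limit for a sensor value

def process_locally(readings):
    """Split readings into data worth backhauling and data kept local."""
    to_cloud = [r for r in readings if r["value"] >= ALERT_THRESHOLD]
    kept_local = len(readings) - len(to_cloud)
    return to_cloud, kept_local

readings = [
    {"sensor": "temp-1", "value": 72.4},
    {"sensor": "temp-2", "value": 95.1},  # anomalous: worth sending
    {"sensor": "temp-3", "value": 68.0},
]

to_cloud, kept_local = process_locally(readings)
print(len(to_cloud), kept_local)  # prints "1 2": one reading crosses the network
```

In a real deployment the filtering logic would be far richer, but the shape is the same: the decision about what to transmit is made at the edge, not at the data center.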
Benefits of Edge Computing
Edge computing was conceived to benefit the network in several ways, and edge deployments work best in a variety of circumstances. One such circumstance is when IoT devices have poor connectivity and cannot efficiently stay constantly connected to a central cloud.
Another benefit concerns latency-sensitive processing of data. With edge computing, latency is reduced considerably because the data does not have to cross a network to reach a data center or cloud for processing. This works best in situations where even milliseconds of latency are unacceptable, notably in the finance and manufacturing sectors.
One example of an edge computing placement is the following:
An oil rig in the ocean, fitted with thousands of sensors, produces massive amounts of data, most of which may be insignificant. Inconsequential data does not need to cross a network as soon as it is produced, so a local edge computing system gathers the data and sends daily reports to a central data center or cloud for long-term storage. By sending only significant data, the edge computing system cuts down the traffic crossing the network.
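The oil-rig pattern, aggregating a day's raw readings into one compact report, can be sketched as follows. This is a hypothetical illustration: the summary fields, anomaly threshold, and reading format are assumptions chosen for the example.

```python
# Edge node aggregates a day's raw sensor readings into a small summary;
# only this summary is backhauled for long-term storage.
from statistics import mean

def daily_summary(readings, anomaly_threshold=90.0):
    """Reduce a list of raw readings to one compact daily report."""
    values = [r["value"] for r in readings]
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
        "anomalies": sum(1 for v in values if v >= anomaly_threshold),
    }

day = [{"sensor": f"s{i}", "value": v}
       for i, v in enumerate([70.2, 71.5, 95.3, 69.8])]
report = daily_summary(day)
print(report)  # one small dict instead of thousands of raw readings
```

The bandwidth saving comes from the ratio: thousands of readings in, one fixed-size report out, regardless of how chatty the sensors are.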
Telecommunication companies can also deploy edge computing in next-generation 5G cellular networks. Kelly Quinn, a research manager at IDC who studies edge computing, predicts that as major telecom providers build 5G into their wireless networks, they will progressively add micro data centers that are either integrated into or located adjacent to 5G towers. Business customers will be able to own or rent space in these micro data centers to perform edge computing, with direct access to a gateway into their telecom provider's broader cellular network, which in turn could connect to a public IaaS cloud provider.
Examples of Edge Computing Deployment
The leading technology giants have already started incorporating edge computing into their systems. Apple uses special chips in its new iPhones to keep data authentication on the device and off Apple's servers. This is a huge privacy bonus for Apple because it keeps the company out of court when the Feds inevitably come demanding unlock codes.
Google is not far behind in deploying edge computing. The tech giant uses its own custom chips for intricate machine learning, and the new Pixel phones are designed to process image data directly on the device itself. The outcome is faster processing and photos that rival those of any other smartphone.
How Different is Edge Computing from Fog Computing?
As the IoT evolves, the rise of edge computing is becoming inevitable. Gradually, the edge computing market is taking shape, and it is giving rise to another exciting development known as 'fog computing'.
The word 'fog' refers to the network connections between the IoT edge devices and the cloud, whereas 'edge' refers more specifically to the data processing and analysis done close to the edge devices. In other words, fog computing includes edge computing, but fog also incorporates the network required to carry the processed data to its final destination.
Fog computing, also known as fogging, brings the benefits of the network closer to the edge. Network designers ensure data is collected and analyzed at the most resourceful and logical places between the source and the cloud. With fogging, less data travels back to the cloud for processing, so data gets processed more quickly and efficiently.
Many technical experts have predicted that in the coming years edge computing could completely displace cloud computing. However, Mung Chiang, dean of Purdue University's College of Engineering and co-chair of the OpenFog Consortium, believes that no single computing domain will dominate; there will be a continuum. Edge and fog computing work best when real-time analysis of field data is needed.
Is Edge Computing Secure?
Two schools of thought exist when it comes to edge computing security. Some believe that security in an edge computing environment is theoretically better because the data does not travel over a network and stays close to its source. The less data held in a corporate data center or cloud, the less is exposed if those environments are hacked.
On the other hand, some argue that edge computing is inherently less secure than other computing systems because the edge devices themselves can be susceptible to hacks. Security should always be a dominant factor when deploying edge or fog computing.
Security measures such as access control, data encryption, and the use of VPNs are essential elements in shielding edge computing systems.
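As one small illustration of these measures, an edge device can sign each payload with a shared-key HMAC so the central site can reject tampered or unauthorized messages. This sketch uses only Python's standard library and a hypothetical provisioned key; full confidentiality would additionally require TLS (a VPN) or a dedicated encryption library, which this example does not attempt.

```python
# Sketch: authenticating edge-to-cloud messages with a shared-key HMAC.
# The key and payload format are illustrative assumptions.
import hashlib
import hmac

SHARED_KEY = b"example-provisioned-device-key"  # hypothetical per-device key

def sign(payload: bytes) -> str:
    """Compute an HMAC-SHA256 tag for a payload on the edge device."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Check a tag at the central site, using a constant-time compare."""
    return hmac.compare_digest(sign(payload), signature)

msg = b'{"sensor": "temp-2", "value": 95.1}'
sig = sign(msg)
print(verify(msg, sig))              # True: authentic payload accepted
print(verify(b'{"value": 0}', sig))  # False: tampered payload rejected
```

The constant-time comparison (`hmac.compare_digest`) matters here: a naive string comparison would leak timing information an attacker could exploit.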
The bottom line is that the world of information technology is continuously evolving, and it is important to keep our eyes on the horizon. No one would deny the benefits cloud computing has given us, but let's not forget that the future is at the edge.