EDGE COMPUTING: THE NEW ERA


It all started with the first large computers. They changed everything: the way data was transferred, the way it was processed, and, most importantly, what machines were capable of. More computers were gradually built, but these were huge, extremely expensive machines that were not meant for daily use. When personal computers finally arrived, ordinary people gained access to computing hardware such as floppy disks, hard disks, and microprocessors. Computers kept getting more powerful over time, but one problem remained unsolved: storage.

As computers and connected hardware grew in number and usage, high-end processing power and mass storage became a serious bottleneck. The internet connected all these devices together, but data transmission and processing remained unsolved problems: the number of users and the number of simultaneous processes were far too large for local databases to handle. Therefore, huge centralized databases for data transmission and processing were built at specific locations. The data generated by computers and other devices would be transferred to these facilities, processed at high speed, and then sent back to the device and displayed to the user.

The problem with this system was its bulky size and the fact that if the central database went down, every computer depending on it would get stuck. This led to the development of cloud storage and the beginning of Cloud Computing. A cloud is essentially a large pool of storage and computing resources on the internet that can be accessed remotely from any connected device. Cloud storage is immense and much faster than local databases, and with the introduction of 5G it will respond even faster. But here lie the modern-day problems: there are simply too many devices connected to the internet (that is, IoT devices), and the cost of bandwidth is off the charts. This is where Edge Computing kicks in.

As the name suggests, edge computing refers to processing data near the edge of the computing infrastructure, that is, close to the device that produces it. To understand the concept, consider an online video meeting in which every participant shares their live video. As the number of participants grows, the video quality shown to each person drops and the bandwidth cost for the software host rises, because every bit of data has to be transferred to a central database or uploaded to the cloud. To prevent this, edge devices are introduced: local processing units placed near the data-producing device, that is, the user's system, as sketched below.
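To make the idea concrete, here is a minimal Python sketch (all names and numbers are hypothetical, and the "frame" is just a list of pixels rather than real video) simulating how an edge device might downscale a participant's frame before anything leaves the local network, so that only a fraction of the raw data travels to the cloud:

```python
# Hypothetical sketch: an edge device shrinking a raw video frame
# before upload, so only the reduced frame crosses the network.

def downscale_frame(frame, factor=2):
    """Keep every `factor`-th pixel in both dimensions (naive downscaling)."""
    return [row[::factor] for row in frame[::factor]]

def upload_to_cloud(frame):
    """Stand-in for a network call; here we just report the payload size."""
    pixels = sum(len(row) for row in frame)
    print(f"Uploading {pixels} pixels to the cloud")

# A fake 1080x1920 frame of grayscale pixels generated on the user's device.
raw_frame = [[0] * 1920 for _ in range(1080)]

# Without an edge device, the full frame (2,073,600 pixels) would be uploaded.
# With edge processing, only the downscaled frame leaves the local network.
edge_frame = downscale_frame(raw_frame, factor=4)
upload_to_cloud(edge_frame)  # ~129,600 pixels, roughly 1/16 of the raw data
```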

Edge devices create an edge gateway, which acts like a semi-permeable membrane. The local device sends its data to an edge device, the edge device processes it, and only the required data is forwarded to the database or the cloud, while the processed result is returned directly to the user. With every user having their own local edge device, the speed of data transmission and processing increases significantly and the overall bandwidth cost drops as well. Hence, both major problems are tackled efficiently.
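A rough sketch of that gateway flow, using a hypothetical IoT temperature sensor as the local device (the function names and threshold are illustrative assumptions, not part of any real edge SDK):

```python
# Hypothetical edge-gateway sketch: the gateway processes readings locally
# and forwards only the data the cloud actually needs.

ALERT_THRESHOLD = 75.0  # illustrative cutoff, in degrees Celsius

def process_locally(readings):
    """Aggregate raw readings on the edge device instead of shipping them all."""
    average = sum(readings) / len(readings)
    return {"average": round(average, 1), "samples": len(readings)}

def send_to_cloud(summary):
    """Stand-in for the database/cloud upload step."""
    print(f"Sent to cloud: {summary}")

def return_to_user(summary):
    """Processed result handed straight back to the local device."""
    print(f"Displayed to user: {summary}")

# Raw data produced by the local device (e.g. one reading per second).
raw_readings = [71.2, 73.8, 76.1, 74.9, 72.3]

summary = process_locally(raw_readings)

# Only the small summary (plus an alert flag) crosses the edge gateway;
# the raw stream never leaves the local network, saving bandwidth.
send_to_cloud({**summary, "alert": summary["average"] > ALERT_THRESHOLD})
return_to_user(summary)
```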

As of 2020, the trending buzzwords are 5G, IoT, AI, Quantum Computing, and Edge Computing. Computers are advancing, and so are we. And now, with the introduction of edge computing, we have moved one step closer to achieving the impossible!
