Sunday, 28 July 2019

Edge Computing


Computing workloads are increasing across industry, from the manufacturing plant producing custom springs to the internet-connected television streaming Netflix. As network traffic grows, data center infrastructure and networking costs have ballooned. The rise of enormous centralized data centers – or server farms – has rocketed companies like Amazon and Microsoft to the forefront of the technology sector. That growth comes at a cost, though, both to those behemoth companies and to the SMBs that rely on Amazon Web Services (AWS) and Microsoft Azure cloud computing services. One solution? Edge computing.
Edge computing moves some of that computation away from the centralized data center to nodes “at the edge” of the network, close to where the data is produced and used. It improves application performance and relieves bandwidth pressure on the core network. One recent report suggested that edge deployments can improve latency and reduce data transferred to the cloud by up to 95%.
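The size of the savings depends entirely on the workload, but a rough back-of-the-envelope sketch in Python shows why summarizing data at the edge can cut upstream traffic so dramatically. The sensor rate, reading size, and summary size below are assumed figures chosen only for illustration, not numbers from the report above:

    # Illustrative arithmetic only; all figures are assumptions.
    raw_bytes_per_min = 10 * 60 * 200   # 10 readings/sec * 60 sec * 200 bytes each
    edge_bytes_per_min = 1024           # one ~1 KB summary uploaded per minute

    reduction = 1 - edge_bytes_per_min / raw_bytes_per_min
    print(f"Upstream traffic cut by {reduction:.1%}")   # ~99.1% under these assumptions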

Simply put: edge computing reduces data center costs by enabling more efficient use of cloud computing architecture.                                                                                                  
How Does Edge Computing Work?
Edge computing means different things to different people, and the way it works varies with the use case. Most people associate edge computing with the Internet of Things (IoT). For every Office 365 email and Amazon Echo request, your device must resolve, compress, and transfer that information to the cloud, where it is received, decompressed, processed (possibly through another API), and then transferred back to you. That takes time, and the time the process takes is called latency.
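To make that round trip concrete, here is a minimal Python sketch that simulates the two paths. The one-way delays are assumed values for illustration, not measurements of any real service:

    import gzip
    import json
    import time

    # Assumed one-way network delays (illustrative only).
    WAN_DELAY = 0.060   # ~60 ms to a distant centralized data center
    LAN_DELAY = 0.002   # ~2 ms to an edge node on the local network

    def process(payload):
        # Stand-in for the real work, e.g. interpreting a voice command.
        return {"status": "ok", "echo": payload["text"].upper()}

    def cloud_round_trip(payload):
        body = gzip.compress(json.dumps(payload).encode())   # compress on the device
        time.sleep(WAN_DELAY)                                # upload to the data center
        result = process(json.loads(gzip.decompress(body)))  # decompress and process there
        time.sleep(WAN_DELAY)                                # response travels back
        return result

    def edge_round_trip(payload):
        time.sleep(LAN_DELAY)                                # short hop to a nearby node
        result = process(payload)                            # processed close to the user
        time.sleep(LAN_DELAY)
        return result

    payload = {"text": "turn on the lights"}
    for name, fn in (("cloud", cloud_round_trip), ("edge", edge_round_trip)):
        start = time.perf_counter()
        fn(payload)
        print(f"{name} latency: {(time.perf_counter() - start) * 1000:.1f} ms")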
Edge computing enables data to be analyzed, processed, and transferred at the edge of the network. This distributed architecture is what makes IoT and mobile computing practical. The device you use, or a local server nearby, can process the data instead of sending it to a centralized data center, saving time and improving performance.
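A hypothetical edge gateway illustrates the pattern: raw sensor readings are summarized on a local server, and only the compact result is forwarded upstream. The function and field names here are invented for the example; a real deployment would use whatever transport its platform provides (HTTPS, MQTT, and so on):

    import json
    import statistics

    def summarize(readings):
        # Reduce a window of raw readings to a few statistics at the edge.
        return {
            "count": len(readings),
            "mean": statistics.mean(readings),
            "min": min(readings),
            "max": max(readings),
        }

    def forward_to_cloud(summary):
        # Placeholder for the upstream call to the central data center.
        print("uploading:", json.dumps(summary))

    # One minute of temperature samples collected on the local network.
    window = [21.4, 21.6, 21.5, 22.0, 21.8]
    forward_to_cloud(summarize(window))   # only the summary crosses the WAN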
Why Edge Computing?

The short version: Edge computing reduces latency, provides near real-time data analysis, and reduces overall data traffic.
The long version: Everyone benefits. Whether you’re an oil tycoon analyzing real-time data uploads from your network of deep-sea oil rigs or a hardcore Twitch Fortnite gamer streaming video of your best solo round ever, latency (see: lag) and delayed data transfer have real impacts. When data is processed as close to the end user as possible, computation and content delivery happen much more quickly.

Can Edge Computing Reduce Data Center Costs?

Edge computing delivers better use of bandwidth and puts more computing power where it is needed. Backup and disaster recovery strategies, customer contact channels, and access to mission-critical applications are just as important to billion-dollar corporations as they are to a small healthcare provider in a second- or third-tier city. Amid conversations about net neutrality and the growth of micro-multinational businesses, the opportunity for businesses of any size to colocate in local edge data centers is essential to continued economic growth.


