Sunday, 28 July 2019

EDGE COMPUTING




Edge Computing


Computing workloads are increasing across industry, from the manufacturing plant producing custom springs to the smart TV streaming Netflix. As network traffic grows, data center infrastructure and networking costs have ballooned. The rise of enormous centralized data centers – or server farms – has rocketed companies like Amazon and Microsoft to the forefront of the technology sector. That growth comes at a cost, though, both to those behemoth companies and to the SMBs who rely on Amazon Web Services (AWS) and Microsoft Azure cloud computing services. One solution? Edge computing.
Edge computing moves some of the computational work away from the centralized data center to logical nodes “at the edge” of the network, geographically close to where the computing is needed. This increases application performance and relieves bandwidth pressure on the core network. A recent report showed that edge computing can improve latency and reduce data transfer to the cloud by up to 95%.

Simply put: edge computing reduces data center costs by enabling more efficient use of cloud computing architecture.                                                                                                  
How Does Edge Computing Work?
Edge computing is used by different people to do different things, and the way it works varies with the use case. Most people associate edge computing with the Internet of Things (IoT). For every Office 365 email and Amazon Echo request, your device must resolve, compress, and transfer information to the cloud, where it is received, decompressed, processed – possibly through another API – and then transferred back to you. That takes time. We refer to the time that process takes as latency.
Edge computing enables data to be analyzed, processed, and transferred at the edge of the network. This distributed architecture is what makes IoT and mobile computing practical. The device you use, or a local server, can process the data instead of sending it to a centralized data center, saving time and improving performance.
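The round trip described above can be sketched in a few lines of Python. This is a toy simulation, not a benchmark: the 50 ms one-way network delay and the trivial `process` function are hypothetical stand-ins for a real WAN hop and real work.

```python
import time

def process(payload):
    """Trivial stand-in for real work (e.g., parsing a voice command)."""
    return payload.upper()

def cloud_round_trip(payload, network_delay=0.05):
    """Simulate send -> remote processing -> receive.
    The 50 ms network_delay is a hypothetical one-way WAN latency."""
    time.sleep(network_delay)   # upload to the data center
    result = process(payload)   # processed in the cloud
    time.sleep(network_delay)   # response travels back
    return result

def edge_processing(payload):
    """Process on the device or a nearby edge node: no WAN hops."""
    return process(payload)

start = time.perf_counter()
cloud_round_trip("turn on the lights")
cloud_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
edge_processing("turn on the lights")
edge_ms = (time.perf_counter() - start) * 1000

print(f"cloud: ~{cloud_ms:.0f} ms, edge: ~{edge_ms:.0f} ms")
```

The work done is identical in both paths; the difference is entirely where it happens, which is the point of moving computation to the edge.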
Why Edge Computing?

The short version: edge computing reduces latency, provides near real-time data analysis, and reduces overall data traffic.
The long version: Everyone benefits. Whether you’re an oil tycoon analyzing real-time data uploads from your network of deep sea oil rigs or a hardcore Twitch Fortnite gamer streaming video of your best solo round ever, latency (see: lag) and delayed data transfer have real impacts. By processing data as close to the end user as possible, data computing and content delivery happen much more quickly.

Can Edge Computing Reduce Data Center Costs?

Edge computing delivers better bandwidth and more computing power. Backup and disaster recovery strategies, customer contact channels, and access to mission-critical applications are just as important to billion-dollar corporations as they are to a small healthcare provider in a second- or third-tier city. Amid ongoing conversations about net neutrality and micro-multinational business growth, the opportunity for businesses of any size to colocate in local, edge data centers is essential to the continued growth of our robust economy.



Sunday, 7 July 2019



NINE TECHNOLOGY TRENDS IN 2019


2019 is almost here and with it a flood of lists describing the trends that will define various fields in the new year. From among these predictions, those related to new technological standards stand out first and foremost, given that they will end up revolutionizing every industry, in an age when digital transformation plays a major role. After evaluating various consulting firm reports, we conclude that these are the nine major trends that will define technological disruption in the next 365 days.

1. 5G Networks

Spain’s National 5G Plan for 2018-2020 stipulates that throughout 2019, pilot projects based on 5G will be developed, resulting in the release of the second digital dividend. Hence, the groundwork is being laid so that in 2020 we will be able to browse the Internet on a smartphone at speeds reaching 10 gigabits per second. Data from Statista, a provider of market and consumer data, indicates that by 2024, 5G mobile network technology will have reached more than 40 percent of the global population, with close to 1.5 billion users.

2. Artificial Intelligence

Artificial intelligence has appeared in these lineups for a few years now, but everything indicates that this will be the year it takes off definitively. This is the year we’ll see its democratization, and it has even made its way onto the political agenda. At the beginning of December, the European Commission released a communication on AI directing the member states to define a national strategy addressing the topic by mid-2019.

3. Autonomous Things

Related to the previous point, robots, drones, and autonomous vehicles are some of the innovations in the category the consulting firm Gartner labels “Autonomous Things,” defined as the use of artificial intelligence to automate functions that were previously performed by people. This trend goes further than mere automation using rigid programming models, because AI is now being used to develop advanced behavior that interacts in a more natural way with the environment and its users.

4. Blockchain

Blockchain technology is another topic that frequently appears on these end-of-year lists. It has now broken free from an exclusive association with the world of cryptocurrencies; its usefulness has been proven in other areas. In 2019 we will witness many blockchain projects get off the ground as they try to address the challenges the technology still faces in fields like banking and insurance. It will also be a decisive year for the roll-out of decentralized organizations that run on smart contracts.
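The tamper-evidence that makes blockchains attractive for record-keeping in banking and insurance comes from each block committing to the hash of the one before it. The toy hash chain below illustrates only that core idea (the block fields and payloads are hypothetical; a real blockchain adds consensus, networking, and much more):

```python
import hashlib
import json

def block_hash(data, prev_hash):
    """Deterministic hash over a block's contents."""
    body = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def make_block(data, prev_hash):
    """A block records its payload plus the hash of the previous block."""
    return {"data": data, "prev_hash": prev_hash,
            "hash": block_hash(data, prev_hash)}

def chain_is_valid(chain):
    """Re-derive every hash; editing any earlier block breaks all later links."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["data"], block["prev_hash"]):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("policy issued", prev_hash="0" * 64)
chain = [genesis, make_block("claim filed", genesis["hash"])]
print(chain_is_valid(chain))          # True
chain[0]["data"] = "policy cancelled"  # tamper with history
print(chain_is_valid(chain))          # False
```

Because the second block stores the first block's hash, rewriting history requires recomputing every subsequent block, which is what distributed consensus makes prohibitively hard in practice.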

5. Augmented Analytics

This trend represents another stride for big data, combining it with artificial intelligence. Using machine learning (automated learning), it will transform how data analysis is developed, shared, and consumed. It is anticipated that the capabilities of augmented analytics will soon be commonly adopted not only for working with data, but also in in-house business applications related to human resources, finance, sales, marketing, and customer support – all with the aim of optimizing decisions through deep data analysis.
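As a toy illustration of machine-generated insight, the sketch below scans a set of hypothetical business metrics and automatically surfaces the most strongly correlated pair. A real augmented-analytics product does far more, but the principle is the same: the tool, not the analyst, decides which relationships to test.

```python
from itertools import combinations

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly metrics an augmented-analytics tool might scan.
metrics = {
    "ad_spend":  [10, 12, 15, 18, 22, 25],
    "headcount": [50, 50, 51, 51, 52, 52],
    "revenue":   [100, 118, 149, 180, 221, 248],
}

def strongest_relationship(data):
    """Rank every metric pair by |correlation| and surface the strongest one."""
    best = max(combinations(data, 2),
               key=lambda p: abs(pearson(data[p[0]], data[p[1]])))
    return best, pearson(data[best[0]], data[best[1]])

pair, r = strongest_relationship(metrics)
print(pair)   # strongest pair here is ad_spend / revenue
```

Swapping the correlation scan for a trained model is what turns this from a script into "augmented" analytics, but the workflow of auto-ranking candidate insights is the common thread.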

6. Digital Twins

A digital twin is a virtual replica of a real-world system or entity. Gartner predicts that there will be more than 20 billion connected sensors and endpoints by 2020, and the consulting firm goes on to point out that there will also be digital twins for thousands upon thousands of these solutions, with the express purpose of monitoring their behavior. Initially, organizations will implement these replicas and then develop them over time, improving their ability to compile and visualize the right data, make improvements, and respond effectively to business objectives.
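As a sketch of the concept, a digital twin can be as simple as an object that mirrors a device's last reported telemetry and evaluates it against business rules. The device name, telemetry fields, and temperature threshold below are all hypothetical example values:

```python
class DigitalTwin:
    """Minimal sketch of a digital twin: a virtual replica that mirrors
    the reported state of a physical device and flags anomalies."""

    def __init__(self, device_id, temp_limit=90.0):
        self.device_id = device_id
        self.temp_limit = temp_limit   # hypothetical alert threshold
        self.history = []              # every reading ever mirrored

    def sync(self, reading):
        """Called whenever the real device reports new telemetry."""
        self.history.append(reading)

    @property
    def current_state(self):
        return self.history[-1] if self.history else None

    def needs_attention(self):
        """Business rule evaluated on the replica, not the device itself."""
        state = self.current_state
        return state is not None and state["temp"] > self.temp_limit

twin = DigitalTwin("pump-17")
twin.sync({"temp": 72.5, "rpm": 1480})
twin.sync({"temp": 95.1, "rpm": 1502})
print(twin.needs_attention())   # True
```

Keeping the full history on the replica is what lets the twin support the monitoring and visualization use cases described above without querying the physical device.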

7. Edge Computing

Edge computing is the trend that relates most specifically to the Internet of Things. It consists of placing intermediate points between connected objects and the cloud, where data can be processed. Tasks are performed closer to where the data is generated, reducing traffic and latency when responses are sent. With this approach, processing is kept close to the endpoint rather than sending the data to a centralized server in the cloud. Still, instead of creating a totally new architecture, cloud computing and edge computing will develop as complementary models, with cloud solutions administered as a centralized service that runs not only on centralized servers but also on distributed servers and on the edge devices themselves.
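One way such an intermediate point cuts traffic to the cloud is by forwarding only significant changes in the raw sensor stream. A minimal sketch, with hypothetical readings and a hypothetical change threshold:

```python
def edge_filter(readings, threshold=0.5):
    """Hypothetical edge-node step: forward a reading only when it differs
    from the last forwarded value by at least `threshold`, so the cloud
    link carries a fraction of the raw traffic."""
    forwarded = []
    last = None
    for value in readings:
        if last is None or abs(value - last) >= threshold:
            forwarded.append(value)
            last = value
    return forwarded

raw = [20.0, 20.1, 20.1, 20.9, 21.0, 21.8, 21.9, 21.9]
sent = edge_filter(raw)
print(sent)                                  # [20.0, 20.9, 21.8]
print(f"traffic reduced {len(raw)} -> {len(sent)} readings")
```

This is the complementary model the paragraph describes: the edge node absorbs the high-frequency stream, and the centralized cloud service receives only the data worth acting on.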

8. Immersive Experiences in Smart Spaces

Chatbots integrated into different chat and voice-assistant platforms are changing the way people interact with the digital world, as are virtual reality (VR), augmented reality (AR), and mixed reality (MR). The combination of these technologies will dramatically change our perception of the world around us by creating smart spaces where more immersive, interactive, and automated experiences can occur for a specific group of people or for defined industry cases.

9. Digital Ethics and Privacy

Digital ethics and privacy are topics receiving more and more attention from private individuals as well as associations and government organizations. With good reason: people are increasingly concerned about how their personal data is being used by public- and private-sector organizations. We conclude that the winning organizations will be those that proactively address these concerns and are able to earn their customers’ trust.