Edge Computing And Cloud Computing Aren’t In Opposition

It is estimated that by 2020, enterprises and governments will have deployed 5.6 billion IoT devices generating a staggering 507.5 zettabytes of data. Today, industries including healthcare, retail, and logistics are building networks of sensor- and compute-equipped smart devices. The more data businesses can collect, the better they are able to understand the world.

As the Internet of Things grows and we become more reliant on real-time information, data processing and analytics will move from centralized data centers to the network’s edge. Some have construed the migration of compute to the edge as harmful to the cloud and to the data center industry, but in reality, both edge computing and centralized cloud computing are vital to the future of the information economy.

A deeper understanding of the world leads to more efficient and effective planning and the ability to respond quickly to changes in the environment. But realizing the value of IoT data requires the ability to react to it in real time. Applications like traffic control and logistics can't wait for data to travel to a distant data center, be analyzed, and come back; hence the importance of edge computing.
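To make the latency argument concrete, here is a minimal Python sketch of a traffic-control loop that decides locally. Everything in it (the sensor reader, the actuator, the timing figures) is a hypothetical stand-in; the point is simply that the local decision path contains no network round trip.

```python
import time

CLOUD_RTT_S = 0.150     # assumed round trip to a distant data center
LOCAL_BUDGET_S = 0.010  # assumed deadline for reacting to a sensor event

def read_sensor():
    """Stand-in for reading a vehicle-presence sensor."""
    return {"vehicles_waiting": 7, "timestamp": time.time()}

def set_signal(state):
    """Stand-in for actuating the traffic light."""
    print(f"signal -> {state}")

def control_loop():
    reading = read_sensor()
    # Decide locally: no network hop between sensing and actuation.
    if reading["vehicles_waiting"] > 5:
        set_signal("green")
    # A cloud round trip alone (~150 ms here) would blow the 10 ms
    # budget before any analysis even ran.
    assert CLOUD_RTT_S > LOCAL_BUDGET_S

control_loop()
```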

Apple and Google provide an excellent example of the difference between edge computing and centralized cloud computing. Google is famed for its data centers and the technology it has developed for the fast analysis and indexing of massive volumes of data. When you interact with a Google service or use a Google device, your data typically travels to the cloud for processing.

Apple does, of course, have centralized data centers, but in recent years it has tried to keep as much processing on-device as possible. Both Google Photos and Apple's Photos apps offer face and object recognition. Google uploads data to the cloud and does the work there. Apple uses machine learning technologies to analyze photos on-device.
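As a rough illustration of the two architectures, the sketch below contrasts a hypothetical on-device model with a hypothetical cloud vision API, both stubbed out in Python. Neither class reflects Apple's or Google's actual APIs; the difference that matters is whether the photo bytes ever leave the device.

```python
class OnDeviceModel:
    """Stand-in for a model bundled with the app (Apple-style)."""
    def predict(self, photo_bytes):
        return [{"face": True, "bbox": (10, 10, 64, 64)}]

class CloudVisionClient:
    """Stand-in for a remote vision API (Google-style)."""
    def upload_and_analyze(self, photo_bytes):
        # The raw photo crosses the network to a data center here.
        return {"faces": [{"face": True, "bbox": (10, 10, 64, 64)}]}

def detect_faces_local(photo_bytes, model):
    # Inference runs on the device; the photo never leaves it.
    return model.predict(photo_bytes)

def detect_faces_cloud(photo_bytes, client):
    # Inference runs in the data center; only results come back.
    return client.upload_and_analyze(photo_bytes)["faces"]

photo = b"\x89PNG..."  # placeholder bytes, not a real image
print(detect_faces_local(photo, OnDeviceModel()))
print(detect_faces_cloud(photo, CloudVisionClient()))
```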

As machine learning technology advances and processors with integrated machine learning hardware become more common, the scope of the computing that can be carried out at the network’s edge will increase.

So where does that leave the cloud and centralized data centers? Edge computing is great for real-time analysis and for filtering data to reduce its volume. But many of the insights provided by the IoT can’t be developed on the edge by smart devices working alone or in concert with a few other devices on a local mesh network.
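A minimal sketch of that filtering role, with illustrative names and thresholds: the device collapses a window of raw readings into one small summary and forwards raw samples upstream only when they look anomalous.

```python
from statistics import mean

def summarize_window(readings, anomaly_threshold=90.0):
    """Collapse a window of raw sensor readings into one small record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        # Raw samples are forwarded only when something looks anomalous.
        "anomalies": [r for r in readings if r > anomaly_threshold],
    }

raw = [71.2, 70.8, 71.5, 95.3, 70.9]  # e.g., one second of samples
print(summarize_window(raw))          # one record instead of five
```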

To be truly valuable, that data must eventually be aggregated and analyzed collectively. A smart device could tell you what the weather is today, but it takes a data center with sophisticated modeling and analytics working on a huge aggregation of data to predict future weather with any accuracy.
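As a toy illustration of why the aggregate view needs the data center, the sketch below pools per-site summaries and fits a simple least-squares trend across them. No single site's data would support the fit, and real forecasting models are, of course, far more sophisticated than a straight line.

```python
def fit_trend(points):
    """Ordinary least squares over (hour, temperature) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Summaries arriving from many edge sites: (hour, mean temperature).
site_summaries = [
    [(0, 14.1), (6, 16.0), (12, 21.2)],  # site A
    [(0, 13.8), (6, 15.7), (12, 20.9)],  # site B
    [(0, 14.4), (6, 16.2), (12, 21.5)],  # site C
]

# No single site has enough data; the pooled collection does.
pooled = [p for site in site_summaries for p in site]
slope, intercept = fit_trend(pooled)
print(f"pooled warming trend: {slope:.2f} degrees/hour")
```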

While workloads that once had to live in data centers will move out to the edge as the technology develops, edge computing won't take the place of the cloud. The network's edge is, after all, the cloud's edge, and data centers will remain vital to the storage, analysis, and delivery of data to end users.