Edge computing is a term you’re likely to hear more and more, as many companies find they need a solution that works outside of data centers and major cloud services. It’s important to understand its basic purpose, processes, and influence on the world around us.
Edge computing was created out of a need for more efficient use of bandwidth, lower latency and better security. As our devices become more connected, it’s no longer feasible to send all the data they generate to a central server, especially when time is a factor — such as when an autonomous car needs to respond quickly to its surroundings.
Edge computing isn’t just about moving storage and compute power closer to the user; it’s also about enabling new types of applications and services that simply aren’t possible with cloud-only models. From smart homes and autonomous cars to augmented reality and industrial IoT, edge computing represents a fundamental shift in how we approach distributed computing problems.
The edge computing model is radically different from the centralized architecture that has dominated the IT world for decades. It's a distributed processing model in which data is processed at the edge of the network, closer to the end user, rather than in a central location.
The idea behind edge computing is to bring data sources, processing and storage closer to areas where they’re needed most — such as at a remote location or in a mobile device. It’s not necessarily about replacing cloud computing but more about complementing it with services available locally.
Edge computing can be deployed in two ways: on-premises and off-premises.
On-premises edge computing relates to resources located on an organization's own premises. For example, when an employee works remotely, he or she needs access to important corporate data that may not be readily available outside of the office. To solve this problem, companies can deploy servers within their own offices, enabling employees to access data without having to go through the cloud.
Off-premises edge computing, by contrast, relies on resources hosted outside the organization's facilities but still close to where data is generated, such as at a telecom provider's edge sites or a nearby colocation facility.
Edge computing vs. Cloud computing
Traditionally, cloud computing has meant storing and accessing data over the Internet on centralized remote servers. Applications like email, calendar and contact management currently run in the cloud, but they need an active Internet connection to function properly.
In edge computing, instead of relying on centralized servers in a data center or in a cloud environment, applications run on local devices and communicate with each other directly. Edge computing networks can operate autonomously even while they are offline and disconnected from a central location.
Why edge computing matters
Edge computing is a more recent method of computing where data is processed closer to the edge of the network, at the source of the data.
In many ways, edge computing is the natural evolution of cloud computing. Here’s what you need to know about it, and how it may affect your business.
The edge computing decision matrix
Edge computing is not a single technology, but a collection of technologies that bring computing and data storage closer to the source of the data. The decision about which technology to use for each application depends on balancing the requirements for performance, latency and cost.
Sensors, cameras and other devices generate data in real time, but must operate under stringent power constraints. Low power processors can offload some computing tasks from the cloud or enterprise data center by running machine learning algorithms locally. Local inference reduces latency and bandwidth requirements, allowing devices to operate in environments where they are not connected to the internet.
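The local-inference pattern described above can be sketched in a few lines. Here a simple threshold check stands in for a real on-device ML model; the function names and the threshold value are illustrative assumptions, not from any particular library:

```python
# Sketch of edge-side inference: a cheap local "model" decides which
# readings are worth sending upstream, cutting bandwidth and latency.
# The threshold check is a stand-in for a real on-device classifier.

def local_inference(reading: float, threshold: float = 0.8) -> bool:
    """Pretend ML model: flag a reading as interesting if it exceeds a threshold."""
    return reading > threshold

def filter_for_upload(readings: list[float]) -> list[float]:
    """Only readings the local model flags are queued for the cloud."""
    return [r for r in readings if local_inference(r)]

readings = [0.1, 0.95, 0.3, 0.85, 0.2]
to_upload = filter_for_upload(readings)
print(to_upload)  # only the flagged readings ever leave the device
```

Because the uninteresting readings never cross the network, the device keeps working even when it has no connection at all, which is the point made above.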
When devices are connected to the internet, it’s still important to have local processing power for short-term storage and preprocessing of data before sending information to an offsite server for further analysis. This approach reduces latency by avoiding unnecessary roundtrips between endpoints and servers. Local processing also reduces network traffic and associated costs.
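The preprocessing step above often amounts to batching raw samples locally and shipping only a compact summary. A minimal sketch, assuming a simple min/mean/max summary is sufficient for the offsite analysis:

```python
# Sketch of edge preprocessing: buffer raw samples on the device and
# send a small summary record per batch instead of every raw sample.
from statistics import mean

def summarize_batch(samples: list[float]) -> dict:
    """Reduce a batch of raw readings to one small summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "mean": round(mean(samples), 3),
        "max": max(samples),
    }

batch = [21.0, 21.4, 22.1, 20.9]
summary = summarize_batch(batch)
print(summary)  # one record replaces four raw samples on the wire
```

Shipping one summary per batch instead of every raw sample is exactly the network-traffic reduction the paragraph describes; the batch size and the statistics kept would be tuned per application.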
Some companies rely on public cloud platforms for storing, processing and analyzing data from connected products at the edge. But many businesses prefer a mix of public cloud infrastructure with private servers hosted in their offices or colocation facilities. This combination lets them store sensitive business data onsite while taking advantage of the elasticity of public cloud platforms for large-scale processing and analytics.
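One way to reason about the decision matrix described above is a simple placement rule: latency-critical work stays on the device, sensitive or moderately latency-bound work goes to an on-premises edge server, and everything else goes to the public cloud. The tiers and cutoff values below are illustrative assumptions, not an industry standard:

```python
# Toy placement rule balancing latency requirements against
# data-governance needs. Cutoffs (10 ms, 100 ms) are illustrative.

def choose_placement(max_latency_ms: float, data_sensitive: bool) -> str:
    """Pick where a workload should run given its constraints."""
    if max_latency_ms < 10:
        return "device"        # hard real-time: process at the source
    if data_sensitive or max_latency_ms < 100:
        return "on-prem edge"  # keep sensitive data onsite, latency still low
    return "public cloud"      # elastic capacity for everything else

print(choose_placement(5, False))    # device
print(choose_placement(50, True))    # on-prem edge
print(choose_placement(500, False))  # public cloud
```

A real decision matrix would also weigh cost, power budget and regulatory constraints, but the shape of the trade-off is the same.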
Why does it matter?
In addition to enabling access when connectivity is limited or not available, edge computing has a number of benefits, including:
Lower latency: Moving data processing closer to where it’s being generated means less time for the data to travel. This reduces latency and improves performance.
Faster response times: Reducing latency can help applications respond much faster, especially for time-sensitive situations like monitoring for security threats or autonomous vehicles.
Less risk of losing data: In cases where data can only be sent occasionally due to connectivity limitations — such as in rural areas or during natural disasters — storing it at the edge can ensure that data isn’t lost.
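The last benefit above is often implemented as a store-and-forward queue: readings are held locally while the link is down and flushed once connectivity returns. A minimal in-memory sketch (a real device would persist the buffer to flash or disk so it survives a reboot):

```python
# Sketch of store-and-forward buffering at the edge: nothing is lost
# while the network is down; the backlog is uploaded when it returns.

class StoreAndForward:
    """Buffer readings while offline; flush them when the link comes back."""

    def __init__(self):
        self.buffer = []  # readings awaiting upload
        self.sent = []    # readings the server has received (simulated)

    def record(self, reading, online: bool):
        self.buffer.append(reading)
        if online:
            self.flush()

    def flush(self):
        # Simulate a successful upload of everything buffered so far.
        self.sent.extend(self.buffer)
        self.buffer.clear()

node = StoreAndForward()
node.record(1, online=False)  # link down: held locally
node.record(2, online=False)  # still down: backlog grows
node.record(3, online=True)   # link restored: backlog flushed
print(node.sent)  # [1, 2, 3] -- nothing was lost
```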
The future of edge computing revolves around two main trends
First, the amount of data that needs processing by edge computing devices continues to grow at a dramatic pace. Second, more and more real-time applications require data analysis and decision making at the edge.
In the coming years, we will see new types of edge computing architectures emerge to meet these challenges. We'll also see existing computing infrastructure adopt new hardware with enhanced capabilities, such as low-power chipsets in mobile devices or graphics processing units (GPUs) in data centers.
Edge computing is reshaping how our world operates, with data at the forefront of it all. Not every company has the resources to run its own edge infrastructure, and adopting it means a real shift in how we interact with computers and data. But those that do may prove to be the industry leaders of the future.