Edge computing: How it works
Blog: The Enterprise Project - Enterprise Technology
Edge computing, a technology approach that enables data to be processed and analyzed closer to its source, is poised for growth. Analysts at Grand View Research predict that the market for such solutions will grow from $3.5 billion to $43.4 billion by 2027.
Some enterprises are adopting edge capabilities to complement existing hybrid cloud strategies and better manage today’s ever-increasing volume of data, says David Williams, managing principal at digital consultancy AHEAD. For those not yet experimenting with edge approaches, it may be helpful to understand how edge computing works. And for IT leaders who need to explain it to non-technical audiences, we’ve gathered some plain language and examples to use. Let’s dig in.
[ Get a shareable primer: How to explain edge computing in plain English.]
Edge computing definitions
“Edge computing refers to a set of enabling technologies that move data storage, computing, and networking closer to the point of data generation or consumption and away from a centralized computing location,” explains Yugal Joshi, vice president at management consultancy and research firm Everest Group.
“For edge devices to be smart, they need to process the data they collect, share timely insights, and, if applicable, take appropriate action. Edge computing is the science of having the edge devices do this without the need for the data to be transported to another server environment,” says Red Hat chief technology strategist E.G. Nadhan. “Put another way, edge computing brings the data and the compute closest to the point of interaction.”
Think of the network evolution toward edge computing as a move from highways moving data to and from a centralized core location to something like a spider’s web of interconnected nodes of storage and processing devices, advises Dr. James Stanger, chief technology evangelist at CompTIA.
“Edge computing is the practice of capturing, storing, processing, and analyzing data near the client, where the data is generated, instead of in a centralized data-processing warehouse,” Stanger says. “Hence, the data is stored at intermediate points at the ‘edge’ of the network rather than always at the central server or data center.”
Edge computing is fundamentally about increasing speed and efficiency, according to Todd Loeppke, chief architect of Sungard Availability Services. It can also improve data security and optimize cloud investments, Joshi adds.
What does edge computing do? Picture the before and after
A good understanding of how edge works can begin with what came before. Prior to edge computing, data was collected from distributed locations outside the traditional data center. That data center could be in-house, co-located at a partner facility, or in the public cloud. The data was then sent to the data center, where it would be processed: either a decision was made based on the data, or the value of the data was determined. With the advent of edge computing, decisions can be made at the collection point or at a location physically close to the collection point.
“This significantly improves the time required to make a decision based on the data, which is critical for many use cases that utilize real-time decisions, such as autonomous cars communicating with each other,” Loeppke says. “Additionally, edge computing is much more efficient and reduces the volume of network traffic since all the data does not need to be pushed back to the data center.”
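Loeppke’s latency point lends itself to a back-of-envelope model. The sketch below is illustrative only: every timing figure is an assumption, not a measurement, but it shows why removing the network round trip matters more than raw processing speed.

```python
# Illustrative decision-latency model (every figure here is an assumption).
EDGE_PROCESSING_MS = 5    # local inference on modest edge hardware
CORE_PROCESSING_MS = 2    # faster hardware in the central data center
WAN_ROUND_TRIP_MS = 80    # round trip from the device to the central site

def decision_latency_at_edge():
    # The decision is made where the data is collected: no WAN round trip.
    return EDGE_PROCESSING_MS

def decision_latency_at_core():
    # The data travels to the data center and the decision travels back.
    return WAN_ROUND_TRIP_MS + CORE_PROCESSING_MS

print(decision_latency_at_edge())  # 5 (ms)
print(decision_latency_at_core())  # 82 (ms)
```

Even with faster processors at the core, the network round trip dominates, which is why real-time use cases such as vehicle-to-vehicle communication favor decisions at the edge.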
More efficient and less costly processing at the edge has use cases across industries. “Processing at the edge rather than in the cloud translates into significant cost reductions along with swifter, more efficient processing for tasks such as inspection, quality assurance, and better safety measures,” says Orr Danon, CEO of Hailo. “In 2021, factories will utilize cameras with AI edge processing technology on the production floor, not only to monitor equipment and production and enforce safety measures, but also to ensure workers are properly distanced from one another.”
Similarly, retailers collecting visual or geospatial intelligence will require substantial edge processing capabilities so that crucial data is processed on-premises without significant latency.
[ Want more edge computing examples? Read also: Edge computing: 5 examples of how enterprises are using it now. ]
Where do the cost savings come in? Data volumes continue to climb as ever more devices, applications, and people connect, says Rosa Guntrip, senior principal marketing manager, cloud platforms, Red Hat. “If all data needs to go back to a central data center for processing, organizations could be faced with needing to scale up their data center infrastructure to meet rising demands, which impacts costs from both a CapEx and OpEx perspective. In addition, if all of that data needs to go back to a central site, organizations are also looking at the costs of backhauling data (i.e., cost of bandwidth).”
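Guntrip’s backhaul-cost point can be made concrete with quick arithmetic. The inputs below (device count, reading size, summary interval) are invented for illustration, but the shape of the calculation holds for any deployment.

```python
# Back-of-envelope backhaul volumes; all inputs are illustrative assumptions.
DEVICES = 10_000
READINGS_PER_DEVICE_PER_DAY = 86_400   # one reading per second, per device
BYTES_PER_READING = 200

raw_bytes_per_day = DEVICES * READINGS_PER_DEVICE_PER_DAY * BYTES_PER_READING

# Suppose an edge site condenses each device's data into one summary per minute.
SUMMARIES_PER_DEVICE_PER_DAY = 1_440
BYTES_PER_SUMMARY = 200

summary_bytes_per_day = DEVICES * SUMMARIES_PER_DEVICE_PER_DAY * BYTES_PER_SUMMARY

print(f"raw backhaul:        {raw_bytes_per_day / 1e9:.1f} GB/day")      # 172.8 GB/day
print(f"with edge summaries: {summary_bytes_per_day / 1e9:.1f} GB/day")  # 2.9 GB/day
```

Aggregating at the edge cuts the bandwidth bill by roughly 60x in this hypothetical, before counting any savings from not scaling up the central data center.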
Edge and IoT: What’s the difference?
As organizations deploy an increasing number of Internet of Things (IoT) devices and connectivity grows more sophisticated, an edge computing ecosystem made up of billions of devices and servers is emerging.
Edge computing represents infrastructure and workloads deployed anywhere outside of the core, closer to where data is generated and consumed, explains Dave McCarthy, research director within IDC’s worldwide infrastructure practice focusing on edge strategies. This could be a remote or branch office. It could also be industry-specific field locations like factories, warehouses, hospitals, and retail stores. It may refer to on-device computing or the use of a micro data center.
“Edge computing is usually precipitated by hardware that exists outside of the data center, which continuously gathers or creates significant amounts of data. That hardware could be a cell phone running an application that receives real-time advertising based on the location of the signal, a remote camera being used for image recognition, or a sensor making real-time decisions,” says Loeppke. “The data created or captured is only useful for a short amount of time so the processor or computer utilizing the data is either co-located with the device or close by (network latency-wise).”
Let’s dispel a popular misconception: Edge is not just another name for IoT. Notes Red Hat technology evangelist Gordon Haff, “IoT is an important edge computing use case. For example, in a three-tier IoT architecture, sensor data often feeds into some sort of local gateway. The gateway may use that data to take an action that needs to happen quickly, such as stopping a vehicle. It can also filter and aggregate the data before sending it back to a data center for analysis and tracking purposes, thereby saving network bandwidth.”
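The three-tier pattern Haff describes can be sketched in a few lines. This is a minimal illustration, not a real gateway API: the names `process_batch`, `act_locally`, and `send_to_core`, and the threshold value, are all assumptions made up for the example.

```python
from statistics import mean

URGENT_THRESHOLD = 90.0  # assumed cutoff for readings needing immediate action

def process_batch(readings, act_locally, send_to_core):
    """Gateway logic: act on urgent readings locally, backhaul only a summary."""
    for value in readings:
        if value >= URGENT_THRESHOLD:
            act_locally(value)  # low-latency action, no trip to the data center
    # Aggregate before sending upstream, saving network bandwidth.
    send_to_core({"count": len(readings),
                  "mean": mean(readings),
                  "max": max(readings)})

# Toy run: three sensor readings, one of them urgent.
actions, uplink = [], []
process_batch([10.0, 95.0, 20.0], actions.append, uplink.append)
print(actions)  # [95.0] -> the urgent reading was handled at the gateway
print(uplink)   # a single summary dict replaces three raw readings
```

The urgent action happens at the gateway with no round trip, while the data center still receives enough aggregate detail for analysis and tracking.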
Edge computing isn’t just about IoT, though. An increasing number of other application areas, in telco and elsewhere, work best when services are pushed out closer to the humans and machines interacting with them.
How is edge computing implemented? Architecture options
When it comes to edge computing, enterprises can choose what edge architecture they need, says Joshi.
Most enterprise applications of edge computing take a distributed approach, according to McCarthy. “That means individual functions or modules within the application are deployed across a spectrum of on-premises, cloud, and edge infrastructure,” he says.
For enterprise applications, for example, developers can choose to run part of their code at edge locations. “This helps reduce network data movement and associated cost as well as time. They need to choose the architecture as well as relevant infrastructure to run these workloads,” Joshi says. “Many enterprise application vendors have invested in edge capabilities as they see it as an extension of ‘edge to cloud’ architecture and would want to have clients on their platforms for the entire chain.”
Is edge a part of hybrid cloud architecture? Yes. Let’s dig into that a bit deeper.
Edge vs. cloud computing: An extension of hybrid cloud architecture
Edge computing often provides an enterprise with an entirely new capability. But, Loeppke notes, it’s important to remember that this new capability will likely have a number of dependencies in order to operate as expected. “IT leaders need to communicate what those dependencies are to their business partners,” Loeppke says. “If the business has this high-level understanding of the architecture, they may be able to provide critical business insight that will assist the IT leaders/technologists in the creation of an optimal architecture.”
For example, a global data center is typically located based on access to network pipes and the cost of power, Loeppke says. “Edge computing is driven by the location of the data gathered or created, which may or may not have anything to do with access to network pipes and the cost of power.”
Some people may also assume that edge computing is an alternative to the cloud computing model, but in fact, they are complementary. “The two work together to overcome the limitations of any one deployment model,” says McCarthy, noting that it is possible to deploy cloud-native approaches in edge locations. A recent IDC survey found that 95 percent of new edge deployments will be based on cloud-native technology.
“Think of edge as an extension of hybrid architecture,” McCarthy advises. “Historically, hybrid was considered binary: some resources on-premises and some in the public cloud. The definition of hybrid is expanding to include on-premises, multiple public clouds, and a variety of edge locations.”