Running Toward the Edge
Exponential growth in bandwidth requirements, the explosion of IoT, and the constant pressure most organizations face to contain operational budgets are spurring a revolution in where intelligence resides in the network.
As the name implies, edge computing refers to the processing of data at the edge of a computer network -- rather than in a centralized cloud -- and thus closer to the source of the data. In essence, it pushes the frontier of computing to the logical extremes of a network.
"Intelligent edge" is the idea that tools normally associated with the core of networks are now being brought closer to the edge. This means tasks such as data processing, encryption, and analysis can happen closer to their sites of origin, instead of being transported to a data center in the network core. This enables analytics and knowledge generation to occur close to or at the source of the data.
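As a rough illustration of that idea (all names and figures here are hypothetical, not from any vendor's product), an edge node might filter and summarize raw sensor readings locally and forward only a compact result toward the core, instead of shipping every reading across the network:

```python
# Hypothetical sketch: an edge node summarizes raw sensor readings locally
# and forwards only the compact summary toward the data center.

def summarize_at_edge(readings):
    """Reduce a batch of raw readings to a small summary dict."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# A batch of raw temperature readings collected locally at the edge.
raw = [21.0, 21.4, 22.1, 20.9, 21.7]
summary = summarize_at_edge(raw)

# Only 'summary' (a handful of numbers) crosses the network,
# not every individual reading.
print(summary)
```

The point of the sketch is the division of labor, not the code itself: analysis happens where the data originates, and the core sees only results.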
Cloud computing, by contrast, relies on a network of Internet-hosted servers rather than local servers to store, manage, and process data. Computing and storage sit farther from the network edge, forcing organizations to add ever more bandwidth, absorb spiking network costs, and manage complex security policies and latency.
Industry watchers predict that by 2020, 50 billion things will connect to the Internet of Things (IoT) -- or, as it's sometimes called, the Internet of Overwhelming Things. In today's world, data collected at the edge (locally) traverses the network to be stored and analyzed in a premises- or cloud-based data center, only to have the results travel back over the network to where the people who need them are... at the edge.
Telecom and media companies face growing distribution pressures from increased video traffic and rapidly expanding bandwidth demands, combined with higher security and reliability requirements. Telecom service providers have begun using sophisticated compute and control systems to manage these requirements, and moving functions to intelligent edges lets them make services more competitive.
Among the companies paving the way for intelligent edge services are Hewlett Packard Enterprise (HPE), Verizon, Microsoft, and GE Digital. These industry giants aren't simply embracing this vision, but rather reorganizing around it, putting their considerable weight and resources behind it.
HPE is betting on hybrid data centers and intelligent edge tools for IoT. Its strategy revolves around simplifying hybrid IT, using technology from its Aruba unit to add intelligence to campus and branch networks and launching edge computing products. HPE will offer professional and management services to complement these tools. The strategy is reportedly part of a company rethink called HPE Next.
Similarly, Verizon is heavily investing in its intelligent edge platform, built around fundamental virtualization concepts such as software-defined networking and edge computing, and is currently deploying 5G networks in several markets. According to Verizon, having an intelligent edge is changing how the company runs its network. By making software the control point for the network, it becomes easier to automate services and share resources among different network services. The result is essentially a multi-service edge platform under software control, functioning like a cloud at both the core and the edge.
Having compute and storage capabilities at the edge reduces latency in a way that makes augmented reality and virtual reality possible. Convergence of the core network, coupled with the focus on the edge of the network, will likely give Verizon a significant cost advantage over companies deploying traditional services.
I recall reading that Microsoft predicts that within two years 45% of all data created by IoT will be stored, processed, analyzed, and acted upon close to the edge of the network. It holds this belief so firmly that it recently announced a reorganization aimed at emphasizing cloud services such as Azure Cloud and Office 365, and de-emphasizing Windows. Imagine that!
GE Digital is focused on developing products and services around an industrial distributed application platform, called Predix, optimized for high-volume, low-latency, and integration-intensive data management and analytics-driven outcomes. "Until recently, edge computing has been limited to collecting, aggregating, and forwarding data to the cloud. But what if instead of collecting data for transmission to the cloud, industrial companies could turn massive amounts of data into actionable intelligence, available right at the edge? Now they can," GE wrote in a company blog.
Among the primary drivers for edge computing are:
- Reduced latency -- Many real-time applications, including video and voice calling, require the lowest possible latency, and for some mission-critical functions compute must take place at the edge because added latency is intolerable. Decreasing the distance between the application and the data-analysis functions reduces the risk that a network issue will cause instability or higher latencies.
- Lower bandwidth requirements -- Because some services will be handled at the edge, companies will no longer need bandwidth for moving large amounts of data to the core of the network and back. Thus, they'll see a reduction in their overall bandwidth requirements.
- Decreased costs -- As a result of reduced bandwidth requirements, overall network costs will drop while efficiencies improve. In addition, if all collected data is sent to the cloud, there may be duplication in compute functions, hardware, and networking equipment. Removing duplicate resources can further reduce associated capital and operating costs.
- Increased reliability -- Retries, drops, and missed connections can plague edge-to-data-center communications, corrupting data along the way. Reducing the distance data has to travel reduces the chance that something on the network will affect it.
- Reduced threats -- Data is more vulnerable to attacks and breaches in transit than at rest, so processing data at the edge can reduce security exposure. In some cases, state or national laws govern remote transfer of data; keeping data local limits liabilities under such laws.
- Easier troubleshooting -- Segmenting devices on the network eases troubleshooting, allowing testing on local devices without impacting performance on the core or backbone.
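To put rough numbers on the bandwidth and cost points above (the figures are illustrative assumptions, not measurements from the article), consider a device that streams raw readings to the core versus an edge node that forwards one per-second summary:

```python
# Back-of-the-envelope sketch of bandwidth savings from edge aggregation.
# All figures are illustrative assumptions, not measurements.

READING_BYTES = 16       # assumed size of one raw sensor reading
READINGS_PER_SEC = 1000  # assumed raw sampling rate per device
SUMMARY_BYTES = 64       # assumed size of one per-second edge summary

# Streaming everything to the core vs. sending one summary per second.
raw_bps = READING_BYTES * READINGS_PER_SEC
edge_bps = SUMMARY_BYTES

reduction = raw_bps / edge_bps
print(f"raw: {raw_bps} B/s, edge: {edge_bps} B/s, {reduction:.0f}x reduction")
```

Even with these made-up numbers, the shape of the result explains the drivers list: less traffic crossing the network means lower bandwidth requirements, lower cost, fewer opportunities for corruption in transit, and less data exposed outside the local site.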
Shifting compute and storage functions to the edge allows organizations to improve network efficiencies and reduce costs, making them more competitive in a rapidly changing world. If your organization isn't considering edge computing, you may be left in the dust by those that are.
"SCTC Perspectives" is written by members of the Society of Communications Technology Consultants, an international organization of independent information and communications technology professionals serving clients in all business sectors and government worldwide.