The future of cloud computing is near the edge. Edge computing brings compute and storage to the network perimeter, closer to where they are used. According to the ECCE (Edge Computing Consortium Europe), “The Edge Computing paradigm describes an approach to execute certain services closer to devices and thereby supplements centralized Cloud Computing solutions”. However, this new architectural approach brings infrastructure and network challenges with it, including the IPAM repository and network functions such as DNS and application traffic routing.
By using data centers, either on-premises or in the cloud, we have solved many IT problems. Concentrating resources in the data center is a smart move: it allows optimization, increases infrastructure resiliency and, thanks to virtualization, reduces the energy consumption of the hardware. The movement towards cloud, however, highlights new challenges raised by regulation, new usages and security. These could push enterprises to move some data and workloads back closer to users and where they are consumed.
Moving content location to the edge
Bringing computation power and storage near the edge isn’t really an issue from a technological viewpoint: it relies on the same modern architectural templates used in data centers, while also embracing service continuity constraints and data regulation where necessary. The first obvious edge service that comes to mind is the Content Delivery Network (CDN). A CDN provides storage for caching, along with the intelligence to direct users to their content, and can optimize performance by storing static content in its local cache. While a CDN brings a lot of value for content delivery, it is limited to cached objects. Even so, moving these objects to the edge of the cloud reduces response delay for users and limits bandwidth consumption on the global network, clearly enhancing the user experience.
Edge computation for IoT and cloud
More and more use cases require computing power, not just caching of content. For example, your connected objects (IoT) may produce a large amount of data for analysis, but only a small portion of it is really required centrally. The trending and alerting model (think digital meters) can be run at an intermediate level, enabling multiple levels of optimization. Using edge computation power here is valuable: data used solely for triggering alerts can be quickly discarded, and no bandwidth is consumed because only events are sent back to the cloud. Users are given access only to valuable events, which are far more interesting than raw data. In addition, not all organizations have the capacity to store the huge volumes of data produced for potential future usage.
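The trending and alerting model described above can be sketched in a few lines. This is a minimal, illustrative example: the threshold value, reading format and event shape are assumptions for the sketch, not part of any real product API.

```python
# Edge-side filtering sketch: process raw meter readings locally and forward
# only threshold-crossing events upstream. Everything below the threshold is
# discarded at the edge, so no bandwidth is spent shipping raw data.

THRESHOLD = 80.0  # hypothetical alert level, e.g. percent of capacity


def filter_readings(readings):
    """Keep only readings that should be escalated to the central cloud."""
    events = []
    for reading in readings:
        if reading["value"] > THRESHOLD:
            events.append({"sensor": reading["sensor"],
                           "value": reading["value"],
                           "event": "threshold_exceeded"})
    return events


raw = [
    {"sensor": "meter-1", "value": 42.0},
    {"sensor": "meter-2", "value": 93.5},
    {"sensor": "meter-3", "value": 61.2},
]
print(filter_readings(raw))  # only meter-2 crosses the threshold
```

In a real deployment the filtering rules would likely be richer (trend windows, hysteresis), but the principle stays the same: the edge keeps the noise, the cloud gets the events.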
Regulation in specific regions forces companies to store and process data within that region (e.g. GDPR in Europe). Cloud approaches, especially public clouds, are not always suited to such location requirements. You may need a specific service offered by a cloud provider, but only in another region. This requires data to be transported back and forth between services and regions, which is complex to set up and mostly impossible to audit in a timely manner. Furthermore, cloud providers are not always forthcoming about the location of data transiting between their services.
Why do we need edge computing?
Usages for edge computing are extremely interesting and varied. More and more mobile objects require computation, but many lack sufficient local CPU power. Even if not all usages are business oriented, most will bring real benefit to their users or communities. We need to take good care of these new device families.
Data management and privacy
Sensors are collecting more and more data, which in some cases cannot be sent to the cloud due to technical constraints, so using local computation for analysis can be really beneficial. Imagine you need to analyze all the data collected by an airplane (engine, structure, weather conditions) during the time it is on the ground. Sending the data to the cloud for computation would require very large bandwidth as well as low transit delay, which is not always compatible with cloud (see long fat network). Another example is medical and health data, which may require storage in specific certified environments with high levels of control. Due to these strict constraints, public cloud providers, and most hosting operators, may not be suitable for protecting personal health data.
Internet of Things
IoT (Internet of Things) is certainly a vast topic. Even if the media sometimes misuse the terminology, usages are increasing. IoT covers a large variety of devices, from the light bulb to the autonomous vehicle. Edge computing can bring benefits for storing data for intermediate processing, synchronous-to-asynchronous gateways, anonymization, event triggering or machine learning execution (e.g. for predictive maintenance). Some devices may require real-time computation power, as in industrial AR/VR applications or maintenance operations. Other usages may require video and audio processing with very short delays for continuous and smooth human interaction: sending data for analysis to a central data center with 150ms transit delay is not compatible with human real-time perception (<100ms). For example, take a look at the transit delays from your browser to Azure blob object storage with this tool.
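The latency budget above can drive a simple placement decision: serve interactive workloads from whichever site fits under the ~100ms human real-time threshold. The site names and measured values below are illustrative assumptions, just to show the selection logic.

```python
# Latency-aware site selection sketch: given measured round-trip times (ms)
# per site, pick the closest site that fits the real-time budget, or report
# that no site can serve the interactive workload.

REALTIME_BUDGET_MS = 100.0  # human real-time perception limit cited above


def pick_site(latencies_ms):
    """Return the name of the fastest site under budget, or None."""
    candidates = {site: rtt for site, rtt in latencies_ms.items()
                  if rtt < REALTIME_BUDGET_MS}
    if not candidates:
        return None  # no site is responsive enough for real-time interaction
    return min(candidates, key=candidates.get)


# Example measurements: a nearby edge site vs a distant cloud region.
measured = {"edge-paris": 12.0, "cloud-us-east": 150.0}
print(pick_site(measured))  # the edge site is the only one under 100 ms
```

A production system would measure continuously and smooth out jitter; the point here is that the 100ms figure turns directly into a routing rule.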
Smart things and IP
We also have the domain of monitoring and alerting. Industries and utilities haven’t waited for the Internet of Things to make use of information collected from the field; the main usages include enhancing services, monitoring activities and billing customers. Industrial sensors are everywhere, and the data and alerts they manage are really important in our day-to-day lives. Devices have moved to special radio networks (e.g. LoRa, Sigfox), some of which use IP. With these smart things, we are generally talking about smart metering and smart cities.
Another important subject covers tracking and inventory of assets. This can be in a large warehouse, or in a harbour full of containers, where assets are constantly moving and decisions need to be taken by handling vehicles or preparation robots. The data from lidars and tracker mapping systems needs to be handled quickly, and is volatile enough not to be worth storing. In this space, not all devices are intelligent or IP-connected, so we need a gateway between the IP world and the “thing world” (e.g. RFID tags). This kind of transformation demands proximity to the usage, making edge computing an ideal solution.
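Such a gateway can be pictured as a small translation step at the edge: raw non-IP identifiers come in, structured IP-transportable events come out. The tag registry, identifiers and event fields below are hypothetical, purely for illustration.

```python
# Edge gateway sketch bridging non-IP devices (e.g. RFID tags) into the IP
# world: a raw tag read is looked up in a local registry and translated into
# a structured asset event that can be forwarded over IP (e.g. via MQTT).

TAG_REGISTRY = {
    "04A1B2": {"asset": "container-118", "zone": "dock-3"},
    "04C9D4": {"asset": "forklift-7", "zone": "warehouse-A"},
}


def translate_tag_read(tag_id, timestamp):
    """Map a raw RFID tag read to an IP-transportable asset event."""
    entry = TAG_REGISTRY.get(tag_id)
    if entry is None:
        return None  # unknown tag: drop at the edge rather than forward noise
    return {"asset": entry["asset"], "last_zone": entry["zone"],
            "tag": tag_id, "ts": timestamp}


print(translate_tag_read("04A1B2", 1700000000))
```

Keeping the registry lookup at the edge means unknown or duplicate reads never consume uplink bandwidth, which matters when thousands of tags move through a gate per hour.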
Farm monitoring uses sensors for which automation could be provided at the edge of the network. The wide variety of sensor uses includes animals with smart collars (mainly for localization and health monitoring), crop watering through sprinklers, and autonomous vehicles for specific treatments.
All these devices require quick interaction with local computing resources, as the cloud could be far away. Most farms are not well covered by high-speed network operators, so access to cloud resources may not be responsive enough for some usages. The amount of data required for inferring and acting can be really large and volatile, meaning that pushing it to any cloud workload service is probably not optimal.
Weather monitoring can also take advantage of edge computing: by analyzing data from local sensors on the farm alongside global data gathered from providers, quick decisions can be taken which are valid locally.
Vendors and solutions are moving to the edge
The list of edge computing needs is very long and usages are evolving with disruptive approaches that require IT technology to adapt quickly. Fortunately, new solutions are becoming available every day. We see three very different families of edge computing approaches:
- Cloud providers are proposing adaptation of their existing services or new ones to cover the edge: IoT data collection and asynchronous transport, function as a service (FaaS) at the edge for first line of calculation and processing of data, and even installation of part of their infrastructure in enterprise offices.
- Pure players offering to execute workloads at the edge in multiple smaller data centers. Generally coming from the CDN market, they all propose CDN alongside their storage and compute offers.
- Software and open source proposals, with mainly two frameworks: StarlingX from OpenStack and LF Edge from the Linux Foundation.
To complement these three families, the new generation of mobile networks proposes in its standards a way for providers and enterprises to host computation power for mobile devices at the edge of the network (see 5G Multi-access Edge Computing – MEC). This solution is directly linked to the NFV approaches that help virtualize the network.
But what’s the impact on infrastructure?
Moving part of the storage and compute power from central clouds to the edge requires attention to the way IT systems are built. Obviously, virtualization and technologies like containers will help move application chunks from one location to another with enough abstraction to limit the impact. Networks will have to adapt as well: multicast for service discovery, integration of repositories directly in the network, SDN, NFV, and a zero trust approach to security. Log and analytics systems will be mandatory in order to know which job has been processed where, not only for billing purposes but also for traceability.
One serious challenge with extending the cloud to edge computing concerns management of the IP space: more precisely, IP inventory and routing, naming conventions, resource inventory, device management, DNS and application traffic management. All these subjects are considered commodities by enterprises today, but engineers will need to reconsider their views in order to effectively push storage and workloads into totally virtual environments at the edge. Relying on an IPAM solution tightly coupled to this ecosystem and to engines like the DNS will ease the movement and allow continuity in the way IT systems are operated. Handling complexity requires good understanding, rigor, an accurate dynamic inventory and, without a doubt, smart automation.
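To make the application traffic management part concrete, a GSLB-style decision can be sketched as picking, among healthy sites, the one closest to the client and answering with its address. The site list, addresses and distance metric below are invented for the example and do not reflect any vendor's actual implementation.

```python
# GSLB-style traffic routing sketch: answer a name lookup with the IP of the
# closest healthy site (edge or central), falling back to None if every site
# is unhealthy. "distance" stands in for any proximity metric (RTT, topology).

SITES = [
    {"name": "edge-paris",  "ip": "203.0.113.10", "healthy": True, "distance": 1},
    {"name": "edge-madrid", "ip": "203.0.113.20", "healthy": True, "distance": 3},
    {"name": "cloud-core",  "ip": "198.51.100.5", "healthy": True, "distance": 9},
]


def gslb_answer(sites):
    """Return the IP of the closest healthy site, or None if all are down."""
    healthy = [s for s in sites if s["healthy"]]
    if not healthy:
        return None
    return min(healthy, key=lambda s: s["distance"])["ip"]


print(gslb_answer(SITES))
```

The interesting part is the coupling: the same inventory that the IPAM maintains (sites, addresses, health) is what the DNS/GSLB engine consults on every answer, which is why keeping the two tightly integrated eases operations at the edge.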
- Successful IoT Projects: The Key Role of Automation
- What is IPAM and why is it important?
- Why SDDC is Powerless without Automated DDI Provisioning
Want to learn more about edge solutions?
Get in touch with us to learn more about cloud and edge DNS GSLB solutions from EfficientIP.