Edge computing

Ameya Upalanchi
10 min read · Jan 8, 2022

published by B.Tech GRP - 11

Ameya Upalanchi (ETB-66)

Suyash Soni (ETB-60)

Shreenath Naikwade (ETB-11)

Rushikesh Chandak (ETA-19)

Edge computing hardware

Edge computing hardware refers to the physical components and the surrounding services that are needed to run an application at the edge. These components include servers, processors, switches & routers, and the end device. To learn about other parts of the edge value chain, use our Edge Ecosystem Tool.

Processors

Processors comprise the CPU, GPU, and memory and storage. The CPU largely determines the performance of an edge computing system: more CPU cores let the system handle more workloads and complete tasks faster. GPUs provide hardware acceleration and enable high-performance computing at the edge, and they help edge computers store, process, and analyze large volumes of data. More recently, processors are being optimized and purpose-built for edge and IoT, with built-in AI accelerators and 5G support.

Servers

Servers are the hardware that runs the compute at an edge location, and they house the processor. Servers can be commercial off-the-shelf or specialized (depending on the processor), and they may be more or less suited to different use cases based on their specifications and location. They include CDN edge servers, network edge servers, and on-premises edge servers. Find out more about edge servers in our article: What is an edge server?

Routers & switches

An edge router is a device deployed as a gateway between networks; it also connects local networks to the internet or a WAN. An edge switch (also known as an access node) sits at the meeting point of two separate networks and connects end-user local area networks to internet service provider networks.

How edge will impact hardware

Edge computing has a multitude of applications that operate in different conditions and locations, and their hardware requirements differ by use case and industry. For example, autonomous vehicles need real-time decision making to control the vehicle, so high-performance hardware is a priority given the large amounts of data processed in real time; at the same time, the limited space in a vehicle makes hardware size a constraint.

Additionally, for industrial use, edge computing hardware should be rugged and able to withstand the shocks, vibrations, extreme temperatures, and dust of harsh operating environments.

To achieve this, a fanless, fully enclosed design can be used: with no vents needed to cool the system, dust and dirt cannot enter the computer and cause damage. To prevent vibration damage, a cableless design can be used, which leaves fewer moving parts and reduces the chance of a loose connection causing a defect. Data storage can be optimized by using solid-state drives (SSDs), which are silicon chips, instead of hard disk drives (HDDs), which use spinning disks, as SSDs allow faster data transfer and storage. There is also less chance of data loss in accidents, because fewer moving parts make the system less susceptible to damage from vibration and shock.

Because edge servers store and process large amounts of data, they heat up rapidly, so effective cooling systems are required. Air cooling is the most common approach; however, liquid cooling is increasingly used in high-performance machines because of its greater heat-capturing capacity. There are also initiatives to create sustainable and energy-efficient power solutions for edge computing hardware. Over 40% of data center energy consumption goes to cooling, and given the large power draw of edge computing hardware there is a push for more efficient cooling systems and renewable power sources. Find out more about reducing power consumption in our article: Edge computing — Changing the balance of energy in networks

Examples of edge hardware

There are many companies developing products and solutions in the edge hardware space, across the value chain, from processors to servers and supporting services such as power and cooling. Below is a small sample of some of the new and existing ecosystem players and their innovations in edge hardware. These companies have been taken from our 60 Edge Companies article.

Intel

Intel is an American multinational chipmaker that primarily develops and manufactures processors, but it also offers vision processing units (VPUs), field-programmable gate arrays (FPGAs), and networking and connectivity products, in addition to a line of SSDs. Intel has made a large commitment to the edge through its recent acquisitions of edge hardware providers such as Habana Labs and Movidius. The company is also collaborating with other players in the edge space, such as Red Hat, to create products such as a workload-optimized data node configuration for Red Hat OpenShift using Intel Xeon processors and Optane technology. Recently, in a business reshuffle under new CEO Pat Gelsinger, Intel reiterated its commitment to the edge by creating a core business unit for edge computing. Looking ahead, Intel’s Managing Director in India believes that edge computing will become “more and more prominent”.

Hewlett Packard Enterprise

Hewlett Packard Enterprise (HPE) is an American multinational technology company that offers a wide variety of edge solutions, including edge services, edge security, and converged edge systems. Its specialized edge products include converged edge systems that are ruggedized for a variety of harsh operating environments. It also provides standalone edge server blades, compact devices that distribute and manage data in a network. HPE is committed to the edge, with CEO Antonio Neri stating that “the enterprise of the future is edge-centric, cloud-enabled and data-driven” and creating an “intelligent edge practice” for the company. Furthermore, HPE takes a “partnership first” approach to achieving its “edge to cloud vision” through strong collaboration and innovation.

A Secure IoT Service Based on Cloud and Edge Computing

The IoT-Cloud combines the Internet of Things (IoT) and cloud computing, which not only enhances the IoT’s capability but also expands the scope of its applications. However, it exhibits significant security and efficiency problems that must be solved. Internal attacks account for a large fraction of the security problems, yet traditional security strategies cannot address these attacks effectively. Moreover, as repeated or similar service requests grow in number, the efficiency of IoT-Cloud services is seriously affected.

In this blog, a novel architecture is presented that integrates a trust evaluation mechanism and service templates with balance dynamics, based on cloud and edge computing, to overcome these problems. In this architecture, the edge network and the edge platform are designed to reduce resource consumption and to ensure the extensibility of the trust evaluation mechanism, respectively. To improve the efficiency of IoT-Cloud services, a service parameter template is established in the cloud and a service parsing template is established on the edge platform.
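To make the template idea concrete, here is a minimal sketch (not the paper’s implementation) of how an edge platform might keep a locally cached parsing template so that repeated or similar requests are resolved at the edge, with only an occasional lookup of the cloud-side parameter template. All names, fields, and the 300-second time-to-live are illustrative assumptions.

```python
# Hypothetical sketch: the edge platform consults a local "service parsing
# template" before asking the cloud's "service parameter template" store,
# so repeated/similar requests stay at the edge.
# All names, fields, and the 300-second TTL are illustrative assumptions.

import time
from typing import Callable


class EdgeTemplateCache:
    """Caches parsing templates for repeated or similar service requests."""

    def __init__(self, fetch_from_cloud: Callable[[str], dict], ttl_s: float = 300.0):
        self._fetch_from_cloud = fetch_from_cloud   # pulls the cloud-side template
        self._templates: dict[str, tuple[dict, float]] = {}
        self._ttl_s = ttl_s

    def resolve(self, service_type: str, request: dict) -> dict:
        """Answer a request at the edge when a fresh template exists,
        fetching the matching template from the cloud only once per TTL."""
        cached = self._templates.get(service_type)
        if cached is None or time.time() - cached[1] > self._ttl_s:
            template = self._fetch_from_cloud(service_type)   # one cloud round trip
            self._templates[service_type] = (template, time.time())
        else:
            template = cached[0]
        # Parse the request locally: fill missing fields with template defaults.
        return {key: request.get(key, default) for key, default in template.items()}


# Usage: the second, similar request is resolved entirely at the edge.
def fake_cloud_lookup(service_type: str) -> dict:
    # Stand-in for the cloud's service parameter template store.
    return {"sensor_id": None, "interval_s": 60, "unit": "celsius"}


cache = EdgeTemplateCache(fake_cloud_lookup)
print(cache.resolve("temperature_report", {"sensor_id": "t-17"}))
print(cache.resolve("temperature_report", {"sensor_id": "t-18", "interval_s": 30}))
```

In this sketch the second request of the same service type never leaves the edge, which is the efficiency gain the template mechanism aims for.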

The IoT is based on a very large number of objects/things that connect to the Internet to help humans perceive the world and improve their quality of life. However, the IoT has characteristics, such as limited storage and processing capacity, that can reduce its service performance. Cloud computing can address these limitations in terms of management, storage, computation, and processing. Moreover, cloud computing can create more services by integrating IoT resources. Because of these advantages, the concept of IoT-Cloud was proposed: it combines the advantages of IoT and cloud computing technologies to provide more and better services. However, there are still some security and efficiency problems with IoT-Cloud that must be solved.

With the increase in the number of IoT-Cloud applications, more and more repeated or similar requests are sent to the cloud. It is inefficient for an IoT-Cloud system to address these requests one by one, and constantly improving hardware performance is not a long-term solution. In addition, IoT-Cloud services suffer from delay issues, since the cloud is far away from users and the IoT.

To solve the above problems, an edge-based IoT-Cloud architecture with a trust evaluation mechanism and service templates was established. Edge computing is performed at the Internet’s edge on many computing and storage nodes, such as gateways, routers, mobile fog nodes, and edge servers, which sit close to the underlying network. Edge computing also encompasses cloudlets, micro data centers, and fog nodes, which can respond to service requests more quickly than the cloud. The edge computing layer in this architecture is divided into two main parts: the edge network and the edge platform.

The edge network is established on underlying edge nodes (mobile or static powerful nodes) and runs parallel to the IoT.

The edge platform is composed of edge nodes (edge servers) that lie between the IoT and cloud, and this platform is a central hub of the IoT-Cloud service architecture.
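As a rough illustration of how these two parts might fit together, here is a hypothetical sketch (assumed class names and a 0.6 trust threshold, not the paper’s code): the edge network aggregates trust reports from devices, and the edge platform, acting as the central hub, uses those scores to select trusted devices for a service task.

```python
# Hypothetical sketch of the two edge-layer roles described above.
# The edge network aggregates trust reports from IoT devices; the edge
# platform (the central hub) uses those scores to select trusted devices
# for a service task. Names and the 0.6 trust threshold are assumptions.

from collections import defaultdict


class EdgeNetwork:
    """Sits parallel to the IoT and tracks per-device trust reports."""

    def __init__(self):
        self._reports = defaultdict(list)   # device_id -> list of trust scores

    def report(self, device_id: str, trust_score: float) -> None:
        self._reports[device_id].append(trust_score)

    def trust_of(self, device_id: str) -> float:
        scores = self._reports.get(device_id, [])
        return sum(scores) / len(scores) if scores else 0.0


class EdgePlatform:
    """Central hub between the IoT and the cloud."""

    def __init__(self, edge_network: EdgeNetwork, min_trust: float = 0.6):
        self.edge_network = edge_network
        self.min_trust = min_trust

    def select_trusted_devices(self, candidates: list[str]) -> list[str]:
        """Pick devices trusted enough to generate or transfer data."""
        return [d for d in candidates
                if self.edge_network.trust_of(d) >= self.min_trust]


# Usage: only sufficiently trusted devices are chosen for the task.
network = EdgeNetwork()
for device, score in [("cam-1", 0.9), ("cam-2", 0.3), ("cam-3", 0.7)]:
    network.report(device, score)
platform = EdgePlatform(network)
print(platform.select_trusted_devices(["cam-1", "cam-2", "cam-3"]))  # ['cam-1', 'cam-3']
```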

The main contributions can be summarized as follows:

1) An edge network was adopted to move a large part of the trust evaluation mechanism out of the IoT. In the IoT, devices perform the direct trust calculation and send exceptions to the edge network; the edge network collects trust information from devices and analyzes it to assess the overall trust state of the IoT (a minimal sketch of such a direct trust calculation follows this list).

2) Service templates were established in the cloud and on the edge platform. The service parameter template in the cloud stores the matching information while the service parsing template on the edge platform stores the matching information and parsing strategies.

3) The trust evaluation mechanism was integrated into IoT-Cloud services via edge computing. In the process of establishing an IoT service strategy, the trust evaluation mechanism enables the edge platform to select trusted devices to generate or transfer data. Moreover, the edge network can monitor the IoT network load with balance dynamics and assist the edge platform in adjusting strategies in a timely manner.
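For contribution 1, the following is a minimal, hypothetical sketch of the device-side direct trust calculation: a time-decayed success ratio over recent interactions, with an exception sent to the edge network when the score falls below a threshold. The exponential-decay weighting and the 0.4 threshold are assumptions, not the paper’s formula.

```python
# Hypothetical sketch of a device-side direct trust calculation.
# A device scores a neighbour from successful vs. failed interactions and
# reports an exception to the edge network when trust drops too low.
# The exponential-decay weighting and the 0.4 threshold are assumptions.

import math
import time


class DirectTrust:
    def __init__(self, half_life_s: float = 3600.0, threshold: float = 0.4):
        self.half_life_s = half_life_s      # older interactions count less
        self.threshold = threshold
        self.history: list[tuple[float, bool]] = []   # (timestamp, success)

    def record(self, success: bool) -> None:
        self.history.append((time.time(), success))

    def score(self) -> float:
        """Time-decayed ratio of successful interactions, in [0, 1]."""
        now = time.time()
        good = bad = 0.0
        for ts, ok in self.history:
            w = math.exp(-math.log(2) * (now - ts) / self.half_life_s)
            if ok:
                good += w
            else:
                bad += w
        return good / (good + bad) if (good + bad) > 0 else 0.5  # neutral prior

    def check_and_report(self, report_exception) -> float:
        s = self.score()
        if s < self.threshold:
            report_exception(s)     # e.g. send an alert to the edge network
        return s


# Usage: after several failed interactions the device raises an exception.
trust = DirectTrust()
for ok in [True, False, False, False]:
    trust.record(ok)
trust.check_and_report(lambda s: print(f"exception: trust fell to {s:.2f}"))
```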

Benefits of Edge Computing

1. Speed and latency: Speed is at the core of the business.

In the digital factory, where intelligence-based technologies constantly monitor every part of the manufacturing process to maintain data consistency, milliseconds matter. Edge computing can improve network performance by reducing latency: processing data closer to its source shortens the physical distance it has to travel (a back-of-the-envelope comparison follows this list).

2. Security: Edge computing also helps businesses meet local compliance and privacy requirements and address data sovereignty concerns. Its architecture makes it easier to implement security protocols that seal off a compromised portion of the network without shutting down the entire network. Edge data centers also provide an additional security layer to protect against cyber threats.

3. Scalability: Computing, storage, and analytics capabilities are increasingly being packed into smaller devices that can be placed closer to end users. IoT devices, along with their processing and data management capabilities, can be installed at the edge in a single implementation.

4. Greater reliability: Edge data centers let providers serve end users efficiently, with minimal physical distance and latency. This is particularly useful for content providers who want to offer continuous streaming services.

5. Cost savings: Data created at the edge must, at the very least, be held there temporarily. If it is then sent to the cloud, it must be stored again, which creates redundancy. Reducing redundant storage reduces redundant cost.
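To put a rough number on the latency point in item 1, here is a back-of-the-envelope sketch comparing round-trip propagation delay to a distant cloud region versus a nearby edge site. The distances and the assumption that signals travel through fibre at roughly two-thirds the speed of light are illustrative; real latency also includes queuing, routing, and processing time.

```python
# Back-of-the-envelope propagation delay: distant cloud vs. nearby edge.
# Assumes signals travel through fibre at roughly 2/3 the speed of light;
# queuing, routing, and processing delays are ignored for simplicity.

SPEED_IN_FIBRE_KM_PER_MS = 200.0   # ~2/3 of c, expressed in km per millisecond


def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given distance."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS


for label, km in [("cloud region (assumed 2,000 km away)", 2000),
                  ("edge site (assumed 20 km away)", 20)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
# cloud region (assumed 2,000 km away): ~20.0 ms round trip
# edge site (assumed 20 km away): ~0.2 ms round trip
```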

Edge Computing and Its Applications

Edge computing refers to computing that delivers low latency by running closer to where requests originate. It originated in the late 1990s with content delivery networks built to serve web and video content. In the early 2000s, these networks evolved to host applications and application components on edge servers, resulting in the first commercial edge computing services, which hosted applications such as dealer locators, shopping carts, real-time data aggregators, and ad insertion engines.

So why is it needed?

The world’s data is expected to grow 61% to 175 zettabytes by 2025. According to the research firm Gartner, around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud; by 2025, the firm predicts that this figure will reach 75%. The growth of IoT devices at the edge of the network is producing massive amounts of data, and storing and using all of that data in cloud data centers pushes network bandwidth requirements to the limit. Despite improvements in network technology, data centers cannot guarantee acceptable transfer rates and response times, which is a critical requirement for many applications. Furthermore, devices at the edge constantly consume data coming from the cloud, forcing companies to decentralize data storage and service provisioning and to leverage physical proximity to the end user.

Applications

There are many applications of edge computing, such as:

  • Autonomous vehicles.
  • Remote monitoring of assets in the oil and gas industry.
  • Smart grid.
  • Predictive maintenance.
  • In-hospital patient monitoring.
  • Virtualized radio networks and 5G.
  • Cloud gaming.
  • Content delivery.
  • Traffic management.
  • Smart home.
