Containerisation: Harmonising workloads at the edge

Cloud and centralised data centres have dominated the IT compute discourse over the past decade. These approaches leverage economies of scale to significantly decrease the marginal cost of system operation and administration, and to lower the capital expenditure needed for scaling.

More recently, mobile computing and Internet of Things (IoT) applications have given rise to a decentralised computing approach. The edge computing paradigm serves these applications more efficiently and economically by positioning computing and storage resources at the edge, closer to data sources, sensors, and mobile devices. The advance of blockchain-based solutions reinforces the edge computing movement, offering a new way for intelligent edge devices to exchange valuable insights without exposing the underlying data.

Edge computing deploys resources at the edge, delivering highly responsive computing services for mobile applications, scaling easily, and offering privacy advantages for IoT applications. As the computing paradigm shifts to include edge servers, lightweight containerisation solutions are fast becoming the standard for application packaging and orchestration.

With containerisation technology, applications built for the edge can also run in the remote data centre and vice versa. Moving large amounts of on-premises data to the data centre can be costly and, given the limited bandwidth of communication channels, can delay responses. In some industries, replicating on-premises data to data centres also faces regulatory constraints.

The design and deployment of edge-specific workloads must primarily address the following challenges:

  • The management complexity of distributed workloads
  • Increased security risks
  • Latency and bandwidth limitations

Containerisation solutions address these challenges effectively by:

  • Managing application deployment across various infrastructure types and any number of devices
  • Seamlessly and reliably deploying applications across distributed infrastructure
  • Remaining open, maintaining flexibility and easily adapting to evolving requirements
  • Implementing the latest security best practices across hybrid workloads

Containerisation provides the means to harmonise workloads, helping with modernisation and abstracting applications from the underlying infrastructure, so DevOps teams can approach deployment at the edge with the same set of tools they have traditionally used in data centres and the cloud.
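To make that point concrete, here is a minimal sketch of a standard Kubernetes Deployment manifest of the kind such a team might apply unchanged to either an edge cluster or a data-centre cluster. The image name, registry, and labels are illustrative placeholders, not part of any HPE product.

```yaml
# Minimal Deployment manifest. The same spec can be applied with kubectl to an
# edge cluster or a data-centre cluster without modification.
# Image name, registry, and labels are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-gateway
  labels:
    app: sensor-gateway
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sensor-gateway
  template:
    metadata:
      labels:
        app: sensor-gateway
    spec:
      containers:
        - name: sensor-gateway
          image: registry.example.com/sensor-gateway:1.0.0
          ports:
            - containerPort: 8080
```

In either location the workflow is the same kubectl apply step, which is precisely the abstraction a harmonised approach provides.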

HPE Ezmeral – the enterprise containerisation solution

HPE Ezmeral Container Platform facilitates the deployment and management of containerised enterprise applications at scale. Ezmeral supports both cloud-native applications and non-cloud-native monolithic applications with persistent data. Prominent use cases for Ezmeral include machine learning, analytics, IoT/edge, DevOps (CI/CD), and application modernisation.

Kubernetes is part of Ezmeral’s offering. Kubernetes has emerged as the open-source standard for container orchestration, providing the fundamental building blocks for cloud-native architectures. The HPE Ezmeral Container Platform combines technical innovations from HPE’s acquisitions of BlueData and MapR with open-source Kubernetes for orchestration. BlueData brings a proven track record of deploying non-cloud-native AI and analytics applications in containers, while MapR brings a state-of-the-art file system and data fabric for persistent container storage. With the HPE Ezmeral Container Platform, users can extend the agility and efficiency benefits of containers to more enterprise applications, regardless of where and how they run (bare metal, virtualised infrastructure, on-premises, multiple public clouds, or the edge).

Ezmeral is a turnkey solution built on 100% open-source Kubernetes that brings consistent processes and standard services to cloud-native and non-cloud-native apps. The solution delivers improved agility, increased efficiency, and a cloud-like experience for non-cloud-native apps, giving developers working with monolithic applications greater parity with their cloud-native counterparts.
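As a rough illustration of what persistent data for a non-cloud-native app looks like in a containerised environment, the sketch below uses standard Kubernetes objects only (a PersistentVolumeClaim and a single-replica Deployment). The storage class, image, and names are placeholders rather than Ezmeral-specific values.

```yaml
# PersistentVolumeClaim plus a single-replica Deployment for a monolithic
# application that expects a durable filesystem. Standard Kubernetes objects
# only; storageClassName and image are placeholders.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: legacy-app-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: example-datafabric   # placeholder for the cluster's storage class
  resources:
    requests:
      storage: 20Gi
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: legacy-app
spec:
  replicas: 1
  strategy:
    type: Recreate        # avoid two pods contending for the same ReadWriteOnce volume
  selector:
    matchLabels:
      app: legacy-app
  template:
    metadata:
      labels:
        app: legacy-app
    spec:
      containers:
        - name: legacy-app
          image: registry.example.com/legacy-app:2024.1   # placeholder image
          volumeMounts:
            - name: data
              mountPath: /var/lib/legacy-app
      volumes:
        - name: data
          persistentVolumeClaim:
            claimName: legacy-app-data
```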

The rapid evolution of edge computing

The edge computing concept moves computing applications, data, and services from centralised data centres to the edge of the network. The main objective is to place data-processing services near the source of the data. Edge computing is closely related to the IoT. While centralised cloud computing provides a holistic view of data and operations, the edge is responsible for localised views.

Edge computing environments are resource-limited and can only tolerate lightweight, simplified software runtimes. The performance of any edge solution is measured by deployment time, responsiveness, scalability, and flexibility. Containerisation has emerged as the way to easily package, deploy, and orchestrate edge applications that satisfy these stringent requirements.
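As one example of keeping a workload inside an edge node’s small footprint, the sketch below sets explicit CPU and memory requests and limits on a container using standard Kubernetes fields. The image name and the figures are illustrative assumptions.

```yaml
# A container with explicit resource requests and limits so the workload stays
# within the small CPU and memory budget of an edge node.
# Image name and resource figures are illustrative assumptions.
apiVersion: v1
kind: Pod
metadata:
  name: edge-collector
spec:
  containers:
    - name: collector
      image: registry.example.com/edge-collector:1.2.0
      resources:
        requests:
          cpu: 100m
          memory: 64Mi
        limits:
          cpu: 250m
          memory: 128Mi
```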

Containerised applications handle edge workloads flexibly, supporting easy upgrades and continuous deployment. This is particularly important as security vulnerabilities evolve: containerisation technologies and container orchestration enable developers to swiftly build and roll out atomic security updates or new features without disrupting the day-to-day operation of IoT and edge solutions.
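A sketch of what such an atomic update can look like in practice: the standard Kubernetes rolling-update strategy below replaces replicas one at a time, so a patched image reaches every instance without taking the service offline. The names, tags, and replica counts are illustrative.

```yaml
# Rolling-update strategy for a Deployment: replicas are replaced one at a
# time, so a patched image (for example, a base-image CVE fix) reaches every
# instance without downtime. Names and image tags are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-gateway
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1   # at most one replica down during the rollout
      maxSurge: 1         # at most one extra replica created during the rollout
  selector:
    matchLabels:
      app: sensor-gateway
  template:
    metadata:
      labels:
        app: sensor-gateway
    spec:
      containers:
        - name: sensor-gateway
          image: registry.example.com/sensor-gateway:1.0.1   # patched tag
```

Rolling the patch out is then a matter of bumping the image tag and re-applying the manifest; the orchestrator stages the replacement.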

The rapid evolution of edge computing has resulted in systems such as KubeEdge, which extends native containerised application orchestration capabilities to hosts at the edge. KubeEdge is built on Kubernetes and provides fundamental infrastructure support for networking, application deployment, and metadata synchronisation between cloud and edge. Developments such as KubeEdge indicate the role edge computing is going to play in the computing landscape.
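For illustration, a workload can be pinned to edge hosts with an ordinary Kubernetes manifest and a node selector; in KubeEdge setups the edge nodes are commonly labelled along the lines shown below, but the exact label depends on how the nodes were registered, so treat it as an assumption.

```yaml
# Deployment pinned to edge nodes through a nodeSelector. With KubeEdge the
# manifest is written against the ordinary Kubernetes API, and the edge
# components deliver it to matching edge hosts. The label key follows a common
# KubeEdge convention and is an assumption; adjust it to however the edge
# nodes were labelled when they joined the cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: camera-analytics
spec:
  replicas: 1
  selector:
    matchLabels:
      app: camera-analytics
  template:
    metadata:
      labels:
        app: camera-analytics
    spec:
      nodeSelector:
        node-role.kubernetes.io/edge: ""   # assumed edge-node label
      containers:
        - name: camera-analytics
          image: registry.example.com/camera-analytics:0.9.0   # placeholder image
```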

Final thoughts

Edge computing addresses the challenges of latency-sensitive and real-time applications such as autonomous driving, AR/VR, industrial automation, and video processing. The essence of the solution lies in moving those applications from distant data centres to the edge of the network.

Businesses seek reliable software solutions that allow easy and efficient management of edge infrastructure. Containerisation solutions accelerate the migration of existing applications to the edge and the deployment of new, dedicated ones. The main advantages lie in easy scalability and the ability to leverage existing applications, toolchains, and developers’ expertise.

Containerisation isn’t limited to edge applications and large enterprises. Small and medium enterprises can benefit from HPE containerisation solutions in areas such as application modernisation and the adoption of DevOps for increased productivity.

HPE offers a suite of containerisation products and solutions that can help organisations stand out and build a solid foundation for the future. Any business navigating edge computing should consult an experienced IT partner. Contact us for assistance as you build a framework and begin to take advantage of the advances at the edge.