Note: This is an excerpt from “Kubernetes at the Edge: Container Orchestration at Scale,” a new ebook from The New Stack.

During my 30-odd years working in software, I’ve seen five major shifts in how enterprise software is built and deployed: the adoption of higher-level languages like Java and C#; the agile movement; public cloud; DevOps, and particularly continuous delivery; and microservices.

Each ultimately succeeded because it reduced time to value. It enabled someone in a business to go from having an idea to getting it in front of a customer faster and at a lower cost. This combination allows businesses to try more ideas and use the market’s response to determine which ones resonate and which don’t. For all the hype and excitement, it remains to be seen whether generative AI (GenAI) will be a sixth.

In many ways, the rise of containerization at the edge mirrors this historical pattern. Reusing practices that the cloud enables, such as CI/CD, supports a more developer-centric way of working, even in distributed physical environments. By enabling applications to be packaged in universal formats that operate identically across development, testing and production environments, containerization simplifies otherwise complex deployment processes.

As we continue to explore the role of AI in the enterprise, the edge represents the next frontier in this evolution. Edge devices are getting more capable. Real-time inference, data locality requirements and latency constraints all push compute closer to where data is generated and decisions are made.

The Edge of Disaster

The edge isn’t simply “cloud computing, but closer.” Rather, it extends cloud native practices to locations where traditional cloud assumptions break down. The edge is a uniquely challenging environment that requires thoughtful attention to these factors:

Limited or no IT staff: If something goes wrong with the infrastructure in an edge environment, you may not have anyone locally with the expertise to deal with the issues.
Physical constraints: Devices deployed to the edge are limited in CPU/GPU power and memory. Unlike in a Tier 1 data center, power is often unreliable, and the environment might be unusually hot, cold or dirty, all of which can increase failure rates.
Security: In a data center, physical access can be easily controlled and limited; that simply isn’t true in an edge environment. This means you must pay careful attention to who has access to the devices on the network’s edge.
Connectivity issues and outages: Connectivity may be limited or intermittent, so applications that depend on a constant internet connection may stop working whenever the link goes down (see the sketch after this list for one common way to cope).
Unpredictable demands: In more mature enterprises, you are likely to find legacy applications that have been running on premises for decades now running alongside modern, cloud native, containerized applications, and the edge platform has to accommodate that unpredictable mix of workloads.
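
To make the connectivity constraint concrete, below is a minimal store-and-forward sketch in Go. It is an illustration rather than anything from the ebook: the Reading type, the ingest URL and the 30-second retry interval are all placeholder assumptions. The idea is that the edge application only discards data once a central endpoint has acknowledged it, so a dropped uplink delays delivery instead of breaking the workload.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"time"
)

// Reading is a placeholder for whatever data an edge site produces.
type Reading struct {
	SensorID string
	Value    float64
}

// forward attempts to push one reading to a central (placeholder) endpoint.
func forward(r Reading) error {
	payload := bytes.NewBufferString(fmt.Sprintf(`{"sensor":%q,"value":%g}`, r.SensorID, r.Value))
	resp, err := http.Post("https://central.example.com/ingest", "application/json", payload)
	if err != nil {
		return err // uplink unavailable; the caller keeps the reading buffered
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("ingest rejected: %s", resp.Status)
	}
	return nil
}

func main() {
	// A slice stands in for durable on-device storage.
	buffer := []Reading{{SensorID: "temp-01", Value: 21.5}}

	for len(buffer) > 0 {
		if err := forward(buffer[0]); err != nil {
			// Connectivity is down or the ingest failed: keep the data
			// locally and retry later rather than failing outright.
			time.Sleep(30 * time.Second)
			continue
		}
		buffer = buffer[1:] // acknowledged; safe to drop locally
	}
}
```

A production version would back the buffer with durable on-device storage and cap its growth, but the retry loop is the heart of the pattern.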

The organizations that master cloud native practices at the edge will be best positioned to harness real-time insights, respond to local conditions and deliver capabilities and experiences that were previously impossible.

Learn More

Our new ebook, “Kubernetes at the Edge: Container Orchestration at Scale,” is a comprehensive guide to the strategies, best practices and vendor options for deploying containerized applications in edge environments. By reading this free ebook, you’ll learn:

The true meaning of edge computing and how it spans from regional data centers to individual sensors.
Actionable best practices for building a robust and resilient edge infrastructure.
Real-world use cases showing how companies are leveraging the edge for AI, sustainability and more.
A market survey of leading Kubernetes edge vendors to help you choose the right platform.
The “how-to” blueprint for assembling your team and deploying Kubernetes at the edge without disrupting your current operations.

Charles Humble is a former software engineer, architect and CTO who has worked as a senior leader and executive of both technology and content groups. He was InfoQ’s editor-in-chief from 2014-2020, and was chief editor for Container Solutions from 2020-2023….
