Edge computing is creating a new internet. In an age when consumers and businesses demand the smallest possible delay between asking a question and getting an answer, edge computing is the only way to shrink the time to insight. It does so by lowering latency, processing data even where bandwidth is scarce, decreasing costs, and addressing data sovereignty and compliance requirements. While centralized cloud computing will persist, the radically different way in which we can create and act upon data at the edge of the network is already creating novel markets and unlocking new value. By 2024, the edge computing market is expected to be worth over $9.0 billion, growing at a compound annual rate of 30%.
However, making these markets viable and unlocking their full potential will require operational and business models designed for them. While cloud computing has been able to rely on centralization and economies of scale to make its business model work, edge computing needs a new paradigm. With hardware and software spread across hundreds or thousands of locations, the only feasible way to manage these distributed systems is through standardization and automation.
Managing distributed computing systems is not new to IT; it predates the internet and, some would say, brought it about. But the scale and complexity demanded by edge computing are novel. Beyond the sheer number of locations, edge computing must also contend with harsh environments far from the antiseptic data center, remote or unreachable sites, spotty connections, dynamic provisioning, a globally consistent data experience, and security risks. Beyond these technical challenges lie business ones. When examining the edge as a business, it quickly becomes clear that edge sites need to be as close to zero-touch as possible, because every truck roll takes a massive dent out of the margins.
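A quick back-of-envelope model makes the truck-roll point concrete. All of the numbers below are hypothetical, chosen only to show how a small per-site rate of manual intervention compounds across thousands of sites:

```python
# Back-of-envelope sketch (hypothetical numbers) of why manual
# intervention erodes edge margins: a modest number of technician
# visits per site becomes a large recurring cost at fleet scale.

def annual_truck_roll_cost(sites, visits_per_site_per_year, cost_per_visit):
    """Total yearly cost of sending technicians to edge sites."""
    return sites * visits_per_site_per_year * cost_per_visit

# Assumed: 2,000 sites, two manual interventions a year, $500 per truck roll.
manual = annual_truck_roll_cost(2000, 2.0, 500)    # $2,000,000 per year
# Assumed: automation self-heals 90% of incidents, cutting visits to 0.2/year.
automated = annual_truck_roll_cost(2000, 0.2, 500)  # $200,000 per year

print(manual, automated)
```

Even with these illustrative figures, automating away nine out of ten interventions changes the economics by an order of magnitude.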
Cloud Native and Kubernetes for the Edge
While cloud native technologies were born in the cloud, the operating and business paradigms they enable will make edge computing possible. Looking at the cloud native definition, we find that standardization, such as immutable infrastructure and declarative APIs, combined with robust automation, creates manageable systems that require minimal toil. This standardization and automation are key to making edge computing both operationally and financially viable.
At the core of the cloud native ecosystem is Kubernetes. It was designed from the start as a loosely coupled system with a declarative API and built-in reconciliation loops, two features that make it perfectly suited for edge computing. First, it provides a standardized API for lifecycle management of hardware and software across disparate infrastructure and locations. Rather than redesigning compute and applications for each use case or location, they can be designed once and deployed many times, allowing businesses to scale around the world and meet their customers at their doorstep. Second, the reconciliation loops automate manual tasks, creating a zero-touch environment with self-healing infrastructure and applications. Leveraging Kubernetes to standardize and automate infrastructure and applications at the edge lets companies scale through software rather than people, opening up business models that were previously too expensive to be feasible.
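As a minimal sketch of "design once, deploy many times", the same declarative manifest can be applied unchanged to every edge cluster. The names, image, and replica count below are illustrative, not from the original text; the point is that the manifest declares desired state, and Kubernetes' reconciliation loops continuously drive actual state toward it, restarting failed pods with no human intervention:

```yaml
# Hypothetical manifest: declares desired state; the cluster's
# controllers reconcile reality toward it at every edge location.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-analytics            # illustrative name
spec:
  replicas: 2                     # restored automatically if a pod dies
  selector:
    matchLabels:
      app: edge-analytics
  template:
    metadata:
      labels:
        app: edge-analytics
    spec:
      containers:
      - name: analytics
        image: example.com/edge-analytics:1.0   # placeholder image
```

Rolling the same file out to a thousand sites is the same operation as rolling it out to one, which is what makes the standardization argument above work operationally.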
5G, Edge Computing, and Kubernetes
One of the most hyped use cases for running software at the edge is 5G networks. 5G promises faster speeds (both download and upload), reduced latency, higher device capacity, and efficient network slicing. All of these improvements have massive potential to impact multiple business verticals and create whole new ones, with IoT, AR/VR, autonomous driving, smart cities, and Industry 4.0 being some of the most commonly cited use cases. However, because 5G uses higher frequencies than previous-generation networks, its connection range is much shorter, so 5G requires 5-10x more base stations. Using traditional management models and passing this cost increase directly on to the end user or business is not feasible. Making 5G, and the business models it creates, possible will require standardization and automation of the edge.
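The 5-10x figure follows from simple geometry: a base station's coverage area scales with the square of its range, so shrinking the range multiplies the station count. A rough sketch, with assumed (not sourced) range figures:

```python
import math

# Geometric sketch of base-station density. The metro area and the
# per-generation cell ranges below are assumptions for illustration,
# not figures from the article.

def stations_needed(area_km2, range_km):
    """Stations to cover an area, modeling each cell as a circle."""
    return math.ceil(area_km2 / (math.pi * range_km ** 2))

city = 1000.0                       # hypothetical 1,000 km^2 metro area
lte = stations_needed(city, 2.0)    # ~2 km macro-cell range (assumed)
nr = stations_needed(city, 0.7)     # ~0.7 km higher-frequency range (assumed)

print(lte, nr, nr / lte)            # the ratio lands in the 5-10x band
```

Halving the range alone quadruples the station count; the shorter ranges typical of higher frequencies push the multiplier toward the upper end of that band.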
Lead the Creation of the Future of Cloud Native on the Edge
Kubernetes is already being discussed and deployed to make edge computing and 5G use cases possible. The Cloud Native Computing Foundation formed the Telco User Group (TUG) to consider and shape the strategy of introducing Kubernetes and other cloud native technologies into telco environments. In addition, the Common NFVI Telco Taskforce (CNTT) is currently writing a reference architecture for running Kubernetes in the Telco environment.
Using Kubernetes and other cloud native technologies to run edge computing is required to make the operational and business models work, but it is still in its infancy. Getting involved today puts you on the ground floor to shape this exciting future. The TUG is an open discussion forum that anyone can join to participate in the conversation. CNTT is looking for participants with experience running Kubernetes to contribute their knowledge to Reference Architecture 2. At Loodse, working with our customers, we knew the edge needed automation, which is why we created our open source tool, KubeOne. It handles the complete lifecycle management of Kubernetes clusters at the edge, allowing businesses to focus on their customers rather than their infrastructure. Issues and PRs are always welcome.