
Accelerating to the Edge with Kubernetes: Opportunities and Challenges

Organizations have accelerated transformation and IT modernization initiatives by months, or even years, in response to the rapid expansion of digital business. With more distributed environments and teams that span entire global enterprises, organizations are pushing capabilities to the edge to lower latency, reduce traffic bottlenecks, and exchange and manage data more easily as they branch out from centralized clouds or on-premises data centers. Unfortunately, these distributed environments also bring new security challenges: the edge creates a wealth of new, vulnerable endpoints, such as mobile devices, IoT sensors, and data centers, that must be secured. According to Ponemon's Digital Transformation and Cyber Risk report, 82% of IT security and C-level respondents said they experienced at least one data breach because of digital transformation.

To meet these challenges, a reliable core with well-understood failure modes and effective data protection capabilities is critical, because edge network operators face a gauntlet of risks and infiltration points, including malicious hardware and software injection, physical tampering, DDoS, and routing attacks. As these deployments mature and IT policies catch up to the new challenges, countermeasures are becoming more effective at stopping outside threats.

Security Fosters Opportunity

The interest in expanding the edge for various applications and environments is apparent. Research firm Omdia found that over half of the organizations surveyed are currently deploying Kubernetes for edge workloads, a key factor in enabling a stronger core that businesses can build their edge infrastructure strategies around. This also aligns with a desire to modernize and to choose infrastructure that supports hybrid models spanning both on-premises and cloud-hosted configurations.

Even smaller-scale decentralization requires improvements in how containers and the edge work in tandem. Delivering secure data management capabilities, like hardened backup and disaster recovery, is critical for limiting security lapses and operational downtime while maintaining developer workflows. Accelerating data protection is important to ensure data remains protected across highly diverse operational landscapes.

Kubernetes can also serve as the deployment environment for DevOps CI/CD pipelines, making it easy for developers to roll out continuous updates to edge applications. It can do so by providing a universal control plane that works with any type of underlying edge infrastructure, simplifying the deployment and management of workloads across diverse edge environments. Maximizing the benefits of Kubernetes for edge workloads means striking the best balance of traffic and minimizing latency.
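As a rough illustration, the sketch below shows how a CI/CD step might roll a new image out to an edge cluster through the Kubernetes API. It uses the official Kubernetes Python client, and the Deployment name, namespace, and image tag are hypothetical placeholders rather than anything prescribed by a particular platform.

```python
# Minimal sketch: a CI/CD step that rolls a new image out to an edge cluster.
# Assumes the official Kubernetes Python client ("pip install kubernetes") and
# a hypothetical Deployment named "edge-app" in an "edge" namespace.
from kubernetes import client, config

def rollout(image: str, deployment: str = "edge-app", namespace: str = "edge") -> None:
    config.load_kube_config()  # or config.load_incluster_config() when run inside a cluster
    apps = client.AppsV1Api()
    # Patch only the container image; Kubernetes handles the rolling update.
    patch = {"spec": {"template": {"spec": {"containers": [
        {"name": deployment, "image": image}
    ]}}}}
    apps.patch_namespaced_deployment(name=deployment, namespace=namespace, body=patch)

if __name__ == "__main__":
    rollout("registry.example.com/edge-app:1.2.0")  # hypothetical image tag
```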

Challenges at the Edge

Organizations considering edge investments have the choice to use Kubernetes as their orchestrator, alongside containerized microservices and API-based automation to build comprehensive edge-to-cloud solutions that can integrate with development, management, and security tools. But this will inevitably bring challenges as enterprises pursue digital transformation efforts to optimize cloud and edge workflows.

Optimizing data transfer will be key to ensuring low-latency data transmission between central data centers and edge locations. To improve internal data movement, operators must use their best judgement to determine which internal traffic to prioritize and how to balance incoming traffic from external endpoints. Moving data quickly is the greatest challenge in edge computing, with application orchestration a secondary issue. For improved interoperability, developers building edge computing platforms need to make it easier to deploy Kubernetes in conjunction with data fabric solutions.

Edge environments bring their own security and reliability risks, so there's a greater need for trusted, purpose-built solutions to ensure that your data is safe at the edge. These include solutions that specialize in edge intrusion detection, policy-based controls, and physical security at edge nodes. Stronger controls over workload placement will also be crucial: admins must be able to assign applications to individual nodes, an approach that works well when all of the nodes are running within a single data center.
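To make that placement idea concrete, here is a minimal sketch, again using the Kubernetes Python client, of pinning an application to labeled nodes with a nodeSelector. The node name, labels, namespace, and image are hypothetical, and production setups would more likely lean on affinity rules and topology labels.

```python
# Minimal sketch: pinning an application to specific nodes with a nodeSelector.
# Node names, labels, namespace, and image are hypothetical.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()
apps = client.AppsV1Api()

# 1. Label the node that should run the workload.
core.patch_node("edge-node-01", {"metadata": {"labels": {"site": "store-042"}}})

# 2. Create a Deployment whose pods are scheduled only onto matching nodes.
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="pos-service", namespace="edge"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "pos-service"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "pos-service"}),
            spec=client.V1PodSpec(
                node_selector={"site": "store-042"},
                containers=[client.V1Container(
                    name="pos-service",
                    image="registry.example.com/pos-service:1.0.0",
                )],
            ),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="edge", body=deployment)
```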

Addressing varied user environments is another pain point, especially when determining the key use cases for each one. While most Kubernetes vendors support multi-cluster management, managing workloads across multiple clusters is still a secondary consideration for Kubernetes developers. Lastly, think beyond one size fits all: accept that not every application or environment is a good fit for a given edge container environment, and be realistic about which types of workloads you can manage.
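As a very rough sketch of what a fleet-wide view can look like, the example below iterates over kubeconfig contexts to list Deployments across several clusters. The context names are hypothetical, and dedicated multi-cluster or GitOps tooling goes far beyond this.

```python
# Minimal sketch: a basic fleet-wide view across several edge clusters by
# iterating over kubeconfig contexts. Context names are hypothetical.
from kubernetes import client, config

EDGE_CONTEXTS = ["edge-eu-west", "edge-us-east", "edge-apac"]  # hypothetical

for ctx in EDGE_CONTEXTS:
    api_client = config.new_client_from_config(context=ctx)
    apps = client.AppsV1Api(api_client)
    deployments = apps.list_deployment_for_all_namespaces()
    print(f"{ctx}: {len(deployments.items)} deployments")
    for d in deployments.items:
        ready = d.status.ready_replicas or 0
        print(f"  {d.metadata.namespace}/{d.metadata.name}: {ready}/{d.spec.replicas} ready")
```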

Determining when, where, why, and how to implement edge-enabled transformation can be tricky. Not every digital transformation initiative will benefit from edge capabilities, but there is certainly growing interest in extending support for Kubernetes and other container workflows to improve security, interoperability, and the management of data. By addressing the myriad performance and security challenges created by edge workflows, developers and operators can make the most of deploying and managing data at the edge, with Kubernetes as a core part of the ecosystem.


Author: Vaibhav Kamra, CTO, Kasten by Veeam
Bio: Vaibhav Kamra is the Chief Technical Officer at Kasten by Veeam, which is tackling Day 2 data management challenges to help enterprises confidently run applications on Kubernetes. Previously, Vaibhav worked at Dell EMC, Maginatics, and Microsoft, focusing on storage, filesystems, and databases. He is also a contributor to Kanister, an open source framework for application-level data management on Kubernetes, and to the Data Protection Working Group in Kubernetes.

To hear more about cloud native topics, join the Cloud Native Computing Foundation and cloud native community at KubeCon+CloudNativeCon North America 2021, October 11-15, 2021.
