Autoscaling with KEDA
Design an event-driven autoscaling architecture using KEDA.
Objective
A challenge to help you set up KEDA and start using it to build real-life applications with event-driven autoscaling in Kubernetes. Learn how to configure KEDA as an autoscaling solution, enabling your workloads to scale based on external metrics such as message queue length, database size, or custom application metrics.
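As a point of reference for the queue-length case mentioned above, a minimal KEDA ScaledObject might look like the sketch below. It scales a Deployment based on the depth of a RabbitMQ queue; the resource names (`consumer-scaler`, `order-consumer`, `orders`, `rabbitmq-auth`) are hypothetical placeholders for your own workload.

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: consumer-scaler          # hypothetical name
spec:
  scaleTargetRef:
    name: order-consumer         # hypothetical Deployment to scale
  minReplicaCount: 0             # KEDA can scale to zero when the queue is empty
  maxReplicaCount: 10
  triggers:
    - type: rabbitmq
      metadata:
        queueName: orders        # hypothetical queue
        mode: QueueLength
        value: "20"              # target messages per replica
      authenticationRef:
        name: rabbitmq-auth      # TriggerAuthentication holding the connection string
```

With this in place, KEDA creates and manages a Horizontal Pod Autoscaler for the target Deployment, so you do not configure an HPA yourself.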
Prerequisites
Access to Meshery: You will need access to a Meshery server, either Self-Hosted or the Meshery Playground.
Learn KEDA Concepts: Study KEDA’s key concepts, including ScaledObjects, ScaledJobs, and various scalers. The Linux Foundation course provides a comprehensive introduction to these concepts, which are essential for completing the challenge successfully.
Learn to use Meshery: Go through the Mastering Meshery Learning Path to gain proficiency in using Meshery for designing, deploying, and managing your KEDA-based architecture.
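To complement the ScaledObject concept listed in the prerequisites, a ScaledJob runs a Kubernetes Job per batch of work instead of scaling a long-running Deployment. The sketch below is one illustrative shape, using an AWS SQS trigger; the job name, container image, and queue URL are hypothetical.

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledJob
metadata:
  name: report-runner                # hypothetical name
spec:
  jobTargetRef:
    template:
      spec:
        containers:
          - name: worker
            image: example.com/report-worker:latest   # hypothetical image
        restartPolicy: Never
  pollingInterval: 30                # seconds between trigger checks
  maxReplicaCount: 5                 # cap on concurrent Jobs
  triggers:
    - type: aws-sqs-queue
      metadata:
        queueURL: https://sqs.us-east-1.amazonaws.com/000000000000/reports  # hypothetical
        queueLength: "5"             # target messages per Job
        awsRegion: us-east-1
```

The rule of thumb: use a ScaledObject for services that process events continuously, and a ScaledJob for finite, run-to-completion work items.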
This challenge asks you to design an architecture diagram that demonstrates how KEDA (Kubernetes Event-driven Autoscaling) can automatically scale applications in a Kubernetes environment based on event-driven metrics. You will learn how to set up KEDA, configure its scalers, integrate various event sources, and visualize the autoscaling process in a complex microservices architecture. Completing it will deepen your understanding of event-driven autoscaling in cloud-native environments and show how KEDA extends Kubernetes' native scaling capabilities.