Canary Deployment on AWS: Step by Step

What Is Canary Deployment? 

Canary deployment is a strategy used in the software development process to reduce the risk associated with new releases. Named after the “canary in a coal mine” principle, this strategy involves rolling out changes to a small subset of users (the canary) before releasing them to the entire user base (the miners). This way, any potential issues can be detected and resolved without impacting the overall user experience.

This method is particularly beneficial in cloud environments where applications are deployed across multiple servers. By introducing new features or updates to a small group first, developers can monitor how the system behaves under the new changes. If any anomalies or issues are detected, the new release can be rolled back without affecting the entire user base.

Canary deployment not only minimizes risk but also provides a feedback loop in continuous integration and continuous delivery (CI/CD) pipelines. It allows developers to get feedback quickly from live production environments, make necessary adjustments, and thereby ensure the resilience of the application.

This is part of a series of articles about software deployment.

Benefits of Canary Deployment in AWS 

Amazon Web Services (AWS), the world’s leading cloud provider, offers an efficient platform for implementing canary deployment. The benefits of canary deployment in AWS extend beyond risk mitigation: it also plays a crucial role in performance monitoring, facilitates CI/CD, and provides scalability and flexibility.

Risk Mitigation

One of the primary purposes of canary deployment in AWS is risk mitigation. Deploying a new release to a small group of users first allows developers to identify and fix any potential problems before they affect the entire user base. This ensures that the user experience remains unaffected, and any potential damage is minimized.

Performance Monitoring

Canary deployment in AWS enables efficient performance monitoring. AWS provides several tools to monitor the impact of a new release on system performance, including response times, error rates, and other key performance indicators. This allows developers to identify any performance-related issues quickly and take corrective action.

Facilitating CI/CD

Continuous Integration and Continuous Delivery (CI/CD) are integral to modern software development practices. Canary deployment in AWS facilitates this process by allowing developers to release changes in smaller, manageable chunks. This enables faster feedback, easier bug tracking, and ultimately, more frequent and reliable deliveries.

Scalability and Flexibility

With AWS, canary deployment is highly scalable. Depending on user feedback and system performance, developers can adjust the size of the canary group or the duration of the canary phase. This flexibility allows teams to adapt to changing requirements and perform canary tests finely tuned to the needs of their operating environment.

Related content: Read our guide to AWS blue/green deployment

Tools and Services for Canary Deployment in AWS 

AWS offers a range of tools and services that support canary deployment. These include AWS CodeDeploy, AWS Lambda, Amazon API Gateway, AWS App Mesh, and Elastic Load Balancing.

AWS CodeDeploy

AWS CodeDeploy is a service that automates application deployments to various compute services such as Amazon EC2, AWS Fargate, and AWS Lambda. It helps businesses achieve rapid, reliable, and consistent deployments, thereby reducing overall downtime.

AWS Lambda

AWS Lambda is a serverless compute service that lets you run your code without provisioning or managing servers. It can be combined with AWS CodeDeploy to implement automated canary deployments efficiently.
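
Under the hood, Lambda canaries rely on weighted aliases: an alias can route a percentage of invocations to a second function version, and CodeDeploy automates exactly this kind of shifting with its predefined Lambda canary configurations. Here is a minimal sketch using boto3; the function name, alias, and version numbers are hypothetical placeholders.

```python
import boto3

lambda_client = boto3.client("lambda")

# Publish the current code as a new immutable version.
new_version = lambda_client.publish_version(FunctionName="my-function")["Version"]

# Keep the "live" alias pointed at the stable version, but send 10%
# of invocations to the newly published version.
lambda_client.update_alias(
    FunctionName="my-function",
    Name="live",
    FunctionVersion="1",  # the stable version still serves 90% of traffic
    RoutingConfig={"AdditionalVersionWeights": {new_version: 0.10}},
)
```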

Amazon API Gateway

Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. It supports several types of deployments, including canary release deployments.
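
As a rough sketch of what this can look like with boto3 (the API ID and stage name are placeholders), you create a deployment with canary settings and, once satisfied, promote it to the stage:

```python
import boto3

apigw = boto3.client("apigateway")

# Deploy the latest API configuration as a canary on the "prod" stage:
# the stage keeps serving its current deployment while 10% of requests
# go to the new one.
canary = apigw.create_deployment(
    restApiId="abc123",
    stageName="prod",
    canarySettings={"percentTraffic": 10.0, "useStageCache": False},
)

# Once the canary looks healthy, promote it: make it the stage's main
# deployment and clear the canary settings.
apigw.update_stage(
    restApiId="abc123",
    stageName="prod",
    patchOperations=[
        {"op": "replace", "path": "/deploymentId", "value": canary["id"]},
        {"op": "remove", "path": "/canarySettings"},
    ],
)
```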

AWS App Mesh

AWS App Mesh is a service mesh that makes it easy to monitor and control microservices. It provides consistent visibility and network traffic controls for every service in an application.
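
In App Mesh, a canary split is expressed as weighted targets on a route. A minimal boto3 sketch follows; the mesh, router, route, and virtual node names are all hypothetical:

```python
import boto3

appmesh = boto3.client("appmesh")

# Route 90% of HTTP traffic to the stable virtual node and 10% to the
# canary virtual node.
appmesh.update_route(
    meshName="my-mesh",
    virtualRouterName="my-router",
    routeName="my-route",
    spec={
        "httpRoute": {
            "match": {"prefix": "/"},
            "action": {
                "weightedTargets": [
                    {"virtualNode": "service-v1", "weight": 90},  # stable
                    {"virtualNode": "service-v2", "weight": 10},  # canary
                ]
            },
        }
    },
)
```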

Elastic Load Balancing

Elastic Load Balancing automatically distributes incoming application traffic across multiple targets, such as Amazon EC2 instances, containers, IP addresses, and Lambda functions. It can effectively be used to steer a portion of the traffic to the new version during a canary deployment.
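
With an Application Load Balancer, this steering is done by forwarding a listener’s traffic to two weighted target groups. A brief boto3 sketch (the ARNs below are placeholders):

```python
import boto3

elbv2 = boto3.client("elbv2")

# Send 90% of requests to the stable target group and 10% to the
# canary target group.
elbv2.modify_listener(
    ListenerArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:"
                "listener/app/my-alb/50dc6c495c0c9188/f2f7dc8efc522ab2",
    DefaultActions=[
        {
            "Type": "forward",
            "ForwardConfig": {
                "TargetGroups": [
                    {"TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:"
                                       "123456789012:targetgroup/stable/0123456789abcdef",
                     "Weight": 90},
                    {"TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:"
                                       "123456789012:targetgroup/canary/fedcba9876543210",
                     "Weight": 10},
                ]
            },
        }
    ],
)
```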

The Process of Canary Deployments in AWS 

Create Deployment Environment

The first step in the process of canary deployment in AWS is creating the deployment environment. This includes setting up the infrastructure necessary for the deployment, such as servers, databases, and other resources.

When creating the deployment environment, it’s important to ensure that it mirrors your production environment as closely as possible. This includes not only the infrastructure but also the configuration and the data. By doing this, you can ensure that you’re testing your updates in an environment that accurately reflects the conditions under which they’ll be running once they’re deployed to all of your users.

Creating a deployment environment in AWS is straightforward thanks to services like AWS Elastic Beanstalk, which provides pre-configured environments for various programming languages and frameworks. You only need to upload your application code, and the service takes care of the rest, from capacity provisioning and load balancing to automatic scaling and application health monitoring.
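
As an illustrative boto3 sketch (the application and environment names are placeholders; in practice you would pick the solution stack that matches your platform):

```python
import boto3

eb = boto3.client("elasticbeanstalk")

# Register the application with Elastic Beanstalk.
eb.create_application(ApplicationName="my-app")

# Pick a pre-configured platform; here, the first available Python stack.
stacks = eb.list_available_solution_stacks()["SolutionStacks"]
python_stack = next(s for s in stacks if "Python" in s)

# Launch the environment. Without a VersionLabel, Beanstalk starts with
# its sample application; you would then deploy your own code to it.
eb.create_environment(
    ApplicationName="my-app",
    EnvironmentName="my-app-canary",
    SolutionStackName=python_stack,
)
```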

Set Up Load Balancing

The next step in the canary deployment process is setting up load balancing. Load balancing is a technique used to distribute incoming network traffic across multiple servers to ensure that no single server becomes overwhelmed. This is particularly important in canary deployment because you’ll be routing a portion of your traffic to a new version of your application.

AWS offers a built-in load balancing feature through its Elastic Load Balancing service. This service automatically distributes incoming application traffic across multiple targets, such as Amazon EC2 instances, containers, and IP addresses, in one or more Availability Zones. It ensures that only healthy targets receive traffic by conducting regular health checks on them.

Setting up load balancing in AWS involves creating a load balancer, registering targets (i.e., the servers that will receive the traffic), and configuring listeners, which are responsible for checking for connection requests from clients.
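
Those three steps map directly onto the Elastic Load Balancing API. A condensed boto3 sketch, where the subnet, security group, VPC, and instance IDs are placeholders:

```python
import boto3

elbv2 = boto3.client("elbv2")

# 1. Create the load balancer.
lb_arn = elbv2.create_load_balancer(
    Name="canary-demo-alb",
    Subnets=["subnet-aaaa", "subnet-bbbb"],
    SecurityGroups=["sg-1234"],
    Type="application",
)["LoadBalancers"][0]["LoadBalancerArn"]

# 2. Create a target group and register the servers that will receive
#    traffic. The health check path ensures only healthy targets serve.
tg_arn = elbv2.create_target_group(
    Name="stable-targets",
    Protocol="HTTP",
    Port=80,
    VpcId="vpc-1234",
    HealthCheckPath="/health",
)["TargetGroups"][0]["TargetGroupArn"]

elbv2.register_targets(TargetGroupArn=tg_arn, Targets=[{"Id": "i-0123456789abcdef0"}])

# 3. Configure a listener to check for connection requests from clients
#    and forward them to the target group.
elbv2.create_listener(
    LoadBalancerArn=lb_arn,
    Protocol="HTTP",
    Port=80,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg_arn}],
)
```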

Build the New Version

Once your deployment environment is set up and your load balancer is in place, it’s time to build the new version of your application. This involves writing new code or modifying existing code, testing it thoroughly to ensure that it works as expected, and packaging it for deployment.

Building a new version of your application requires careful planning and execution. You need to ensure that your new features or updates don’t break existing functionality, and that they provide value to your users. It’s also important to keep the size of your updates manageable; smaller, incremental updates are easier to test and deploy than large, monolithic ones.

AWS provides various tools and services that can help you build your new version. For example, AWS CodeBuild is a fully managed build service that compiles your source code, runs unit tests, and produces artifacts that are ready to deploy.
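
For instance, a pipeline step can trigger such a build with boto3. The project name is a placeholder, and a CodeBuild project with a suitable buildspec is assumed to already exist:

```python
import time
import boto3

codebuild = boto3.client("codebuild")

# Kick off a build of the new version.
build_id = codebuild.start_build(projectName="my-app-build")["build"]["id"]

# Poll until the build completes; a production pipeline would usually
# react to EventBridge notifications instead of polling.
while True:
    build = codebuild.batch_get_builds(ids=[build_id])["builds"][0]
    if build["buildStatus"] != "IN_PROGRESS":
        print("Build finished with status:", build["buildStatus"])
        break
    time.sleep(10)
```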

Initiate Canary Deployment

With your new version built and ready to go, it’s time to initiate the canary deployment. This involves deploying your new version to a small subset of your users and monitoring their experience.

When initiating a canary deployment in AWS, you can use AWS CodeDeploy. CodeDeploy introduces the new version gradually, allowing you to track its performance and user feedback before deploying it to all of your users.
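
As a hedged sketch of what initiating such a deployment can look like with boto3, here is a Lambda canary that shifts 10% of traffic immediately and the remainder five minutes later if no alarms fire. The application, deployment group, function, and version identifiers are placeholders:

```python
import json
import boto3

codedeploy = boto3.client("codedeploy")

# The AppSpec tells CodeDeploy which alias to shift from the current
# version to the target version.
appspec = {
    "version": 0.0,
    "Resources": [{
        "my-function": {
            "Type": "AWS::Lambda::Function",
            "Properties": {
                "Name": "my-function",
                "Alias": "live",
                "CurrentVersion": "1",
                "TargetVersion": "2",
            },
        }
    }],
}

codedeploy.create_deployment(
    applicationName="my-app",
    deploymentGroupName="my-deployment-group",
    # Predefined canary config: 10% now, the remaining 90% after 5 minutes.
    deploymentConfigName="CodeDeployDefault.LambdaCanary10Percent5Minutes",
    revision={
        "revisionType": "AppSpecContent",
        "appSpecContent": {"content": json.dumps(appspec)},
    },
)
```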

When selecting the subset of users for your canary deployment, it’s important to choose a representative sample. This could be a group of users who are particularly tech-savvy and tolerant of bugs, or it could be a random selection of your user base.

Route Traffic

After initiating the canary deployment, the next step is to route traffic to your new version. This involves redirecting a portion of your user traffic from your current version to the new one. In AWS, you can use Amazon Route 53, a highly available and scalable cloud Domain Name System (DNS) web service, whose weighted routing policies let you split traffic between versions by percentage.

Routing traffic in canary deployment is done gradually. At first, only a small percentage of traffic is directed to the new version. As you gain confidence in the new version’s performance and stability, you can gradually increase this percentage until all traffic is being served by the new version.
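
A minimal boto3 sketch of such a weighted split, where the hosted zone ID, record name, and IP addresses are placeholders:

```python
import boto3

route53 = boto3.client("route53")

def weighted_record(identifier, weight, ip):
    """Build an UPSERT for one weighted A record."""
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "app.example.com",
            "Type": "A",
            "SetIdentifier": identifier,
            "Weight": weight,
            "TTL": 60,
            "ResourceRecords": [{"Value": ip}],
        },
    }

# Weights are relative: 90/10 sends roughly 10% of DNS resolutions
# to the canary. Raise the canary weight as confidence grows.
route53.change_resource_record_sets(
    HostedZoneId="Z0000000000000",
    ChangeBatch={"Changes": [
        weighted_record("stable", 90, "203.0.113.10"),
        weighted_record("canary", 10, "203.0.113.20"),
    ]},
)
```

Note that the weights control DNS resolution rather than individual requests, so client-side caching makes the split approximate; keeping the TTL low helps the actual percentages converge faster.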

Monitor Performance

The final step in the canary deployment process is monitoring the performance of your new version. This involves keeping a close eye on key performance indicators (KPIs) such as error rates, response times, and resource usage. You also need to monitor user feedback to identify any issues or areas for improvement.

Monitoring performance is crucial in canary deployment because it allows you to catch and fix issues before they affect your entire user base. If you notice any problems, you can roll back the deployment, fix the issues, and try again.

AWS provides several tools for monitoring performance, such as Amazon CloudWatch, which collects and tracks metrics, monitors log files, and responds to system-wide performance changes. With these tools, you can set up alarms to alert you when certain thresholds are exceeded, allowing you to respond quickly to potential issues.
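
For example, an alarm on the canary’s server-side error count might look like this in boto3, where the target group dimension and SNS topic ARN are placeholders:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm if the canary target group returns more than 10 5xx responses
# per minute for three consecutive minutes; notify via SNS, where a
# subscriber (or automation) can trigger a rollback.
cloudwatch.put_metric_alarm(
    AlarmName="canary-5xx-errors",
    Namespace="AWS/ApplicationELB",
    MetricName="HTTPCode_Target_5XX_Count",
    Dimensions=[{"Name": "TargetGroup",
                 "Value": "targetgroup/canary/fedcba9876543210"}],
    Statistic="Sum",
    Period=60,
    EvaluationPeriods=3,
    Threshold=10,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:canary-alerts"],
)
```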

Infrastructure as Code with Codefresh CI/CD

Codefresh is built for modern tools, with support for flexible frameworks. Most infrastructure-as-code tools are available as Docker images and can be seamlessly integrated into Codefresh pipelines; this is a very common pattern for many of our customers. Learn more about how you can easily execute a custom freestyle step with any of these images here.

If you are interested in managing Codefresh resources with Terraform, we also have you covered there! The Codefresh Terraform provider can manage the creation, updates, and removal of Codefresh resources, allowing you to utilize your current infrastructure-as-code workflows without compromises.

Learn more about Codefresh
