Over the previous week I worked on setting up the continuous integration and deployment pipeline for my project at CloudCV.
The first step was deciding on a deployment scheme for the application. We are using Amazon Elastic Beanstalk (EBS) for deployment, as it offers auto scaling and load balancing so that our applications stay up even under heavy load. We had the following options to choose from:
- Run separate EBS environments for the Node.js frontend and the Django backend applications.
- Run a single Docker container, with multiple processes handling both applications, inside a single Beanstalk environment.
- Run multiple Docker containers, managed by EBS, inside a single environment.
We decided it was best to Dockerize the applications for easier deployment with automated tooling such as Travis CI. Running multiple independent processes inside a single Docker container is also considered bad practice given Docker's architecture. Hence we went with the third scheme: a multi-container Docker platform with separate containers for the different services. The one caveat of the multi-container platform on EBS is that the Docker images cannot be built on deployment from a custom Dockerfile, as they can with a single Docker container; images can only be pulled from a Docker image registry. Hence, Amazon EC2 Container Registry is used to store the Docker images. The images are pre-built and pushed to the registry before deployment. A file containing the container configuration (Dockerrun.aws.json) is then zipped and pushed to the Beanstalk environment, which triggers a deployment using the newly built images.
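To illustrate, a minimal Dockerrun.aws.json for the multi-container platform might look like the sketch below. The container names, memory limits, ports, and ECR repository URL are placeholders for illustration, not CloudCV's actual configuration:

```json
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "name": "django-backend",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/cloudcv/backend:latest",
      "essential": true,
      "memory": 512,
      "portMappings": [{"hostPort": 8000, "containerPort": 8000}]
    },
    {
      "name": "nodejs-frontend",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/cloudcv/frontend:latest",
      "essential": true,
      "memory": 256,
      "portMappings": [{"hostPort": 80, "containerPort": 3000}],
      "links": ["django-backend"]
    }
  ]
}
```

Note that the `image` fields point at a registry rather than a local Dockerfile, which is exactly the constraint described above.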
The next step was writing Dockerfiles and Django settings for the production, staging and development environments. docker-compose configuration files for both the dev and production environments were also added, so that we can run the fleet of Docker containers locally and in non-EBS environments as well.
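A development docker-compose file mirroring that setup could look roughly like the following; the service names, Dockerfile paths, commands, and ports are illustrative assumptions:

```yaml
version: '2'
services:
  django:
    build:
      context: .
      dockerfile: docker/dev/Dockerfile.django
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
  nodejs:
    build:
      context: .
      dockerfile: docker/dev/Dockerfile.node
    ports:
      - "3000:3000"
    links:
      - django
```

Unlike the EBS deployment, compose builds the images locally from the Dockerfiles, which is what makes it convenient for development.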
After this, the task was to integrate the git repository with the deployment stack, i.e. to build the continuous deployment pipeline. Travis CI was chosen because it was already being used for linting and running unit tests, and it is easily customizable for running deploy scripts. Deploy scripts for both the staging and production environments were written. These scripts build the Docker images for both containers from their Dockerfiles and push them to EC2 Container Registry. They then edit and zip the Dockerrun.aws.json file along with the configuration options, and push the archive to the specific environment, which triggers a build in EBS. Thus, whenever there is a push to the develop branch and the tests pass, the script runs and the changes are deployed to the staging environment. Once a milestone is completed or a production release has to take place, changes are merged from the develop branch into the master branch. On push, Travis CI runs and, if the tests pass, the changes are deployed to the production environment, completing the pipeline.
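A skeleton of such a deploy script is sketched below. The registry URL, repository names, and template layout are assumptions rather than CloudCV's real values, and the Docker and AWS CLI calls are left as comments since they need real credentials; the concrete step shown is stamping the commit tag into the Dockerrun file and zipping it:

```shell
#!/usr/bin/env bash
# Sketch of a Travis CI deploy script for the multi-container EBS platform.
# Registry URL and file names are placeholders, not CloudCV's real values.
set -euo pipefail

REGISTRY="123456789012.dkr.ecr.us-east-1.amazonaws.com"
TAG="${TRAVIS_COMMIT:-local}"

# Substitute the <TAG> placeholder in a Dockerrun template with the
# current commit tag, writing the result to the given output file.
stamp_tag() {
  local template="$1" out="$2" tag="$3"
  sed "s/<TAG>/${tag}/g" "$template" > "$out"
}

deploy() {
  # Build and push both images (requires Docker and AWS credentials):
  # docker build -t "$REGISTRY/cloudcv/backend:$TAG" -f Dockerfile.django .
  # docker push "$REGISTRY/cloudcv/backend:$TAG"

  # Stamp the new tag into the Dockerrun file and bundle it for EBS.
  stamp_tag Dockerrun.aws.json.template Dockerrun.aws.json "$TAG"
  zip -q deploy.zip Dockerrun.aws.json

  # Upload the bundle and roll the environment onto the new version:
  # aws elasticbeanstalk create-application-version ...
  # aws elasticbeanstalk update-environment ...
  :
}
```

In Travis, one such script per target environment can be invoked from the deploy or after_success stage, keyed on the branch being built.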