No more .tar, .rar or whatever tarball that you manually install onto your server.
If you are a web developer, you probably know the pain of making your website work on your server. You would usually connect to it using ssh, clone your git repository and build your app, or upload an archive containing your build.
Let’s admit it. It’s painful, and you are very prone to error when doing this. And what if, for some reason, your awesome features find a way not to work on your server?
To fix this, we’ll take a look at how you can deploy your web app directly onto your server with a simple push to your repository. You can check the code for this article on my GitHub.
I recently picked up React and web development. I used to not like them very much, but my objective was to learn a bit about them, and about how a git push can turn into a website release. I wanted to automate the deployment of my apps and stop worrying about infrastructure; that’s how I chose to fix the problem.
GitHub, GitLab, BitBucket and git in general
Let’s face it, you have to use git as a versioning system. It is powerful, allows for easy collaboration between people and teams, and lets you use very efficient solutions such as GitHub. In this article I’ll use GitHub to manage our code, but bear in mind that the process is very similar with other solutions based on git.
If you have never used git as a versioning system, please take a look at tutorials or just read the doc.
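For reference, and assuming git is already installed, a minimal workflow looks like this (the repository and file names are just examples):

```shell
# create a project, turn it into a git repository and record a first commit
mkdir my-app && cd my-app
git init
echo "# my-app" > README.md
git add README.md
# the -c flags set an identity for this commit in case none is configured globally
git -c user.name="Your Name" -c user.email="you@example.com" commit -m "Initial commit"
git log --oneline   # shows the commit we just made
```

From there, `git remote add` and `git push` are what send your commits to GitHub.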
For this tutorial I will deploy a basic create-react-app application. You can deploy any image that runs an app; you’ll just have to adjust the ports yourself. So let’s go and create our app!
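Assuming node and npm are installed, the create-react-app CLI scaffolds the whole project; the app name pushtodeploy is simply the one I use throughout this article:

```shell
# install the create-react-app CLI and scaffold the project
npm install -g create-react-app
create-react-app pushtodeploy
cd pushtodeploy
npm start   # serves the app locally, on port 3000 by default
```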
So we got our React app, and we could just build it, manually upload it to our server and launch it. But what if it’s a big app and we broke some code? What if our server goes down? Do we re-launch it manually?
Let’s do something better: let’s use a CI to check that our tests pass. We’ll use CircleCI; they offer a free plan with 2,000 build minutes, which will be more than enough. So go ahead and log in with your GitHub account at CircleCI. Then head to the dashboard on the Projects page and press Build for your project.
If you look into the details, you’ll notice that the npm install step takes some time, and it will take even more once we start adding libraries to our app. CircleCI allows you to cache files or directories that will be re-imported on subsequent builds.
To do so, we will use a circle.yml file, which lets us define settings that CircleCI will follow. Create one at the root of your repository and add the following content.
```yaml
machine:
  node:
    version: 7.5.0

dependencies:
  override:
    - npm install
  cache_directories:
    - "./node_modules"

test:
  override:
    - npm test
```
Specifying the machine or the test section is not mandatory, but I find it clearer: you can immediately see what CircleCI will do just by looking at this file.
```yaml
  cache_directories:
    - "./node_modules"
```
These are the lines that tell CircleCI to cache our node_modules, so that it doesn’t reinstall everything on every build.
Push the file and look at the build run. Notice how npm install is way faster!
So far so good: we have automated our tests, and they run on every push to our repository! But your coworker on Windows built an awesome new feature, and it doesn’t work on your machine, even though they keep repeating that “it works on my machine”. Let’s fix the problem by sharing a common architecture using Docker images and containers!
Docker is an awesome tech that lets you build an image of a system and run it inside containers. If you want to know more about Docker, dive into their doc!
Docker has excellent documentation, so follow it to install Docker on your machine.
It’s time to write our Dockerfile to set up our project image! Many thanks to metakermit for the Dockerfile example I used.
```dockerfile
FROM node:7

# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app

# to make npm test run only once non-interactively
ENV CI=true

# Install app dependencies
COPY package.json /src/app/
RUN npm install && \
    npm install -g pushstate-server

# Bundle app source
COPY . /src/app

# Build and optimize react app
RUN npm run build

EXPOSE 9000

# defined in package.json
CMD [ "npm", "run", "start:prod" ]
```
Let’s go step by step!
The first line, `FROM node:7`, tells Docker to pull the node:7 image from the official node image repository. Our image will then build on top of it.
```dockerfile
# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app

# to make npm test run only once non-interactively
ENV CI=true

# Install app dependencies
COPY package.json /src/app/
RUN npm install && \
    npm install -g pushstate-server

# Bundle app source
COPY . /src/app
```
RUN commands tell Docker to execute the given command while building the image. Here we set up our app in /src/app, install our dependencies, and install pushstate-server globally to serve our app to the world.
```dockerfile
# Build and optimize react app
RUN npm run build

EXPOSE 9000

# defined in package.json
CMD [ "npm", "run", "start:prod" ]
```
We then build a production version of our app. EXPOSE opens port 9000 to the outside world, and CMD defines the command to run when a container is started from the image.
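One optional addition, not in the original setup: a .dockerignore file at the root of the repository keeps your host’s node_modules and build output out of the build context, so the `COPY . /src/app` step stays fast and the dependencies installed inside the image aren’t overwritten. Something like:

```
node_modules
build
.git
```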
Add the following to your package.json file in the scripts part.
"start:prod": "pushstate-server build"
We can test locally that it works! Run the following command to build the image. Note that dockerID will be your username later when we push to Docker Hub; for me it’s “kexo”. Change it accordingly!
$ docker build -t dockerID/pushtodeploy:latest .
It will build our image named “dockerID/pushtodeploy” with the tag “latest”, using the Dockerfile located in the current directory. The build will take a bit of time, and you’ll see the following when it’s done.
We built our image successfully, so it’s time to run it in a container, and we can do that locally too! We exposed port 9000 in our image, so we’ll have to map it to a host port. We’ll do it with the following command.
$ docker run -p 8080:9000 dockerID/pushtodeploy:latest
Your terminal should display the following:
```
npm info it worked if it ends with ok
npm info using npm@<version>
npm info using node@<version>
npm info lifecycle pushtodeploy@0.1.0~prestart:prod: pushtodeploy@0.1.0
npm info lifecycle pushtodeploy@0.1.0~start:prod: pushtodeploy@0.1.0

> pushtodeploy@0.1.0 start:prod /src/app
> pushstate-server build

Listening on port 9000 (http://0.0.0.0:9000)
```
It says the server is listening on port 9000, and since we mapped our host port 8080 to the container’s port 9000, we can check our app at http://localhost:8080/
It’s working! Let’s stop our container and start deploying!
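If the container is running in the foreground, Ctrl+C is usually enough; otherwise you can stop it by ID. A quick sketch, assuming Docker is running locally (the container ID is a placeholder):

```shell
# list running containers, then stop ours by its ID
docker ps                     # note the CONTAINER ID of dockerID/pushtodeploy
docker stop <container-id>    # replace with the actual ID from docker ps
```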
Remember your coworker? Well, now they can check whether their feature works by running the container and checking that it responds. So can you. Docker lets you run your app on the same architecture everywhere and avoid the “it works on my machine” situation.
Let’s integrate with Docker Hub
Now that we have an image, we can push it to a Docker Hub repository. If you haven’t done it already, create an account on Docker Hub to host our image.
To push your image just type:
$ docker login $ docker push dockerID/pushtodeploy:latest
As my dockerID is kexo, I will do:
$ docker push kexo/pushtodeploy:latest
You can check on Docker Hub: your image is now available. Note that it’s public, so don’t push sensitive data. If you want your image to be private, you can create one private repository on Docker Hub for free.
Remember our pipeline from the beginning? We’re almost done. We have an image on Docker Hub, and we just need to auto-deploy it! That may look like the hardest part, but in fact it’s the easiest, because Docker Cloud will manage it for us!
It’s time for Auto Deploy using Docker Cloud
We have tests and a Docker image; it’s time to automate the deployment of the image to Docker Hub, the Docker image repository.
We already have a Docker ID, so we can log in to Docker Cloud. Head to the Repositories section and you’ll notice our Docker Hub repository.
You can see in the Tags part our latest image that we pushed earlier.
We will now link our Docker Cloud account with our AWS account. Note that from here on, if you are not under the AWS Free Tier, you will have to pay for the server that Docker Cloud will run for us on AWS.
Go into the Cloud Settings in Docker Cloud and link it to a provider. You can link any provider you want, but I’ll use AWS in this article. Follow the doc to link a provider to Docker Cloud.
Once that’s done, it’s time to start our first node cluster and then our first node. Head to Node Clusters under Infrastructure in Docker Cloud and click Create. Select AWS as the provider and t2.nano as the instance type. You can check AWS instance pricing here. Careful: prices depend on the region.
On the next page, you should notice that it automatically selects the latest tag from our repository. Select Autoredeploy in General Settings. You can tweak other settings if you want, but the defaults are fine. When we tested locally, we mapped host port 8080 to container port 9000; that’s what we’ll do here. Go down to the Ports section and tick the Published box: it means the port is open to the outside world. You can set a specific port or a dynamic one; input port 9000, for example.
Then create the service. The default settings will deploy to the emptiest node; we already created an empty node earlier, so it will automatically deploy there. When the service is running, click on it, and in the Endpoints section you can click on the service endpoint. The link will look like this:
Careful: on Firefox, you need to change tcp to http in the URL so that it can display the page.
We’re almost there. Now every time we push an image to our repository, it will automatically deploy to our node! But we still have to run the push command by hand, and we want it to be fully automated.
Building and deploying the image automatically
Now let’s update our circle.yml file so that it builds the image and pushes it to our repository!
Update the file with the following one
```yaml
machine:
  node:
    version: 7.5.0

dependencies:
  override:
    - npm install
  cache_directories:
    - "./node_modules"

compile:
  override:
    - docker info
    - docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
    - docker build --rm=false -t kexo/pushtodeploy:latest .

test:
  override:
    - docker run -d -p 8080:9000 kexo/pushtodeploy:latest; sleep 10
    - curl --retry 10 --retry-delay 5 -v http://localhost:8080
    - npm test

deployment:
  hub:
    branch: master
    commands:
      - docker push kexo/pushtodeploy:latest
```
As you can see, we just added a few Docker commands. First we log in to Docker, then we build our image. In the deployment part at the end, we push the image to Docker Hub only if we are on the master branch. That lets you develop on another branch and then merge or rebase into your release branch, which will auto-deploy to our server.
However, for this to work, we need to add environment variables to CircleCI. You can do it by clicking on Project Settings in pushtodeploy, then going to Environment Variables. Add the three variables used by the login command: DOCKER_EMAIL, DOCKER_USER and DOCKER_PASS.
When that’s done, save the settings and push your new circle.yml file and Dockerfile. You can follow the build on CircleCI; it will take a bit of time because of the Docker image build.
When it finishes, check Docker Cloud and you will see that a new tag has been pushed and auto-deployed to your node.
Check the service and you can see it auto redeployed too! Your new version is now live!
We created a pipeline that auto-deploys to our server every time we push to our master branch. It goes through testing, building and deploying automatically! There are many other solutions for auto-deploying your Docker images onto your servers: you can check AWS’s integrated solutions for testing, building and deploying, or use CodeShip or Travis CI instead of CircleCI for running tests and building images.
Don’t forget to turn off the nodes and the instances on AWS if you don’t want to be charged for them.