Using Docker, Puppet and Jenkins for continuous delivery and PROD deployment [closed]
I need to set up the infrastructure for a new project. Previously I used standalone Puppet with Jenkins, but now I'm thinking about incorporating Docker builds, so that I could promote from dev to staging to production without triggering a rebuild, simply by fetching existing Docker images that have already been built.
- Java web app with a REST API, backed by PostgreSQL, Neo4j and Elasticsearch
- Client-side app written in Angular that talks to the Java backend through the REST API
- Code stored in Git repositories
- Dev server (building, dev + test environments) – a 32 GB Linux machine
- Test server (AWS)
- Production (AWS)
So basically I was thinking of something like this:
- Separate Docker images for the Java + client-side app, PostgreSQL, Elasticsearch and Neo4j that talk to each other and store their data on the hosts through Docker volumes, or via Docker data containers (I have not decided on the approach yet)
- Jenkins building all the code and creating Docker images that would be pushed to a private internal repository
- Integration tests run with the Puppet docker module on the DEV server
- Push to production with Jenkins via Puppet, using Docker
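The Jenkins step above can be sketched roughly as follows. This is a minimal sketch, not a definitive pipeline: the registry host (`registry.internal:5000`) and image name (`myproject/webapp`) are placeholder assumptions, and the `run` wrapper defaults to a dry run that only prints the Docker commands, so the flow can be read without a Docker daemon.

```shell
#!/bin/sh
# Hypothetical Jenkins build step: build the app image once, tag it with
# the git commit, and push it to a private registry. Promotion to staging
# or production later pulls this exact tag -- no rebuild is triggered.
set -e

REGISTRY="registry.internal:5000"   # assumed private registry host
APP="myproject/webapp"              # assumed image name
GIT_SHA="${GIT_COMMIT:-abc1234}"    # Jenkins exports GIT_COMMIT for git builds
TAG="$REGISTRY/$APP:$GIT_SHA"

# DRY_RUN=1 (the default here) prints each command instead of executing it.
run() { [ "${DRY_RUN:-1}" = 1 ] && echo "$@" || "$@"; }

run docker build -t "$TAG" .
run docker push "$TAG"
```

Set `DRY_RUN=0` in the real Jenkins job to actually execute the commands.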
Why should I use Docker?
- Big dev machine – could easily run multiple instances of my app without the need for virtualization (could have an unstable dev, stable dev, SIT, etc.)
- Ease of deployment (use Docker with the Puppet docker module) and rollback (simply retrieve the previous version from the Docker repository)
- Quick migration and the ability to spawn new instances
- Preparation for easy scaling of different parts of the system (e.g. clustering Elasticsearch)
- Does this look reasonable?
- I'm thinking about using this Puppet module: https://github.com/garethr/garethr-docker. How would I update my environments with it? Do I have to stop the Docker container, do a docker rm, and then docker run?
- We're using Liquibase for database change management. I guess this should go separately from Docker for updates/rollbacks?
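For reference, the update cycle asked about above is exactly the stop/rm/run sequence, since a container cannot be upgraded in place. A minimal sketch, with the container name, port, volume path and image tag all placeholder assumptions, and the `run` wrapper defaulting to printing the commands rather than executing them:

```shell
#!/bin/sh
# Hypothetical manual update of a running container: stop the old one,
# remove it, and start a fresh container from the new image tag.
set -e

NEW_TAG="registry.internal:5000/myproject/webapp:def5678"  # assumed next release

# DRY_RUN=1 (the default here) prints each command instead of executing it.
run() { [ "${DRY_RUN:-1}" = 1 ] && echo "$@" || "$@"; }

run docker stop webapp || true   # tolerate "container not running"
run docker rm webapp || true     # tolerate "no such container"
run docker run -d --name webapp -p 8080:8080 \
    -v /srv/webapp/data:/data "$NEW_TAG"
# Rollback is the same three steps with the previous tag.
```

The garethr-docker module is intended to wrap this lifecycle declaratively, so in practice you would change the image/tag in your Puppet manifest rather than script it by hand.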
Any suggestions welcome, thank you.
One solution
You're building a container-orchestrated PaaS. My advice is to look at similar systems for best practices that might be worth emulating.
The first place to start is the 12 factor app site, written by one of the cofounders of Heroku. The site is incredibly useful, describing the desirable operational characteristics of a modern cloud-scale application. The next stop would be Heroku itself, to get an idea of what a "modern" development and deployment environment can look like.
I'd also recommend looking at some of the emerging open source PaaS platforms. Large vendor-supported systems like Cloud Foundry and OpenShift are all the rage at the moment, but simpler solutions (built on Docker) are also emerging.
One of these, Deis, uses a related technology, Chef, so it might give some insight into how Puppet could be used to manage your runtime Docker containers. (Modern Deis no longer uses Chef.)
- Yes, this is quite reasonable.
- Instead of managing "environments", do as Heroku does and simply create a new application for each version of your application. This is the "Build, Release, Run" pattern. In your case, Jenkins is triggered by the new code and creates the Docker images, which are saved to a repository and used to deploy instances of that application release.
- The database would be an example of a "backing service", which you connect to your application at application creation time. An upgrade then amounts to stopping one application version and starting another, both connected to the same database.
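The upgrade described in the last point can be sketched as two releases of the same image attached to the same backing service via an environment variable. This is a sketch under assumptions: the container names, image tags, and the `DATABASE_URL` value are placeholders, and the `run` wrapper defaults to printing the commands instead of executing them.

```shell
#!/bin/sh
# Hypothetical "Build, Release, Run" upgrade: bring up release v2 pointed
# at the shared database, then retire release v1.
set -e

DB_URL="postgres://app@db.internal:5432/app"   # shared backing service

# DRY_RUN=1 (the default here) prints each command instead of executing it.
run() { [ "${DRY_RUN:-1}" = 1 ] && echo "$@" || "$@"; }

# v2 comes up alongside v1, connected to the same database...
run docker run -d --name webapp-v2 -e DATABASE_URL="$DB_URL" \
    registry.internal:5000/myproject/webapp:v2
# ...and v1 is retired once v2 is confirmed healthy.
run docker stop webapp-v1
run docker rm webapp-v1
```

Because both releases read the connection string from the environment rather than baking it into the image, rolling back is just the same steps with the tags swapped.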