Docker and git deployment workflow confusion
I have been reading about Docker, but I'm confused about how to maintain a nice workflow for deploying an application hosted on GitHub into a Docker container on a cloud service. For example, a web application written for Node.js, to be hosted on EC2. I develop the app on my local machine and check it into GitHub. What is a typical way to manage deployment of this?
Check my Dockerfile into GitHub? My Dockerfile tells Docker to pull my application from a different GitHub repository when the image is built. When my EC2 instance boots, it checks out my Dockerfile, builds it, and runs it, resulting in a deployment of my application? Or perhaps there is typically no need to check in the Dockerfile, and I should just deploy it directly? What happens when I make code changes: do I have to relaunch an EC2 instance?
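To make this option concrete, here is a sketch of what I imagine the Dockerfile would look like; the repository URL, port, and start command are placeholders for illustration, not a working setup:

```dockerfile
# Hypothetical Dockerfile for this option -- repo URL and paths are placeholders
FROM node:lts

# Pull the application from its own (separate) GitHub repository at build time
RUN git clone https://github.com/myuser/myapp.git /usr/src/app
WORKDIR /usr/src/app

# Install dependencies and declare how the app starts
RUN npm install
EXPOSE 3000
CMD ["npm", "start"]
```

One thing I notice with this approach: the clone happens at build time, so redeploying a code change would mean rebuilding the image on the instance.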
Rely completely on Docker Hub? Build my application into a Docker image on my local box. When I'm ready to deploy, push the image to Docker Hub, then have the EC2 instance just pull that image and run it? I don't like the idea of having to pay to host my image as a private repository, and I don't really want my whole Docker image to be public.
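The Docker Hub workflow I have in mind would be roughly the following; the image name `myuser/myapp` and port mapping are placeholders:

```shell
# Hypothetical workflow -- image name and ports are placeholders

# On my local box: build the image and push it to Docker Hub
docker build -t myuser/myapp .
docker push myuser/myapp

# On the EC2 instance: pull the image and run it in the background
docker pull myuser/myapp
docker run -d -p 80:3000 myuser/myapp
```

This seems simple, but as noted above it would require either a paid private repository or making the image public.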
Check my Dockerfile in side by side with my application, in the same GitHub repository? Then, when an EC2 instance boots, it clones the repository, builds the image, and runs it? I'm not quite sure what the workflow for deploying and updating the application would look like in this scenario.
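For this last option, I picture something like the following running on the instance at boot (again, the repository URL and names are placeholders):

```shell
# Hypothetical boot-time script on the EC2 instance -- URL and names are placeholders
git clone https://github.com/myuser/myapp.git
cd myapp

# The Dockerfile lives in the same repo as the application code
docker build -t myapp .
docker run -d -p 80:3000 myapp
```

Presumably updating would then be a `git pull` followed by a rebuild and restart of the container, but I'm not sure if that's how people actually do it.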
Any comments or examples are welcome :). Thanks!