Docker and git deployment workflow confusion
I have been reading about Docker, but I'm confused about how to maintain a clean workflow for deploying an application hosted on GitHub into a Docker container on a cloud service — for example, a Node.js web application hosted on EC2. I develop the app on my local machine and check it into GitHub. What is a typical way to manage deployment?
One option: check my Dockerfile into GitHub. The Dockerfile tells Docker to pull my application from a different GitHub repository when the image is built. When my EC2 instance boots, it checks out my Dockerfile, builds it, and runs the container, resulting in a deployment of my application. Or perhaps there is typically no need to check in the Dockerfile and I should just deploy it straight? And what happens when I make code changes — do I have to relaunch the EC2 instance?
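To make the question concrete, here is a minimal sketch of what I imagine that kind of Dockerfile looking like — the repository URL, port, and start command are placeholders, not my real setup:

```dockerfile
# Sketch only: repo URL, port, and start command are placeholders.
FROM node:lts

# Pull the application from a separate GitHub repository at build time.
# (The full node:lts image ships with git; a -slim base would need it installed.)
RUN git clone https://github.com/myuser/myapp.git /app

WORKDIR /app
RUN npm install

EXPOSE 3000
CMD ["npm", "start"]
```

Note that the clone happens at build time, so rebuilding the image is what picks up new commits.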
Another option: rely completely on Docker Hub. Build my application image on my local box; when I'm ready to deploy, push the image to Docker Hub, then have the EC2 instance simply pull that image and run it. But I don't like the idea of having to pay to host my image in a private repository, and I don't really want my whole Docker image to be public.
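As I understand it, that round-trip would look roughly like the following — the image and repository names are placeholders:

```shell
# Sketch of the Docker Hub round-trip; image/repo names are placeholders.

# On my local box: build and push (push requires `docker login` first)
docker build -t myuser/myapp:latest .
docker push myuser/myapp:latest

# On the EC2 instance: pull and run
docker pull myuser/myapp:latest
docker run -d -p 80:3000 --name myapp myuser/myapp:latest
```

Updating would then mean rebuilding and pushing locally, and pulling and restarting the container on the instance.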
A third option: check my Dockerfile in side by side with my application, in the same GitHub repository, and have the EC2 instance clone that repository and build the image when it boots. I'm not quite sure what the workflow for deploying and updating the application would look like in this scenario, though.
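In this layout I imagine the Dockerfile would COPY the application source that sits next to it in the repository, rather than cloning anything — again just a sketch, with the port and start command assumed:

```dockerfile
# Sketch only: port and start command are assumptions about my app.
FROM node:lts

WORKDIR /app

# Copy the app that sits next to this Dockerfile in the same repo.
# Copying package files first lets Docker cache the npm install layer.
COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 3000
CMD ["npm", "start"]
```

Presumably an update would then be: `git pull` on the instance, rebuild the image, and restart the container — but confirmation of that workflow is part of what I'm asking.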
Any comments or examples welcome :). Thanks!