How to periodically run a script inside a fresh Docker container on Amazon AWS?

I have a Docker image for a web app that needs a specific script run once a day. The web app will be deployed to a web tier in Elastic Beanstalk. What would be the best way to create a container from this image, run the script, and then remove the container?

Ideally, I’d like an easy way to do the following every 24 hours: create a worker environment in Elastic Beanstalk with a new Docker container, run the script, and then remove the worker environment. But this doesn’t seem possible.
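To make it concrete, the simplest version of what I’m after would be something like a plain cron entry on a Docker host (the image name and script path below are made up):

```shell
# crontab fragment (hypothetical names): every day at 02:00, start a
# fresh container from the image, run the script, and let Docker remove
# the container afterwards via --rm.
0 2 * * * docker run --rm my-webapp-image /app/daily_script.sh
```

But managing my own EC2 host just for this feels like it defeats the point of using Elastic Beanstalk.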

I’ve found a few AWS services that might help (Simple Workflow, OpsWorks, Data Pipeline), but they all seem massively complicated for what I want.

Any suggestions?
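For context, the closest built-in mechanism I’ve seen is the worker tier’s cron.yaml, which periodically POSTs to a URL inside the already-running worker containers rather than creating and destroying a container (the URL path here is made up):

```yaml
version: 1
cron:
  - name: "daily-script"
    url: "/run-daily-script"   # hypothetical endpoint in my app
    schedule: "0 2 * * *"      # every day at 02:00 UTC
```

That schedules the work, but it doesn’t give me the fresh-container-per-run behavior I described above.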
