Is it possible to directly call docker run from AWS Lambda?

I have a Java standalone application which I have dockerized. I want to run this container every time an object is put into S3 storage. One way is to do it via AWS Batch, which I am trying to avoid.

Is there a direct and easy way to call docker run from a Lambda function?

Answer

    Yes and no.

    What you can’t do is execute docker run to run a container within the context of the Lambda call. But you can trigger a task on ECS to be executed. For this to work, you need to have a cluster set up on ECS, which means you need to pay for at least one EC2 instance. Because of that, it might be better to not use Docker, but I know too little about your application to judge that.

    There are a lot of articles out there about how to connect S3, Lambda, and ECS. Here is a pretty in-depth article by Amazon that you might be interested in:

    https://aws.amazon.com/blogs/compute/better-together-amazon-ecs-and-aws-lambda/

    If you are looking for code, this repository implements what is discussed in the above article:

    https://github.com/awslabs/lambda-ecs-worker-pattern

    Here is a snippet we use in our Lambda function (Python) to run a Docker container from Lambda:

    import boto3

    # Launch a one-off ECS task; cluster, task_definition and overrides
    # are resolved at runtime from the Lambda invocation.
    result = boto3.client('ecs').run_task(
        cluster=cluster,                  # name of the ECS cluster to run on
        taskDefinition=task_definition,   # task definition naming the container image
        overrides=overrides,              # per-invocation overrides, e.g. the command
        count=1,                          # start a single task
        startedBy='lambda'                # tag the task so its origin is visible in ECS
    )


    We pass in the name of the cluster on which we want to run the container, as well as the task definition that defines which container to run, the resources it needs, and so on. overrides is a dictionary/map with settings that you want to override in the task definition, which we use to specify the command we want to run (i.e. the arguments you would otherwise pass to docker run). This enables us to use the same Lambda function to run a lot of different jobs on ECS.
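    To make that concrete, here is a hedged sketch of a complete handler for your S3-triggered case: it pulls the bucket and key out of the S3 event and turns them into a containerOverrides command. The container name ('worker'), cluster name, task definition, and the --bucket/--key flags are all hypothetical placeholders; substitute the names from your own task definition and the arguments your Java application actually parses.

    ```python
    def build_overrides(event, container_name='worker'):
        """Build an ECS containerOverrides payload from an S3 event.

        'worker' is a hypothetical container name; use the container
        name from your own task definition.
        """
        record = event['Records'][0]
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        return {
            'containerOverrides': [{
                'name': container_name,
                # Equivalent to the argument list you would pass to docker run;
                # the flags are placeholders for whatever your app expects.
                'command': ['--bucket', bucket, '--key', key],
            }]
        }


    def handler(event, context):
        # boto3 is available in the Lambda runtime; imported here so the
        # override-building logic above can be tested without it.
        import boto3

        return boto3.client('ecs').run_task(
            cluster='my-cluster',          # placeholder cluster name
            taskDefinition='my-task:1',    # placeholder task definition
            overrides=build_overrides(event),
            count=1,
            startedBy='lambda',
        )
    ```

    Keeping the job-specific details in overrides, rather than baking them into the task definition, is what lets one Lambda function dispatch many different jobs to the same ECS cluster.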

    Hope that points you in the right direction.
