Is it possible to directly call docker run from AWS lambda

I have a standalone Java application which I have dockerized. I want to run this container every time an object is put into S3 storage. One way is to do it via AWS Batch, which I am trying to avoid.

Is there a direct and easy way to call docker run from a lambda?

One Solution

    Yes and no.

    What you can’t do is execute docker run to run a container within the context of the Lambda call. But you can trigger a task on ECS to be executed. For this to work, you need to have a cluster set up on ECS, which means you need to pay for at least one EC2 instance. Because of that, it might be better not to use Docker at all, but I know too little about your application to judge that.

    There are a lot of articles out there about connecting S3, Lambda and ECS. Here is a pretty in-depth article by Amazon that you might be interested in:

    https://aws.amazon.com/blogs/compute/better-together-amazon-ecs-and-aws-lambda/

    If you are looking for code, this repository implements what is discussed in the above article:

    https://github.com/awslabs/lambda-ecs-worker-pattern

    Here is a snippet we use in our Lambda function (Python) to run a Docker container from Lambda:

    import boto3

    result = boto3.client('ecs').run_task(
        cluster=cluster,                 # name of the ECS cluster to run on
        taskDefinition=task_definition,  # which task definition (container) to run
        overrides=overrides,             # per-run overrides, e.g. the command
        count=1,                         # run a single task
        startedBy='lambda'               # tag so tasks are traceable to Lambda
    )
    

    We pass in the name of the cluster on which we want to run the container, as well as the task definition that defines which container to run, the resources it needs and so on. overrides is a dictionary/map with settings that you want to override in the task definition, which we use to specify the command we want to run (i.e. the argument to docker run). This enables us to use the same Lambda function to run a lot of different jobs on ECS.
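    To make the overrides part concrete, here is a minimal sketch of what such a dictionary can look like. The container name (`my-container`), image command and S3 URL are placeholders for illustration; the container name has to match the one declared in your task definition:

    ```python
    # Hypothetical overrides dict: replace the container's command for this run,
    # e.g. to pass the S3 object that triggered the Lambda function as an argument.
    overrides = {
        'containerOverrides': [
            {
                'name': 'my-container',  # must match the name in the task definition
                'command': ['java', '-jar', 'app.jar', 's3://my-bucket/my-key'],
            }
        ]
    }
    ```

    This would then be passed as the `overrides` argument to `run_task` as shown above.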

    Hope that points you in the right direction.
