CannotStartContainerError while submitting an AWS Batch Job

In AWS Batch I have a job definition, a job queue, and a compute environment in which to execute my AWS Batch jobs.
After submitting a job, it appears in the list of failed jobs with this error:

Status reason
Essential container in task exited
Container message
CannotStartContainerError: API error (404): oci runtime error: container_linux.go:247: starting container process caused "exec: \"/var/application/ --file= --key=. 

and in the CloudWatch logs I have:

container_linux.go:247: starting container process caused "exec: \"/var/application/ --file=Toulouse.json --key=out\": stat /var/application/ --file=Toulouse.json --key=out: no such file or directory"
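As a side note (not from the original thread): the `stat ... no such file or directory` line means the runtime treated the entire string `/var/application/ --file=... --key=out` as a single executable path, then failed to find a file with that literal name. The same failure mode can be reproduced locally with Python's `subprocess`:

```python
import subprocess

# Passing one space-separated string as a single argv element makes the
# OS look for a binary literally named "echo hello" -- the same failure
# mode as the Batch "exec" error above.
try:
    subprocess.run(["echo hello"])  # one token: no such executable
    print("unexpectedly succeeded")
except FileNotFoundError:
    print("FileNotFoundError: no such file or directory")

# Splitting each argument into its own element works as intended.
result = subprocess.run(["echo", "hello"], capture_output=True, text=True)
print(result.stdout.strip())  # hello
```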

    I have specified a correct Docker image that contains all the scripts (we already use it elsewhere and it works), and I don’t know where the error is coming from.
    Any suggestions are much appreciated.

    The Dockerfile is something like this:

    # Pull base image.
    VOLUME /tmp
    VOLUME /mount-point
    RUN chown -R ubuntu:ubuntu /var/application
    # Create the source directories
    USER ubuntu
    COPY application/ /var/application
    # Register aws profile
    COPY data/aws /home/ubuntu/.aws
    WORKDIR /var/application/
    RUN composer update -o && \
        rm -Rf /tmp/*

    Here is the Job Definition:

        "jobDefinitionName": "JobDefinition",
        "jobDefinitionArn": "arn:aws:batch:region:accountid:job-definition/JobDefinition:25",
        "revision": 21,
        "status": "ACTIVE",
        "type": "container",
        "parameters": {},
        "retryStrategy": {
            "attempts": 1
        "containerProperties": {
            "image": "",
            "vcpus": 1,
            "memory": 512,
            "command": [
            "volumes": [
                    "host": {
                        "sourcePath": "/mount-point"
                    "name": "logs"
                    "host": {
                        "sourcePath": "/var/log/php/errors.log"
                    "name": "php-errors-log"
                    "host": {
                        "sourcePath": "/tmp/"
                    "name": "tmp"
            "environment": [
                    "name": "APP_ENV",
                    "value": "dev"
            "mountPoints": [
                    "containerPath": "/tmp/",
                    "readOnly": false,
                    "sourceVolume": "tmp"
                    "containerPath": "/var/log/php/errors.log",
                    "readOnly": false,
                    "sourceVolume": "php-errors-log"
                    "containerPath": "/mount-point",
                    "readOnly": false,
                    "sourceVolume": "logs"
            "ulimits": []

    In Cloudwatch log stream /var/log/docker:

    time="2017-06-09T12:23:21.014547063Z" level=error msg="Handler for GET /v1.17/containers/4150933a38d4f162ba402a3edd8b7763c6bbbd417fcce232964e4a79c2286f67/json returned error: No such container: 4150933a38d4f162ba402a3edd8b7763c6bbbd417fcce232964e4a79c2286f67" 

One solution for “CannotStartContainerError while submitting an AWS Batch Job”:

    This error occurred because the command was malformed. I was submitting the job from a Lambda function (Python 2.7) using boto3, and the command must be a list with one string per argument, like this:

    'command': ['sudo', 'mkdir', 'directory']

    Hope this helps somebody.
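A minimal sketch of the fixed submission (the job name, queue, and the `build_submit_job_kwargs` helper below are hypothetical, not from the original post). The essential point is that `command` is a list with one token per argument, never one space-separated string:

```python
import shlex

def build_submit_job_kwargs(command_line):
    """Build kwargs for batch.submit_job, splitting the command line
    into separate tokens (one string per argument). Helper and the
    jobName/jobQueue values are placeholders for illustration."""
    return {
        "jobName": "example-job",          # placeholder
        "jobQueue": "example-queue",       # placeholder
        "jobDefinition": "JobDefinition:25",
        "containerOverrides": {
            # shlex.split("sudo mkdir directory") -> ["sudo", "mkdir", "directory"]
            "command": shlex.split(command_line),
        },
    }

kwargs = build_submit_job_kwargs("sudo mkdir directory")
print(kwargs["containerOverrides"]["command"])  # ['sudo', 'mkdir', 'directory']

# With AWS credentials configured, submission would then be:
# import boto3
# boto3.client("batch").submit_job(**kwargs)
```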
