Is it possible to execute CMD in the middle of a Dockerfile?

I am installing hadoop-0.20.2 using Docker. I have two Dockerfiles: one for the Java installation and another for the Hadoop installation. I start the services with the CMD instruction:

    CMD ["path/to/start-all.sh"]

Now I want to write a third Dockerfile which executes an example MapReduce job. The problem is:

The third Dockerfile depends on the second (Hadoop) Dockerfile, for example:

    FROM sec_doc_file
    RUN /bin/hadoop fs -mkdir input
    

This requires the Hadoop services, but those services are only started when a container from the second image runs. I want to start them as part of the third Dockerfile, before the MapReduce job. Is that possible? If so, please provide an example. If not, what are the other possibilities?

    # something like:
    FROM sec_doc_file
    # start the Hadoop services
    RUN /bin/hadoop fs -mkdir input
    # continue with the MapReduce job
    

One solution:

The Docker image you use as the base for a new container is a base for files, not for running processes. To do what you want, you need to start the required process(es) during the docker build and run the setup commands while they are up. Each RUN instruction creates a new AUFS layer, but it does not keep alive any services started by a previous RUN. So, if a service needs to be up to perform some setup during docker build, you have to start it and do the setup within a single RUN instruction (by concatenating commands or by calling a custom script). Example:

    FROM Gops/sec_doc_file
    RUN path/to/start-all.sh && /bin/hadoop fs -mkdir input
    
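If the startup-plus-setup sequence gets long, the same idea works with a helper script copied into the image and executed in a single RUN. A minimal sketch, assuming a hypothetical setup_hdfs.sh next to the Dockerfile and the same paths as above:

    FROM Gops/sec_doc_file
    # setup_hdfs.sh is a hypothetical helper script; it must start the
    # daemons and perform the setup within this one build step
    COPY setup_hdfs.sh /usr/local/bin/setup_hdfs.sh
    RUN chmod +x /usr/local/bin/setup_hdfs.sh && /usr/local/bin/setup_hdfs.sh

where setup_hdfs.sh could look like:

    #!/bin/sh
    # start the Hadoop services, then create the HDFS folder,
    # all while this RUN layer is being built
    path/to/start-all.sh
    /bin/hadoop fs -mkdir input
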

So, to set up HDFS folders and files during docker build, you need to start the HDFS daemons and perform the desired actions within the same RUN instruction:

    RUN . /etc/hadoop/hadoop-env.sh && \
        /opt/hadoop/sbin/start-dfs.sh && \
        /opt/hadoop/bin/hdfs dfs -mkdir input
    
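Note that processes started during a RUN do not survive into the final image: containers started from the third image will still need the services brought up at runtime, for example by keeping the CMD from the second image:

    CMD ["path/to/start-all.sh"]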