Tag: dockerfile

Intermittent slow responses from a Dockerized Node.js app

I have a simple Node.js app on top of Elasticsearch (which runs on a separate machine). When I start the app with pm2 and sequentially make some random requests, performance is fine. However, when I Dockerize the app, about one in ten responses is very slow (about 5 seconds instead of 50 ms). This is […]

Permission denied error in Docker

I created a Docker image, but when I try to pass a parameter to it I get a permission denied error on my entry.sh file. Dockerfile: FROM ubuntu MAINTAINER ravat RUN mkdir -p /home/ravata/Desktop/DockerDemo/ RUN echo "Hello Apache server on Ubuntu Docker" > /home/ravata/Desktop/DockerDemo/index.html COPY entry.sh . CMD /bin/bash entry.sh […]
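A common fix for this kind of error is to mark the script executable at build time and invoke it via an exec-form ENTRYPOINT, so that arguments passed to `docker run` reach the script. A minimal sketch, keeping the paths from the excerpt and assuming the rest:

```dockerfile
FROM ubuntu
RUN mkdir -p /home/ravata/Desktop/DockerDemo/
RUN echo "Hello Apache server on Ubuntu Docker" > /home/ravata/Desktop/DockerDemo/index.html
COPY entry.sh /entry.sh
# COPY preserves the host file's permissions, so a script that was not
# executable on the host is not executable in the image either.
RUN chmod +x /entry.sh
# Exec-form ENTRYPOINT appends "docker run <image> <arg>" arguments to the
# script; the shell-form CMD in the excerpt would silently drop them.
ENTRYPOINT ["/entry.sh"]
```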

chmod in Dockerfile for Python

I have a Python script that downloads another script and executes it. Unfortunately the second script can’t be downloaded due to a lack of permissions. Error: retrbinary("RETR " + filename, open(filename… 2017-06-14 14:16:28 [APP/PROC/WEB/0] ERR PermissionError: [Errno 13] Permission denied: ‘aa.py’ I tried to grant permissions with chmod using this Dockerfile: FROM python:3 ADD ftapp.py / […]
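One likely cause is that the script runs in a directory it cannot write to, so `open(filename, 'wb')` fails while creating the downloaded file before any chmod of the script matters. A minimal sketch, assuming the file names from the excerpt:

```dockerfile
FROM python:3
ADD ftapp.py /
# Run from a directory the process can write to; the downloaded aa.py is
# created here by open(..., 'wb') inside ftapp.py. The /app name and mode
# are assumptions for illustration.
RUN mkdir -p /app && chmod 777 /app
WORKDIR /app
CMD ["python", "/ftapp.py"]
```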

Docker Cloud autotest can't find service

I am currently trying to dockerize one of my Django API projects. It uses Postgres as the database. I am using Docker Cloud as a CI service so that I can build, lint and run tests. I started with the following Dockerfile: # Start with a python 3.6 image FROM python:3.6 ENV PYTHONUNBUFFERED 1 ENV POSTGRES_USER […]
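In CI setups like this, a linked database is typically reachable under its service name rather than localhost, so the Django settings should read the host from the environment. A hedged fragment; the variable names and the `postgres` service name are assumptions:

```dockerfile
FROM python:3.6
ENV PYTHONUNBUFFERED 1
# Hypothetical defaults: in Docker Cloud (and docker-compose) the database
# container is resolvable by its service name, e.g. "postgres", not by
# localhost. settings.py would read these via os.environ.
ENV POSTGRES_USER postgres
ENV POSTGRES_HOST postgres
```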

File permission error when installing Meteor in Docker

I am new to Docker (2 weeks), but I’m making good progress writing a Dockerfile for a Meteor installation. Now I am seeing file-permission errors on the last line. Honestly, I think the wider problem is that I don’t fully understand the Linux file system and permissions, so I would be grateful for any pointers. […]
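A frequent cause of permission errors with the Meteor installer is running it as root: the install script and the ownership of `~/.meteor` both assume an unprivileged user. A sketch of the usual workaround; the base image and user name are assumptions:

```dockerfile
FROM debian:stretch
RUN apt-get update && apt-get install -y curl && rm -rf /var/lib/apt/lists/*
# Create an unprivileged user and switch to it before installing; HOME is
# set explicitly because the installer writes into ~/.meteor.
RUN useradd --create-home meteoruser
USER meteoruser
ENV HOME /home/meteoruser
WORKDIR /home/meteoruser
RUN curl https://install.meteor.com/ | sh
```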

Configuring application in docker container to access other containers

How can you configure an app deployed as a Docker container to reference components running in other containers? I have a Node app that requires PostgreSQL. I have a Node config file that contains all the connection information for Postgres. In a non-Docker deployment, you simply set the required config values (e.g. IP address of […]
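The usual pattern is to keep connection details out of the image, inject them as environment variables, and use the other container's name as the hostname on a shared user-defined Docker network. A sketch; the file names, port, and the `postgres` container name are assumptions:

```dockerfile
FROM node:8
WORKDIR /usr/src/app
COPY . .
RUN npm install
# Hypothetical defaults, overridable with `docker run -e`. On a user-defined
# network, "postgres" resolves to the container of that name, replacing the
# hard-coded IP address in the config file.
ENV PGHOST postgres
ENV PGPORT 5432
CMD ["node", "server.js"]
```

At run time the two containers would share a network, e.g. `docker network create mynet`, then start the database with `--network mynet --name postgres` and the app with `--network mynet`.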

How to build a Docker image from a Dockerfile using the Tomcat already installed on Windows, without downloading a new Tomcat?

I am new to Docker. I have created a Dockerfile for a .war file. When I build the Docker image from the Dockerfile, it starts downloading Tomcat 8. However, I want to use the Tomcat I already have on my Windows machine to create the Docker image, rather than downloading it from the internet. Dockerfile: FROM tomcat:8.5.11-jre8 […]
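One option is to copy a locally unpacked Tomcat into a plain JRE base image instead of using the `tomcat` image. A sketch, assuming the Tomcat directory and the .war file (names hypothetical) have been placed in the build context:

```dockerfile
# Sketch: reuse a locally unpacked Tomcat instead of pulling tomcat:8.5.11-jre8.
# Assumes apache-tomcat-8.5.11/ and myapp.war sit next to the Dockerfile.
FROM openjdk:8-jre
ENV CATALINA_HOME /usr/local/tomcat
COPY apache-tomcat-8.5.11 ${CATALINA_HOME}
COPY myapp.war ${CATALINA_HOME}/webapps/
EXPOSE 8080
CMD ["/usr/local/tomcat/bin/catalina.sh", "run"]
```

Note that the JRE base image is still pulled once; only the Tomcat download is avoided.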

Multiple PostgreSQL DBs in a data-only container

I’m trying to hook up and contain my application in Docker containers. One problem I have been having is configuring the DB for the application. I have been reading conflicting advice about whether data-only containers (DOCs) should be used: some places say they should, others that they shouldn’t. So when should they be used? I also […]
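Since Docker 1.9, named volumes cover most of what data-only containers were used for and are generally the simpler choice. A sketch of the volume declaration; note that the official `postgres` image already declares this volume, so the explicit line is illustrative:

```dockerfile
# Sketch: declare the data directory as a volume; at run time a named volume
# keeps the data independent of any particular container's lifecycle.
FROM postgres:9.6
VOLUME /var/lib/postgresql/data
```

The named volume is then attached at run time, e.g. `docker run -v pgdata:/var/lib/postgresql/data postgres:9.6`, where `pgdata` is an assumed volume name.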

Service 'web' failed to build: lstat apache/sites-enabled/000-default.conf: no such file or directory

I want to # Update the default apache site with the config we created. COPY ./apache/sites-enabled/000-default.conf /etc/apache2/sites-enabled/000-default.conf I know my 000-default.conf is not in the same directory as the Dockerfile, which is why I reference it as ./apache/…, but I still get the error. Structure: dockerizing-django ├── apache │   └── sites-enabled │   └── […]
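The `lstat … no such file or directory` error means the path is being resolved against the build context (the directory passed to `docker build`), not against the Dockerfile's location. The usual fix is to run the build from the project root and point at the Dockerfile with `-f`, so `apache/` is inside the context. A sketch; the Dockerfile path in the comment is an assumption:

```dockerfile
# COPY sources are resolved inside the build context, never outside it.
# Build from the project root so apache/ is part of the context, e.g.:
#   docker build -f path/to/Dockerfile -t web .
COPY ./apache/sites-enabled/000-default.conf /etc/apache2/sites-enabled/000-default.conf
```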

CrashLoopBackOff in spark cluster in kubernetes: nohup: can't execute '–': No such file or directory

Dockerfile: FROM openjdk:8-alpine RUN apk update && \ apk add curl bash procps ENV SPARK_VER 2.1.1 ENV HADOOP_VER 2.7 ENV SPARK_HOME /opt/spark # Get Spark from US Apache mirror. RUN mkdir -p /opt && \ cd /opt && \ curl http://www.us.apache.org/dist/spark/spark-${SPARK_VER}/spark-${SPARK_VER}-bin-hadoop${HADOOP_VER}.tgz | \ tar -zx && \ ln -s spark-${SPARK_VER}-bin-hadoop${HADOOP_VER} spark && \ echo Spark […]
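The `nohup: can't execute '–'` message from BusyBox usually means a literal en-dash (–) was pasted where a double hyphen (`--`) belongs, so `nohup` tries to execute `–` as a program. A sketch of a start command with real double hyphens; the Spark master flags shown are assumptions about what the original command intended:

```dockerfile
# Sketch: make sure option prefixes are real double hyphens ("--"), not
# en-dashes ("–") introduced by copy-pasting from a web page or document;
# BusyBox nohup otherwise treats "–" as the command to run.
CMD ["sh", "-c", "${SPARK_HOME}/bin/spark-class org.apache.spark.deploy.master.Master --host 0.0.0.0 --port 7077"]
```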

Docker is an open platform for developers and sysadmins to build, ship, and run distributed applications.