How to use GitLab CI in combination with a JUnit Rule that launches Docker containers?

I have a set of integration tests that rely on a PostgreSQL database being available. To keep the tests independent, I am using this project to start a Postgres Docker container before the tests:

DockerRule postgresDockerRule = DockerRule
            .expose(databaseConfig.port().toString(), "5432")
            .env("POSTGRES_PASSWORD", databaseConfig.password())
            .waitForMessage("PostgreSQL init process complete; ready for start up.", 60)

This works fine locally. The rule starts up the container, the tests are run and after the tests, the container is deleted.

However, I am having trouble getting these tests to run on GitLab CI.

    The tests always fail with the following exception (this is the end of a longer stacktrace):

    Caused by: java.io.IOException: No such file or directory
    at jnr.unixsocket.UnixSocketChannel.doConnect(…)
    at jnr.unixsocket.UnixSocketChannel.connect(…)
    at com.spotify.docker.client.ApacheUnixSocket.connect(…)
    at com.spotify.docker.client.UnixConnectionSocketFactory.connectSocket(…)
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(…)
    at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(…)
    at org.apache.http.impl.execchain.MainClientExec.establishRoute(…)
    at org.apache.http.impl.execchain.MainClientExec.execute(…)
    at org.apache.http.impl.execchain.ProtocolExec.execute(…)
    at org.apache.http.impl.execchain.RetryExec.execute(…)
    at org.apache.http.impl.execchain.RedirectExec.execute(…)
    at org.apache.http.impl.client.InternalHttpClient.doExecute(…)
    at org.apache.http.impl.client.CloseableHttpClient.execute(…)
    at org.glassfish.jersey.apache.connector.ApacheConnector.apply(…)
    ... 21 more

    The project providing the DockerRule uses the Spotify docker-client to connect to the remote API of the Docker daemon. That is why it throws an IOException stating “No such file or directory”: the client cannot find the Docker socket.
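The default endpoint such clients try is the daemon's Unix socket at `/var/run/docker.sock` (unless `DOCKER_HOST` points elsewhere). A quick shell check, a sketch using the standard Docker paths rather than commands from the post, makes the failure mode visible:

```shell
# Inside a plain CI container the daemon socket usually does not exist,
# which is exactly what produces the "No such file or directory" error.
if [ -S /var/run/docker.sock ]; then
  echo "docker socket: present"
else
  echo "docker socket: missing"
fi
# If DOCKER_HOST is set, the client would use that endpoint instead.
echo "DOCKER_HOST=${DOCKER_HOST:-unset}"
```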

    My .gitlab-ci.yml file looks like this:

      stages:
        - build
        - deploy

      build:
        image: openjdk:8
        stage: build
        script:
          - ./gradlew clean build -Dorg.gradle.parallel=true
        artifacts:
          when: always
          paths:
            - 'rest-api/build/distributions/*.zip'
            - '*/build/reports/*'

      deploy:
        image: governmentpaas/cf-cli
        stage: deploy
        script:
          - cf api ...
          - cf auth ...
          - cf target -o XXX -s development
          - cf push ....
        only:
          - master

    What I would like to achieve is:

    • Integration tests are run locally and during the CI process
    • Integration tests connect to a real database
    • No difference between local and CI test configuration

    I thought about providing the Postgres database as a service during the CI process using the services section of .gitlab-ci.yml. But that would mean I have to manually start a Postgres database before I can run my integration tests locally. What I liked about the JUnit rule approach is that I can run my integration tests like any other tests, just by having Docker running in the background.
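For comparison, the services-based setup mentioned above would look roughly like this in `.gitlab-ci.yml` (a sketch; the job name, image tag, and password variable are assumptions, not taken from the original configuration):

```yaml
integration-test:
  image: openjdk:8
  stage: build
  services:
    - postgres:9.6              # assumed tag; job variables are passed to the service
  variables:
    POSTGRES_PASSWORD: secret   # assumed value; should mirror databaseConfig.password()
  script:
    # Inside the job, the service is reachable under the host name "postgres".
    - ./gradlew integrationTest # assumed task name
```

The trade-off is exactly the one described: the service container only exists inside the CI job, so the local test run would need a separately started database.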

    It would be nice if someone could come up with a solution that allows me to connect to a Docker daemon during the CI process, but I am also happy to hear ideas on how to change my overall integration-testing setup to make this work.
