How to use GitLab CI in combination with a JUnit Rule that launches Docker containers?

I have a set of integration tests that rely on a postgres database being available. In order for the tests to be independent, I am using this project to start a postgres docker container before the tests:

// Excerpt – the builder boilerplate and image name were omitted above;
// "postgres:9.6" and DockerRule.builder()/.build() are filled in here as assumptions.
@Rule
public DockerRule postgresDockerRule = DockerRule.builder()
            .imageName("postgres:9.6")
            .expose(databaseConfig.port().toString(), "5432")
            .env("POSTGRES_PASSWORD", databaseConfig.password())
            .waitForMessage("PostgreSQL init process complete; ready for start up.", 60)
            .build();
This works fine locally. The rule starts up the container, the tests are run and after the tests, the container is deleted.

    However, I am having trouble getting these tests to run on GitLab CI.

    The tests always fail with the following exception (this is the end of a longer stacktrace):

    Caused by: java.io.IOException: No such file or directory
    at jnr.unixsocket.UnixSocketChannel.doConnect(...)
    at jnr.unixsocket.UnixSocketChannel.connect(...)
    at com.spotify.docker.client.ApacheUnixSocket.connect(...)
    at com.spotify.docker.client.UnixConnectionSocketFactory.connectSocket(...)
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(...)
    at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(...)
    at org.apache.http.impl.execchain.MainClientExec.establishRoute(...)
    at org.apache.http.impl.execchain.MainClientExec.execute(...)
    at org.apache.http.impl.execchain.ProtocolExec.execute(...)
    at org.apache.http.impl.execchain.RetryExec.execute(...)
    at org.apache.http.impl.execchain.RedirectExec.execute(...)
    at org.apache.http.impl.client.InternalHttpClient.doExecute(...)
    at org.apache.http.impl.client.CloseableHttpClient.execute(...)
    at org.glassfish.jersey.apache.connector.ApacheConnector.apply(...)
    ... 21 more

    The project providing the DockerRule uses the Spotify docker-client to talk to the Docker daemon's remote API. (That is why it throws an IOException stating “No such file or directory” – it cannot find the unix socket.)
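    To illustrate why the socket is not found, here is a small stdlib-only sketch that roughly mirrors how such clients resolve their endpoint (the `DOCKER_HOST` variable first, otherwise the default unix socket); the class name and messages are my own, not the Spotify client's actual code:

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class DockerSocketCheck {
    // Roughly how docker clients pick an endpoint:
    // DOCKER_HOST wins; otherwise the default unix socket is tried.
    static String resolveEndpoint() {
        String host = System.getenv("DOCKER_HOST");
        if (host != null && !host.isEmpty()) {
            return host;
        }
        if (Files.exists(Paths.get("/var/run/docker.sock"))) {
            return "unix:///var/run/docker.sock";
        }
        return null; // nothing to connect to -> "No such file or directory"
    }

    public static void main(String[] args) {
        String endpoint = resolveEndpoint();
        System.out.println(endpoint != null
                ? "Docker endpoint: " + endpoint
                : "No Docker endpoint found");
    }
}
```

    On a plain GitLab CI runner, neither `DOCKER_HOST` nor `/var/run/docker.sock` is available inside the job container, which matches the exception above.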

    My .gitlab-ci.yml file looks like this:

      stages:
        - build
        - deploy

      build:
        image: openjdk:8
        stage: build
        script:
          - ./gradlew clean build -Dorg.gradle.parallel=true
        artifacts:
          when: always
          paths:
            - 'rest-api/build/distributions/*.zip'
            - '*/build/reports/*'

      deploy:
        image: governmentpaas/cf-cli
        stage: deploy
        script:
          - cf api ...
          - cf auth ...
          - cf target -o XXX -s development
          - cf push ....
        only:
          - master

    What I would like to achieve is:

    • Integration tests are run locally and during the CI process
    • Integration tests connect to a real database
    • No difference between local and CI test configuration

    I thought about providing the postgres database as a service during the CI process using the services section of .gitlab-ci.yml. But that would mean that I have to manually start up a postgres database before I can run my integration tests locally. What I liked about the junit rule approach was that I could simply run my integration tests like any other tests by just having docker running in the background.
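    Concretely, that services-based alternative would look something like this in .gitlab-ci.yml (the image tag, job name, Gradle task, and password are placeholders of mine, not my actual configuration):

      integration-test:
        image: openjdk:8
        stage: build
        services:
          - postgres:9.6
        variables:
          POSTGRES_PASSWORD: secret        # placeholder value
        script:
          # the service is reachable at host "postgres", port 5432
          - ./gradlew integrationTest      # placeholder task name

    This works in CI, but locally I would then have to start and configure an equivalent postgres instance by hand, which is exactly what the JUnit rule was meant to avoid.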

    It would be nice if someone could come up with a solution that lets me connect to a Docker daemon during the CI process, but I am also happy to hear ideas on how to change my overall integration-testing setup to make this work.
