Integrate private files for open source application code in devops pipeline

I have a Java application that serves as the backend to our entire project, and all of our software is fully open source. The problem is as follows:

We use several keys and credentials, from Google and SendGrid API keys to MySQL database authentication details. All of these are stored in the application.properties file. The code we have on GitHub is identical except that these files are missing. The current workaround is that they exist only on the Heroku remote through which we deploy to Heroku. Whenever we wish to deploy a change, we switch to the heroku branch, pull the changes from master, rebase the branch, and finally push it.
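
For context, this is roughly how a backend like ours reads those values today (a minimal sketch using plain java.util.Properties; the property names are hypothetical stand-ins for the real Google, SendGrid and MySQL entries):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    public class AppConfig {

        // application.properties is excluded from the public GitHub repo,
        // so this only works where the file is actually present
        // (currently only on the heroku branch / Heroku remote).
        public static Properties load() throws IOException {
            Properties props = new Properties();
            try (InputStream in = AppConfig.class
                    .getResourceAsStream("/application.properties")) {
                if (in == null) {
                    throw new IOException("application.properties not found on classpath");
                }
                props.load(in);
            }
            return props;
        }

        public static void main(String[] args) throws IOException {
            Properties props = load();
            String sendGridKey = props.getProperty("sendgrid.api.key"); // hypothetical key
            String dbPassword  = props.getProperty("mysql.password");   // hypothetical key
            System.out.println("Loaded " + props.size() + " properties");
        }
    }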

We’re soon shifting to Cycle.io, which runs containerized applications, so basically we’re shifting to Docker. We’re going to have the application dockerized and a MySQL Docker instance deployed alongside it, with the two communicating with each other. To quote from Cycle:

    An environment allows you to organize and communicate between your
    containers. When you click ‘start’ on a container from the environment,
    the networking automatically configures, and notifies any other
    containers within the environment of its presence.

Cycle.io allows us either to link images from Docker Hub or to point it at a git repository containing a Dockerfile; it picks the code up from there and then automatically deploys it.

My question is: how do I integrate these private files into my code-build-test-deploy pipeline? Naively, I was thinking of some way that ‘injects’ these files before building, but I have no clue how to do so. My current idea for Docker deployment was:

  • Build a Docker image on my local system that works
  • Push it to a private repository on Docker Hub and let Cycle.io, which integrates with Docker Hub, pick it up from there
  • (I want to integrate Travis CI too, but I don’t know how)

Is there any SENSIBLE, or for lack of a better word, PRO way of doing this? We’re a group of students looking to follow best practices, and this application is going to scale. We’ve already got our deployment and domain name costs fully covered, but we don’t really have any expert guidance.

One Solution

Cycle allows you to configure environment variables per environment, and they are treated the same way environment variables on your local machine would be treated. This means you can, for example, configure the SENDGRID_API_KEY variable, set it in Cycle to your API key (navigate to Environments -> Container -> Config), and have your code read that variable directly.
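
For example (just a sketch, assuming plain Java and no particular framework), reading the key becomes a single environment lookup instead of a file read:

    public class SendGridConfig {

        // The same call works on a local machine, in a plain Docker container,
        // and in a Cycle environment: anywhere SENDGRID_API_KEY is set.
        public static String apiKey() {
            return System.getenv("SENDGRID_API_KEY");
        }
    }

If the backend happens to be Spring Boot, a committed application.properties can instead keep a placeholder such as sendgrid.api-key=${SENDGRID_API_KEY}, and Spring will resolve it from the environment at startup.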

You can declare environment variables in your Dockerfile (ENV SENDGRID_API_KEY=""), and Cycle will pre-populate those so that you can just fill in the box and hit save. You’ll need to restart the container, but after that your variables are available to your program without you having to manage moving files around.
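
One caveat worth coding for: a variable declared as ENV SENDGRID_API_KEY="" starts out as an empty string until someone fills in the box and restarts the container, so a small fail-fast guard (again, only a sketch) keeps a misconfigured deploy from limping along silently:

    public final class RequiredEnv {

        private RequiredEnv() {
        }

        // Returns the named environment variable, or fails fast if it is
        // unset or still the empty placeholder value.
        public static String get(String name) {
            String value = System.getenv(name);
            if (value == null || value.isEmpty()) {
                throw new IllegalStateException(
                        "Missing required environment variable: " + name
                        + " (set it in Cycle under Environments -> Container -> Config"
                        + " and restart the container)");
            }
            return value;
        }
    }

Calling RequiredEnv.get("SENDGRID_API_KEY") once at startup surfaces a clear error in the container logs instead of a mysterious authentication failure later on.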
