Using Nginx as a microservice API gateway

We are splitting our monolith API into microservices.

We do not need rate-limiting, auth, caching, or any other gateway-like capabilities.

Would it be a valid approach to use very simple, stateless Nginx containers that route to the underlying services?

2 Solutions for “Using Nginx as a microservice API gateway”

    Yes. In Kubernetes, Nginx can run as a Deployment exposed by a Service (of type LoadBalancer, or with an externalIP) and forward requests to your upstream services.
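    A minimal sketch of what that routing layer might look like. The service names (`users-service`, `orders-service`) and port are placeholders; in a cluster they would be the DNS names of your Kubernetes Services:

    ```nginx
    # Minimal nginx.conf for path-based routing to two hypothetical services.
    events {}

    http {
        server {
            listen 80;

            location /users/ {
                # Kubernetes Service DNS name (placeholder); resolved at startup
                proxy_pass http://users-service:8080/;
            }

            location /orders/ {
                proxy_pass http://orders-service:8080/;
            }
        }
    }
    ```

    Because the containers hold no state of their own, you can scale this Deployment horizontally behind the Service.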

    You might have to change nginx.conf frequently, though (whenever you add or remove services), so I would recommend keeping nginx.conf in a ConfigMap and mounting it as a volume in your Deployment. Refer: http://kubernetes.io/docs/user-guide/configmap/ and scroll down to the section on consuming a ConfigMap via volumes.
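    A sketch of that wiring, assuming current `apps/v1` manifests; all names and the image tag are placeholders:

    ```yaml
    # ConfigMap holding nginx.conf, mounted into the Nginx Deployment.
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: nginx-gateway-conf
    data:
      nginx.conf: |
        events {}
        http {
          server {
            listen 80;
            location / {
              proxy_pass http://backend-service:8080/;
            }
          }
        }
    ---
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: nginx-gateway
    spec:
      replicas: 2
      selector:
        matchLabels:
          app: nginx-gateway
      template:
        metadata:
          labels:
            app: nginx-gateway
        spec:
          containers:
            - name: nginx
              image: nginx:1.25
              ports:
                - containerPort: 80
              volumeMounts:
                - name: conf
                  mountPath: /etc/nginx/nginx.conf
                  subPath: nginx.conf   # mount the single file, not a directory
          volumes:
            - name: conf
              configMap:
                name: nginx-gateway-conf
    ```

    Updating the ConfigMap then updates the config without rebuilding the image, though the pods still need a rollout (or a reload) to pick it up.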

    Another thing to keep in mind: if you delete and recreate a Service that nginx.conf refers to as an upstream, you’ll have to restart your Deployment, because Nginx resolves upstream DNS names only once, at startup.
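    One hedged workaround: using a variable in `proxy_pass` together with a `resolver` directive makes Nginx re-resolve the name at request time instead of once at startup. The resolver IP below is a common kube-dns ClusterIP, but yours may differ (check `kubectl get svc -n kube-system`):

    ```nginx
    location /users/ {
        # Point Nginx at the cluster DNS; re-check the answer every 10s.
        resolver 10.96.0.10 valid=10s;

        # Using a variable forces per-request DNS resolution.
        set $upstream http://users-service.default.svc.cluster.local:8080;
        proxy_pass $upstream;
    }
    ```

    Note that `proxy_pass` with a variable does not apply the usual URI-prefix rewriting, so keep the location/path mapping simple if you use this approach.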
