Design approach for hosting multiple microservices on the same host [closed]
I’m working on a web application that I’ve decoupled into multiple containerized microservices. I have around 20 services now, but the whole system will eventually need more than 300. Most of the current services, and some future ones, won’t need an entire machine, so I’ll deploy multiple services on the same host. I’m wondering how others handle interservice communication. My preferred way was to go with REST-based communication, but…
Isn’t it too heavy to have multiple web servers running on the same machine? I’m developing in Ruby, and even a lightweight web server like Puma can consume a fair amount of memory.
I started writing a custom communication channel using UNIX sockets. I’d start one web server, and my “router” app would talk to the services currently running on that host through UNIX sockets. But I don’t know if it’s worth the effort, and on top of that, every service would have to be written and customized to use this kind of communication. I believe it would be hard to use a framework like Ruby on Rails, let alone different languages, which is the whole appeal of a microservices architecture. I feel like I’m trying to reinvent the wheel.
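For context, the kind of channel described above can be sketched with nothing but Ruby’s standard `socket` library. This is a minimal illustration, not the asker’s actual code: the socket path and the newline-delimited framing are assumptions made for the example.

```ruby
require "socket"
require "tmpdir"

# Illustrative socket path; a real router would use a well-known location.
path = File.join(Dir.mktmpdir, "service.sock")

server = UNIXServer.new(path)

# Service side: answer one newline-framed request, then close.
service = Thread.new do
  client = server.accept
  request = client.gets("\n")       # naive framing: one line per message
  client.write("echo: #{request}")
  client.close
end

# "Router" side: connect over the UNIX socket and exchange one message.
sock = UNIXSocket.new(path)
sock.write("ping\n")
reply = sock.gets("\n")
sock.close

service.join
server.close

reply # => "echo: ping\n"
```

Even this toy version shows the cost: framing, error handling, and concurrency all have to be invented, which is exactly what HTTP servers already provide.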
So, can someone suggest a better approach, or vote for one of my current ones?

I appreciate any help.
1 Answer
Looks like you may want to look into Docker Swarm; they’re actively working on these use cases. I wouldn’t recommend building your own communication channel: stick with HTTP, or maybe use SPDY if you’re really concerned about performance. Anything custom you introduce will make adopting those upcoming solutions more difficult. Also keep in mind that you don’t need a heavy-duty web server in most cases; you can always put a layer such as nginx or HAProxy in front of one or more of your services.
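One way to get the best of both concerns above: keep each service as an ordinary Rack/HTTP app, but have Puma bind to a UNIX socket instead of a TCP port, and let nginx do the routing. A sketch of the Puma side (the socket path and worker/thread counts are illustrative, not recommendations):

```ruby
# config/puma.rb -- bind this service to a UNIX socket instead of a TCP port
bind "unix:///var/run/myservice.sock"

# Keep the per-service footprint small on a shared host.
workers 1
threads 1, 4
```

nginx can then route to each service with `proxy_pass http://unix:/var/run/myservice.sock:;` per `location` block, so services stay plain HTTP apps written in any framework or language, with no custom protocol involved.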