Host Global Python Object in Docker Container

I have a Python object that's pretty big (~10 GB). I want to host the object in a Docker container and expose some of its methods to other Docker containers, which is possible over TCP. Most web application frameworks don't guarantee a single main process where the object can live in peace. How should I go about this?
The problem is analogous to having global state in applications such as Celery or Flask. The object is immutable, so it's sort of like a database, but I'd really like it to exist as a Python object.
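One minimal sketch of what I have in mind, using only the standard library's XML-RPC server: by default `SimpleXMLRPCServer` handles requests in a single process, so the object is loaded exactly once. `BigObject` and its `lookup` method are hypothetical placeholders for the real ~10 GB object.

```python
# Sketch: expose methods of one in-memory object over TCP using the
# stdlib xmlrpc server. SimpleXMLRPCServer serves requests from a
# single process by default, so the large object is loaded only once.
# "BigObject" and "lookup" are hypothetical stand-ins.
from xmlrpc.server import SimpleXMLRPCServer

class BigObject:
    """Stand-in for the large immutable object."""
    def __init__(self):
        # Imagine ~10 GB of data being loaded here instead.
        self.data = {"answer": 42}

    def lookup(self, key):
        return self.data.get(key)

def main():
    obj = BigObject()
    # Bind to 0.0.0.0 so other containers on the same Docker network
    # can reach the service on port 8000.
    server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
    server.register_instance(obj)  # exposes obj's public methods
    server.serve_forever()

if __name__ == "__main__":
    main()
```

A client container could then call it with `xmlrpc.client.ServerProxy("http://bigobject:8000").lookup("answer")`, assuming the server container is reachable as `bigobject` on the Docker network. Is something along these lines reasonable, or is there a better-suited approach?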
