De-allocating memory after python tensorflow workbook execution

To limit memory usage I read “How to prevent tensorflow from allocating the totality of a GPU memory?” and tried this code:

# Assume that you have 12GB of GPU memory and want to allocate ~4GB:
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
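For reference, the same TF 1.x API also offers allow_growth, which starts with a small allocation and grows on demand instead of reserving a fixed fraction up front. Note that, like the fraction option, it still does not return memory to the OS until the process exits, so it does not solve the de-allocation problem below:

```python
import tensorflow as tf

# Alternative: allocate GPU memory on demand rather than a fixed fraction.
# (TF 1.x API; memory is still only returned to the OS when the process exits.)
gpu_options = tf.GPUOptions(allow_growth=True)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
```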

These commands did reduce memory usage, but the memory is not de-allocated after the code completes. This issue describes the same behaviour; a suggested fix is to update the driver:

“After upgrading the GPU driver from 352.79 to 367.35 (the newest one), the problem disappeared.”

Unfortunately I’m not in a position to update to the latest version of the driver. Has this issue been resolved?

I also considered limiting the available memory to the docker container. The documentation states “Containers can be constrained to a limited set of resources on a system (e.g. one CPU core and 1GB of memory)”, but my kernel does not currently support this fully. Here I try to limit a new docker instance to 1GB of memory:

nvidia-docker run -m 1024m -d -it -p 8889:8889 -v /users/user1234/jupyter:/notebooks --name tensorflow-gpu-1GB tensorflow/tensorflow:latest-cpu

But this does not appear possible, as I receive the warning:

“WARNING: Your kernel does not support swap limit capabilities, memory limited without swap.”
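On Debian/Ubuntu hosts that boot via GRUB, that warning can usually be cleared by enabling memory-cgroup swap accounting in the kernel boot parameters. A sketch, assuming a GRUB-based host with sudo access (paths and flags are the stock Docker-documented ones, not from the question):

```shell
# /etc/default/grub — enable memory cgroup swap accounting
GRUB_CMDLINE_LINUX="cgroup_enable=memory swapaccount=1"

# then regenerate the GRUB config and reboot for it to take effect:
#   sudo update-grub && sudo reboot
```

After the reboot, docker run -m should enforce the limit without the swap warning.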

Is there a command to free memory after a tensorflow python workbook completes?


After killing / restarting the notebook the memory is de-allocated, but how can I free memory after completion, from within the notebook?

One solution:

IPython and Jupyter notebooks will not free memory unless you use del or %xdel on your objects. From the %xdel documentation:

“Delete a variable, trying to clear it from anywhere that IPython’s machinery has references to it. By default, this uses the identity of the named object in the user namespace to remove references held under other names. The object is also removed from the output history.”
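The effect can be sketched in plain Python: once the last reference to an object is dropped and any reference cycles are collected, the object is actually freed. (This frees host-side Python objects; TensorFlow’s GPU allocator still holds its pool until the process exits, which is why the GPU memory itself only returns on kernel restart.)

```python
import gc
import weakref

class BigObject:
    """Stand-in for a large result held by the notebook."""
    pass

obj = BigObject()
ref = weakref.ref(obj)  # track the object without keeping it alive

del obj        # drop the notebook's reference (what %xdel does, plus output-history cleanup)
gc.collect()   # force collection of any lingering reference cycles

print(ref() is None)  # True: the object has been freed
```

In a notebook, %xdel goes further than del by also purging the Out[] history, which commonly keeps large results alive.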
