De-allocating memory after python tensorflow workbook execution

To limit memory usage I read How to prevent tensorflow from allocating the totality of a GPU memory? and tried this code:

# Assume that you have 12GB of GPU memory and want to allocate ~4GB:
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
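An alternative to a fixed fraction (assuming the same TF 1.x Session API as the snippet above) is to let TensorFlow grow its allocation on demand instead of reserving memory up front; this is a configuration sketch, not a fix for the de-allocation problem:

```python
import tensorflow as tf

# Start with a small allocation and grow it as needed, rather than
# reserving a fixed fraction of GPU memory when the session starts.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
```

Note that even with allow_growth, TensorFlow does not hand memory back to the GPU while the process is alive; the allocation is only released when the process exits.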

These commands did free up memory, but memory is not de-allocated after code completion. This issue describes the problem; a suggested fix is to update the GPU driver:
“After upgrading the GPU driver from 352.79 to 367.35 (the newest one), the problem disappeared.”
Unfortunately I’m not in a position to update to the latest version of the driver. Has this issue been resolved?

  • I also considered limiting the memory available to the docker container.
    The documentation states that “Containers can be constrained to a limited set of resources on a system (e.g. one CPU core and 1GB of memory)”, but my kernel does not currently support this. Here I try to limit a new docker instance to 1GB of memory:

    nvidia-docker run -m 1024m -d -it -p 8889:8889 -v /users/user1234/jupyter:/notebooks --name tensorflow-gpu-1GB tensorflow/tensorflow:latest-cpu

    But this does not appear to be possible, as I receive this warning:
    WARNING: Your kernel does not support swap limit capabilities, memory limited without swap.

    Is there a command to free memory after tensorflow python workbook completion?


    After killing and restarting the notebook the memory is de-allocated, but how can memory be freed after completion, from within the notebook?
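The swap-limit warning above usually means memory cgroup accounting is disabled in the kernel. On a GRUB-based Debian/Ubuntu host (an assumption; paths and boot loaders differ by distro) it can be enabled with a boot-parameter change, which requires root and a reboot:

```shell
# /etc/default/grub  (assumed GRUB-based system; edit as root)
GRUB_CMDLINE_LINUX="cgroup_enable=memory swapaccount=1"

# then regenerate the boot config and reboot
sudo update-grub
sudo reboot
```

After the reboot, `docker run -m 1024m ...` should apply the memory limit without the warning.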

  • One solution collected from the web for “De-allocating memory after python tensorflow workbook execution”

    IPython and Jupyter notebooks will not free memory unless you use del on your objects or the %xdel magic, whose documentation reads:

    Delete a variable, trying to clear it from anywhere that IPython’s machinery has references to it. By default, this uses the identity of the named object in the user namespace to remove references held under other names. The object is also removed from the output history.
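As a sketch of the approach (plain Python, with the IPython magics shown as comments since they only run inside IPython):

```python
import gc

# Allocate a large object in the notebook session.
big = [0] * 10_000_000

# Drop the only reference; in IPython you could instead run
#   %xdel big     # also clears output-history references to it
#   %reset -f     # or wipe the whole user namespace
del big

# Optionally force a collection pass for objects caught in reference cycles.
gc.collect()
print("big" in dir())  # False: the name is gone from the namespace
```

Note that for GPU memory allocated by TensorFlow, deleting the Python objects is not enough by itself: TensorFlow's allocator keeps the memory for the lifetime of the process, so fully returning it to the GPU still requires ending the kernel process, as the question observes.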
