TensorFlow keeps all checkpoint files. How can this be prevented?

Since upgrading to TensorFlow 1.0, which introduced the new Saver V2, TensorFlow no longer deletes old checkpoint files according to the 'max_to_keep' argument. This is a problem on my system: my models are quite large, but my free disk space is limited.

Using the dummy program below, I get the following three files for every step from 1 to 10, while I expect only the files for the last 3 steps (8, 9, 10) to remain:

  • testfile-1.data-00000-of-00001
  • testfile-1.index
  • testfile-1.meta

Program:

import tensorflow as tf

a = tf.Variable(name='a', initial_value=0)
addops = a + 1

# Keep only the three most recent checkpoints.
saver = tf.train.Saver(max_to_keep=3)

config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
sess.run(tf.global_variables_initializer())

for i in range(10):
    sess.run(addops)
    # Writes testfile-1 ... testfile-10 into the current directory.
    save_path = saver.save(sess, 'testfile', global_step=i + 1)

sess.close()
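
To make the mismatch visible, here is a small check (a sketch, intended to run right after the loop above) comparing the checkpoints the Saver is still tracking against the files actually left on disk:

    import glob

    # The Saver's own bookkeeping honours max_to_keep and should only
    # list the three most recent checkpoints (testfile-8 ... testfile-10).
    print(saver.last_checkpoints)

    # With the bug present, however, index files for all ten steps
    # are still on disk.
    print(sorted(glob.glob('testfile-*.index')))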


Is it just me, or is this a bug? What could be causing this misbehavior? Is there a bug report or similar resource where I could find more information?



1 answer


I can reproduce this, and it does appear to be a bug.

However, the problem goes away when saving to a different directory (any directory other than the one containing the .py script):



    save_path = saver.save(sess, 'data/testfile', global_step=i+1)
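
For reference, a minimal standalone version of the question's program with this workaround applied; the subdirectory name 'data' is just an example, any subdirectory seems to work. After it runs, only the three most recent checkpoints should remain:

    import glob
    import os

    import tensorflow as tf

    a = tf.Variable(name='a', initial_value=0)
    addops = a + 1

    saver = tf.train.Saver(max_to_keep=3)

    # Workaround: save checkpoints into a subdirectory instead of the
    # directory containing the script itself.
    checkpoint_dir = 'data'
    if not os.path.exists(checkpoint_dir):
        os.makedirs(checkpoint_dir)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        sess.run(addops)
        saver.save(sess, os.path.join(checkpoint_dir, 'testfile'),
                   global_step=i + 1)
    sess.close()

    # With the workaround, only checkpoints 8, 9, 10 should be left on disk.
    print(sorted(glob.glob(os.path.join(checkpoint_dir, 'testfile-*.index'))))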
