TensorFlow: saving a subset of variables from a previously created model

I created a model (Model A) with a number of variables. I plan to transfer the training from Model A to a new model (Model B) using some of the layers from Model A. However, Model B has the same architecture as Model A, so I cannot simply load all the variables from Model A before running Model B; otherwise there will be naming errors and the like. So I am trying to create a new ckpt file that stores only the weights I want from Model A, and then load this new ckpt file into Model B. I have the following:

import tensorflow as tf

sess = tf.Session()
saver = tf.train.import_meta_graph('ModelA.ckpt.meta')
saver.restore(sess, 'ModelA.ckpt')

# I did not explicitly name my variables in Model A, so I collect the
# trainable variables in a list and pick out the ones I want by position.
store_list = tf.trainable_variables()

# Map the names to use in the new checkpoint to the restored variables.
# (Note: do not run tf.global_variables_initializer() here -- it would
# overwrite the weights that were just restored with fresh random values.)
var_list = {"W_1": store_list[0], "b_1": store_list[1]}
v2_saver = tf.train.Saver(var_list)
v2_saver.save(sess, 'model_A_subset.ckpt')

However, when I restore model_A_subset.ckpt, I still have the variables from ModelA.ckpt. Am I doing something wrong? Is there a way to easily remove the variables I don't want from ModelA.ckpt and use the rest?


1 answer


Are you sure the unneeded variables are in the checkpoint? I ask because before restoring a checkpoint you need to build a graph, and if you build a graph that contains all of Model A's variables, you will have this problem.
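As a minimal sketch of that restore side (TF 1.x graph mode; the variable shapes below are hypothetical, since the question does not give them), you would rebuild only the variables you want and restore the subset checkpoint into that fresh graph:

import tensorflow as tf

tf.reset_default_graph()

# Recreate only the variables you intend to restore. The shapes are
# placeholder assumptions -- substitute Model B's actual layer shapes.
W_1 = tf.get_variable("W_1", shape=[784, 128])
b_1 = tf.get_variable("b_1", shape=[128])

# A Saver built from an explicit var_list only knows about these two
# variables, so nothing else from Model A appears in this graph.
restorer = tf.train.Saver({"W_1": W_1, "b_1": b_1})

with tf.Session() as sess:
    restorer.restore(sess, 'model_A_subset.ckpt')
    # W_1 and b_1 now hold the values saved from Model A; any other
    # variables in Model B still need their own initializer run.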



To inspect the checkpoint and see what is really stored in it, try TensorFlow's inspect_checkpoint tool.
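For example (a sketch; the checkpoint path is the one saved in the question):

from tensorflow.python.tools import inspect_checkpoint as chkp

# Print every tensor name stored in the checkpoint, with its values.
chkp.print_tensors_in_checkpoint_file('model_A_subset.ckpt',
                                      tensor_name='', all_tensors=True)

If you only need the names and shapes, tf.train.list_variables('model_A_subset.ckpt') returns them without printing the values.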
