Adding lower layers to TensorFlow models

I am trying to develop a transfer learning algorithm: I take some trained neural networks and add layers to them. I am using TensorFlow and Python.

It seems quite common to reuse existing graphs in TensorFlow: you import a graph, for example using MetaGraphs, and then add new top layers by appending nodes. For example, I found this code here:

vgg_saver = tf.train.import_meta_graph(dir + '/vgg/results/vgg-16.meta')
# Access the graph
vgg_graph = tf.get_default_graph()

# Retrieve VGG inputs
self.x_plh = vgg_graph.get_tensor_by_name('input:0')

# Choose some node
output_conv = vgg_graph.get_tensor_by_name('conv1_2:0')

# Build further operations
output_conv_shape = output_conv.get_shape().as_list()
W1 = tf.get_variable('W1', shape=[1, 1, output_conv_shape[3], 32],
                     initializer=tf.random_normal_initializer(stddev=1e-1))
b1 = tf.get_variable('b1', shape=[32], initializer=tf.constant_initializer(0.1))
z1 = tf.nn.conv2d(output_conv, W1, strides=[1, 1, 1, 1], padding='SAME') + b1
a = tf.nn.relu(z1)


Then during training you use your new layers together with everything below them. You can also freeze some layers, import only certain trainable variables into a session, and so on.

However, in my approach I need to add new low-level layers between the input and the first layer, and then train my layers together with the ones above them. So I can't just append nodes at the end of the graph: I need to insert the new nodes immediately after the input.

So far, I haven't found a convenient way to do this with TensorFlow. Do you have any ideas? Or is it just not possible?

Thanks in advance.

+3




1 answer


You can't insert layers between the layers of an existing graph, but you can re-import the graph with some rewiring along the way. As Pietro Tortella pointed out, the approach in "Tensorflow: how to replace a node in a calculation graph?" should work. Here's an example:

import tensorflow as tf

with tf.Graph().as_default() as g1:
    input1 = tf.placeholder(dtype=tf.float32, name="input_1")
    l1 = tf.multiply(input1, tf.constant(2.0), name="mult_1")
    l2 = tf.multiply(l1, tf.constant(3.0), name="mult_2")

g1_def = g1.as_graph_def()

with tf.Graph().as_default() as new_g:
    new_input = tf.placeholder(dtype=tf.float32, name="new_input")
    op_to_insert = tf.add(new_input, tf.constant(4.0), name="inserted_op")
    # Re-import the old graph, rewiring its input to the new op.
    # "mult_2:0" names the tensor (op name plus output index);
    # a bare "mult_2" would return the Operation instead.
    mult_2, = tf.import_graph_def(g1_def,
                                  input_map={"input_1": op_to_insert},
                                  return_elements=["mult_2:0"])


The original graph looks like this, and the imported graph looks like this.



If you want to use tf.train.import_meta_graph, you can still pass the input_map={"input_1": op_to_insert} kwarg. It will be forwarded to import_graph_def.

+2








