How to get CNN kernel values in TensorFlow

I am using the following code to create CNN layers.

conv1 = tf.layers.conv2d(inputs = input, filters = 20, kernel_size = [3,3],
    padding = "same", activation = tf.nn.relu)
and I want to get the values of all the kernels after training. It doesn't work when I just do

kernels = conv1.kernel

So how do I get the values of these kernels? I'm also not sure which variables and methods conv2d creates, since the TensorFlow documentation doesn't really describe the internals of the conv2d layer.



2 answers


You can find all the variables in the list returned by tf.global_variables() and easily pick out the one you need.
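For example, the variables can be picked out of that list by name prefix. A minimal sketch of the filtering step, using plain strings as stand-ins for the .name attribute of the tf.Variable objects that tf.global_variables() returns (the names here are hypothetical, for illustration):

```python
# Stand-ins for the .name values of the tf.Variable objects that
# tf.global_variables() would return (hypothetical names for illustration).
all_names = ["conv1/kernel:0", "conv1/bias:0", "dense/kernel:0", "dense/bias:0"]

# Keep only the variables that live under the "conv1" scope.
conv1_names = [n for n in all_names if n.startswith("conv1/")]
print(conv1_names)  # ['conv1/kernel:0', 'conv1/bias:0']
```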

If you want to get these variables by name, declare the layer as:

conv_layer_1 = tf.layers.conv2d(activation=tf.nn.relu, 
                                filters=10, 
                                inputs=input_placeholder, 
                                kernel_size=(3, 3), 
                                name="conv1",         # NOTE THE NAME 
                                padding="same", 
                                strides=(1, 1))

Then get a handle to the default graph:

gr = tf.get_default_graph()



After training, retrieve the kernel values as:

conv1_kernel_val = gr.get_tensor_by_name('conv1/kernel:0').eval()

And retrieve the bias values as:

conv1_bias_val = gr.get_tensor_by_name('conv1/bias:0').eval()
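The strings passed to get_tensor_by_name follow a simple convention: the layer's scope name, a slash, the variable name, and an output index. A small sketch of how those names are put together (the name format is TensorFlow's real convention; the helper function itself is just illustrative):

```python
def tensor_name(scope, var, output_index=0):
    # TensorFlow names a layer's variables "<scope>/<variable>"; the ":<i>"
    # suffix selects the i-th output of the producing op (almost always 0).
    return "%s/%s:%d" % (scope, var, output_index)

print(tensor_name("conv1", "kernel"))  # conv1/kernel:0
print(tensor_name("conv1", "bias"))    # conv1/bias:0
```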



So you want to get the weight values for the conv1 layer.

With tf.layers.conv2d you haven't defined the weight variables yourself, so you don't hold a reference to them. When I create a convolutional layer, I use a helper function that does all the necessary steps; here is the function I use to create each of my convolutional layers:

def _conv_layer(self, name, in_channels, filters, kernel, input_tensor, strides, dtype=tf.float32):
    # Assumes "import tensorflow as tf" and "import numpy as np" at module level.
    with tf.variable_scope(name):
        w = tf.get_variable("w", shape=[kernel, kernel, in_channels, filters],
                            initializer=tf.contrib.layers.xavier_initializer_conv2d(), dtype=dtype)
        b = tf.get_variable("b", shape=[filters], initializer=tf.constant_initializer(0.0), dtype=dtype)
        c = tf.nn.conv2d(input_tensor, w, strides, padding='SAME', name=name + "_c")
        a = tf.nn.relu(c + b, name=name + "_a")
        print(name + "_a", a.get_shape().as_list(), name + "_w", w.get_shape().as_list(),
              "params", np.prod(w.get_shape().as_list()) + filters)  # weights + biases
        return a, w.get_shape().as_list()

This is how I use it to define 5 convolutional layers. This example is straight from my code, so note that it is just 5 convolutional layers stacked in a row, with no max pooling or anything else, each with a stride of 2 and a 5x5 kernel:



    conv1_a, _ = self._conv_layer("conv1", 3,     24, 5, self.imgs4d, [1, 2, 2, 1])   # 24.8 MiB/feature -> 540 x 960
    conv2_a, _ = self._conv_layer("conv2", 24,    80, 5,     conv1_a, [1, 2, 2, 1])   #  6.2 MiB         -> 270 x 480
    conv3_a, _ = self._conv_layer("conv3", 80,   256, 5,     conv2_a, [1, 2, 2, 1])   #  1.5 MiB         -> 135 x 240
    conv4_a, _ = self._conv_layer("conv4", 256,  750, 5,     conv3_a, [1, 2, 2, 1])   #  0.4 MiB         ->  68 x 120
    conv5_a, _ = self._conv_layer("conv5", 750, 2048, 5,     conv4_a, [1, 2, 2, 1])   #  0.1 MiB         ->  34 x  60
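The helper above also prints a parameter count for each layer; that count follows directly from the kernel shape. A small arithmetic sketch (plain Python, no TensorFlow needed; conv_param_count is just an illustrative name):

```python
def conv_param_count(kernel, in_channels, filters):
    # Weights: kernel * kernel * in_channels * filters, plus one bias per filter.
    return kernel * kernel * in_channels * filters + filters

print(conv_param_count(5, 3, 24))   # conv1 above: 1824 parameters
print(conv_param_count(5, 24, 80))  # conv2 above: 48080 parameters
```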

There is also a nice tutorial on the TensorFlow website on how to set up a convolutional network:

https://www.tensorflow.org/tutorials/deep_cnn

The direct answer to your question is that the weights for the convolutional layer are defined there as w; that is the tensor you are asking about, if I understand you correctly.
