Keras Backend Modeling Problem

I have a problem with my model declaration. My inputs are x_input and y_input and my output is predictions, as shown below:

model = Model(inputs=[x_input, y_input], outputs=predictions)


My inputs (x, y) are each embedded, then the two embeddings are matrix-multiplied (MatMul) together, as shown below:

from keras.layers import Input, Embedding, Dense
from keras.models import Model

# Build X branch
x_input = Input(shape=(maxlen_x,), dtype='int32')
x_embed = Embedding(maxvocab_x + 1, 16, input_length=maxlen_x)
XE = x_embed(x_input)
# Result: Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
# where 31 happens to be my maxlen_x


Similarly for the y ...

# Build Y branch
y_input = Input(shape=(maxlen_y,), dtype='int32')
y_embed = Embedding(maxvocab_y + 1, 16, input_length=maxlen_y)
YE = y_embed(y_input)
# Result: Tensor("embedding_2/Gather:0", shape=(?, 13, 16), dtype=float32)
# where 13 happens to be my maxlen_y


Then I take a batch dot product of the two (pairing the corresponding instance from each input):

from keras import backend as K
dot_merged = K.batch_dot(XE, YE, axes=[2, 2])  # contract over axis 2 (the embedding dimension) of both inputs
# Result: Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)


Then I flattened the last two dimensions of the tensor.

import numpy as np
dim = np.prod(list(dot_merged.shape)[1:])  # 31 * 13 = 403
flattened = K.reshape(dot_merged, (-1, int(dim)))


I ended up feeding this flattened data into a simple logistic regression.

predictions = Dense(1, activation='sigmoid')(flattened)


And predictions is, of course, the output of the model.

Here is the tensor shape output at each step:

Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
Tensor("embedding_2/Gather:0", shape=(?, 13, 16), dtype=float32)
Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)
Tensor("Reshape:0", shape=(?, 403), dtype=float32)
Tensor("dense_1/Sigmoid:0", shape=(?, 1), dtype=float32)


Specifically, I am getting the following error.

Traceback (most recent call last):
  File "Model.py", line 53, in <module>
    model = Model(inputs = [dx_input, rx_input], outputs = [predictions] )
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1705, in __init__
    build_map_of_graph(x, finished_nodes, nodes_in_progress)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1695, in build_map_of_graph
    layer, node_index, tensor_index)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1665, in build_map_of_graph
    layer, node_index, tensor_index = tensor._keras_history
AttributeError: 'Tensor' object has no attribute '_keras_history'


So, where am I wrong? Thanks for any help in advance!

-Anthony



1 answer


Have you tried wrapping the backend functions in a Lambda layer? The Keras layer __call__() method performs bookkeeping that the functional Model needs (it is what attaches the _keras_history metadata to the output tensors), and that bookkeeping does not happen when you call the backend functions directly.
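For example, here is a minimal, untested sketch that reuses the XE, YE, x_input and y_input tensors from your question (with K.batch_flatten standing in for the manual K.reshape):

from keras.layers import Dense, Lambda
from keras.models import Model
from keras import backend as K

# Wrap each backend call in a Lambda layer so Keras can track it in the graph
dot_merged = Lambda(lambda t: K.batch_dot(t[0], t[1], axes=[2, 2]))([XE, YE])
flattened = Lambda(K.batch_flatten)(dot_merged)  # flattens all axes except the batch axis
predictions = Dense(1, activation='sigmoid')(flattened)

model = Model(inputs=[x_input, y_input], outputs=predictions)

Every tensor reaching Model is then produced by a Keras layer and carries the _keras_history attribute, so the graph can be traced from the outputs back to the inputs.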


