Tags: keras, python

Output tensors to a Model must be the output of a Keras `Layer` (thus holding past layer metadata).

Posted on 2020-04-05 23:34:29

I am trying to do a simple CNN-LSTM classification with TimeDistributed, but I am getting the following error: Output tensors to a Model must be the output of a Keras Layer (thus holding past layer metadata). Found:

My samples are grayscale images with 366 channels and a 5x5 size; each sample has its own unique label.
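For concreteness, data matching that description would have the shapes below. This is a hypothetical sketch with made-up sample counts (`num_samples`, the random values, and the 6-class label range are assumptions inferred from the `Dense(6)` output layer, not stated by the questioner):

```python
import numpy as np

# Hypothetical dummy data: 8 samples, each a 366-channel 5x5 "image",
# with one integer class label per sample (6 classes assumed from Dense(6)).
num_samples = 8
X = np.random.rand(num_samples, 366, 5, 5).astype('float32')
y = np.random.randint(0, 6, size=(num_samples,))
```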

model_input = Input(shape=(366,5,5))

model = TimeDistributed(Conv2D(64, (3, 3), activation='relu', padding='same',data_format='channels_first')(model_input))
model = TimeDistributed(MaxPooling2D((2, 2),padding='same',data_format='channels_first'))

model = TimeDistributed(Conv2D(128, (3,3), activation='relu',padding='same',data_format='channels_first'))
model = TimeDistributed(MaxPooling2D((2, 2), strides=(2, 2),padding='same',data_format='channels_first'))


model = Flatten()

model = LSTM(256, return_sequences=False, dropout=0.5)
model =  Dense(128, activation='relu')


model = Dense(6, activation='softmax')

cnnlstm = Model(model_input, model)
cnnlstm.compile(optimizer='adamax',
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])
cnnlstm.summary()
Questioner: MBS
Viewed: 66
Matias Valdenegro 2020-02-01 01:24

You have to pass tensors between the layers; this is how the Functional API works. For every layer, use the Layer(params...)(input) notation:

# Imports added for completeness (tf.keras; standalone keras works the same way)
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D,
                                     TimeDistributed, Flatten, LSTM, Dense)
from tensorflow.keras.models import Model

model_input = Input(shape=(366,5,5))

model = TimeDistributed(Conv2D(64, (3, 3), activation='relu', padding='same',data_format='channels_first'))(model_input)
model = TimeDistributed(MaxPooling2D((2, 2),padding='same',data_format='channels_first'))(model)

model = TimeDistributed(Conv2D(128, (3,3), activation='relu',padding='same',data_format='channels_first'))(model)
model = TimeDistributed(MaxPooling2D((2, 2), strides=(2, 2),padding='same',data_format='channels_first'))(model)


model = TimeDistributed(Flatten())(model)

model = LSTM(256, return_sequences=False, dropout=0.5)(model)
model =  Dense(128, activation='relu')(model)


model = Dense(6, activation='softmax')(model)

cnnlstm = Model(model_input, model)

Note that I have also corrected the first TimeDistributed layer: in your code the input tensor was applied in the wrong place, to the inner Conv2D inside the TimeDistributed wrapper rather than to the TimeDistributed layer itself.
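To isolate the call pattern the answer relies on, here is a minimal sketch of the Functional API with a trivial Dense-only model (the layer sizes and the 10-feature input are illustrative assumptions, not from the question):

```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(10,))                    # symbolic input tensor
x = Dense(32, activation='relu')(inp)       # Layer(params...)(input) returns a tensor
out = Dense(6, activation='softmax')(x)     # each layer is called on the previous tensor

model = Model(inp, out)                     # Model takes Keras tensors, not bare layers
```

Writing `Dense(32, activation='relu')` alone only constructs the layer object; calling it on a tensor is what produces an output tensor carrying the layer metadata that `Model` requires, which is exactly what the question's code was missing.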