My Keras model isn't learning anything and I can't figure out why. I even reduced the training set to 5 elements, and the model still doesn't fit the training data.
[screenshot: loss function visualized with TensorBoard]
Here is my code:
import keras
from keras.models import Sequential
from keras.layers import Conv1D, Dense, Flatten

model = Sequential()
model.add(Conv1D(30, kernel_size=3, activation='relu', input_shape=(50, 1)))
model.add(Conv1D(40, kernel_size=3, activation='relu'))
model.add(Conv1D(120, kernel_size=3, activation='relu'))
model.add(Flatten())
model.add(Dense(1024, activation='relu'))
model.add(Dense(256, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='relu'))
model.summary()
model.compile(loss='mse',
              optimizer=keras.optimizers.Adam())
train_limit = 5
batch_size = 4096
tb = keras.callbacks.TensorBoard(log_dir='./logs/' + run_name + '/',
histogram_freq=0, write_images=False)
tb.set_model(model)
model.fit(X_train[:train_limit], y_train[:train_limit],
          batch_size=batch_size,
          epochs=10**4,
          verbose=0,
          validation_data=(X_val[:train_limit], y_val[:train_limit]),
          callbacks=[tb])
score = model.evaluate(X_test, y_test, verbose=0)
# only the loss is returned, since compile() was given no metrics
print('Test loss:', score)
Any help is greatly appreciated!
Best answer: This looks like a regression problem. One thing I noticed is that your last layer still has a ReLU activation. I suggest removing the ReLU from that last layer.
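To see why that final ReLU hurts a regression model: a ReLU output can never go below zero, and its gradient is zero whenever the pre-activation is negative, so the output unit receives no learning signal for those samples and can get stuck predicting 0. A minimal NumPy sketch (the `relu` helper and sample values here are just for illustration):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Hypothetical pre-activations of the final Dense(1) unit for four samples.
z = np.array([-2.0, -0.5, 0.0, 1.5])

# ReLU clamps every negative prediction to zero...
print(relu(z))                 # [0.  0.  0.  1.5]

# ...and its derivative is zero there, so backprop delivers no
# gradient for those samples and the unit can stay stuck at 0.
dz = (z > 0).astype(float)     # d relu / dz
print(dz)                      # [0. 0. 0. 1.]
```

With a linear output instead, i.e. `model.add(Dense(1))` with no activation argument, the layer can predict any real value and always passes a gradient through.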