How to show training loss and validation loss on the same graph in TensorBoard via Keras?
I am using Keras with the TensorFlow backend for CNN training, and I am using TensorBoard to visualize the loss and accuracy curves. I would like to see the loss of both the training data and the validation data on the same graph, but I have only found ways to do this when using TensorFlow directly, not through Keras.
Is there a way to do this?
Edit 1: I tried writing loss / acc in the regex box, but instead of showing both curves on the same graph, it shows them side by side: http://imgur.com/a/oLIcL
I've added the code I am using to log to TensorBoard:
tbCallBack = keras.callbacks.TensorBoard(log_dir='C:\\logs', histogram_freq=0,
                                         write_graph=False, write_images=True,
                                         embeddings_freq=0, embeddings_layer_names=None,
                                         embeddings_metadata=None)
model.fit_generator(train_generator,
                    steps_per_epoch=x_train.shape[0] // batch_size,
                    epochs=epochs,
                    validation_data=(x_test, y_test),
                    callbacks=[tbCallBack])  # the callback must be passed, or nothing is logged
You can add a regex to the text box in the top left corner of the TensorBoard window. Enter acc to filter for the training and validation accuracy curves, or loss to filter for the loss curves. This works for me with both Keras and TensorFlow.

Got it from this good TensorBoard tutorial: https://www.youtube.com/watch?v=eBbEDRsCmv4
As for code, I use this:
from datetime import datetime
from keras.callbacks import TensorBoard

now = datetime.now()
logdir = "_tf_logs/" + now.strftime("%Y%m%d-%H%M%S") + "/"  # one folder per run
tb = TensorBoard(log_dir=logdir)
callbacks = [tb]
...
model.fit(X_train, Y_train, validation_data=val_data, epochs=10, verbose=2,
          callbacks=callbacks)
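For reference, here is a minimal, self-contained sketch of the same setup. The tiny model, random data, and log directory name are all illustrative stand-ins, not part of the original question; the point is only that passing validation_data to fit makes Keras record val_loss alongside loss, so both curves show up in TensorBoard and can be filtered with the regex described above.

```python
# Minimal sketch (assumes TensorFlow with its bundled Keras API).
import os
import numpy as np
from tensorflow import keras

logdir = os.path.join("_tf_logs", "run1")  # hypothetical run name
tb = keras.callbacks.TensorBoard(log_dir=logdir)

# Tiny stand-in model and data, just to produce loss/val_loss scalars.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x_train = np.random.rand(32, 4).astype("float32")
y_train = np.random.rand(32, 1).astype("float32")
x_val = np.random.rand(8, 4).astype("float32")
y_val = np.random.rand(8, 1).astype("float32")

# validation_data makes Keras log val_loss alongside loss each epoch.
history = model.fit(x_train, y_train,
                    validation_data=(x_val, y_val),
                    epochs=2, verbose=0, callbacks=[tb])
```

After training, run tensorboard pointed at the log directory and use the loss regex to see both curves.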