Reputation: 23
I'm using the TensorFlow Object Detection API for detection and localization of a single object class in images. For this purpose, I use the pre-trained faster_rcnn_resnet50_coco_2018_01_28 model.
I want to detect under/overfitting after training the model. I can see the training loss, but after evaluation TensorBoard only shows mAP and precision metrics and no loss.
Is it possible to plot the validation loss on TensorBoard too?
Upvotes: 2
Views: 4094
Reputation: 11
Using model_main.py for training gives two curves in TensorBoard. They are supposed to be the training and validation losses.
You can use the following command at CMD:
python object_detection/model_main.py --num_eval_steps=10 --num_train_steps=50000 --alsologtostderr --pipeline_config_path=C:/DroneMaskRCNN/DroneMaskRCNN1/mask_rcnn_inception_v2_coco.config --model_dir=C:/DroneMaskRCNN/DroneMaskRCNN1/CP
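To view both loss curves once the job is running, point TensorBoard at the model directory. A minimal sketch, assuming the model_dir path from the command above (adjust it to your own setup):
tensorboard --logdir=C:/DroneMaskRCNN/DroneMaskRCNN1/CP
The training and validation curves then appear together under the Scalars tab.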
Upvotes: 1
Reputation: 41
To see the validation curve, you should change faster_rcnn_resnet50_coco.config:
1- Comment out the max_evals line.
2- Set eval_interval_secs: 60.
3- num_examples should be equal to or less than the number of examples you have in val.record.
eval_config: {
  num_examples: 600
  eval_interval_secs: 60
  # Note: The below line limits the evaluation process to 10 evaluations.
  # Remove the below line to evaluate indefinitely.
  # max_evals: 10
}
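For reference, num_examples counts against the records in the file pointed to by the eval_input_reader section of the same pipeline config. A minimal sketch of that section, with placeholder paths (substitute your own val.record and label map):
eval_input_reader: {
  tf_record_input_reader {
    input_path: "path/to/val.record"
  }
  label_map_path: "path/to/label_map.pbtxt"
  shuffle: false
  num_readers: 1
}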
Upvotes: 4
Reputation: 1912
There is a validation loss. Assuming you're using the latest API, the curve under "loss" is the validation loss, while "loss_1/2" is the training loss.
Upvotes: 7