Reputation: 2431
I'm trying to visualize multiple metrics while using only one as the objective. I see how you can define 'custom' metrics using 'MetricDefinitions' under 'AlgorithmSpecification', but what if we just want to record more of the following in CloudWatch as our hyperparameter tuning job progresses:
validation:accuracy
validation:auc
validation:error
validation:logloss
validation:mse
There are more, of course, and I realize the exact metrics might vary depending on whether it's a classification or regression problem.
The larger question is: how do we specify the recording/logging of more of these metrics when using a standard container like the one for XGBoost?
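For context, here is a minimal sketch of the relevant fragment of a `CreateHyperParameterTuningJob` request for the built-in XGBoost container. The idea (an assumption worth verifying against the algorithm docs) is that the container decides which `validation:*` streams it emits from its `eval_metric` hyperparameter, while the tuning objective just picks one of them; all hyperparameter names and values below are illustrative, not taken from a real job.

```python
# Hedged sketch: fragment of a CreateHyperParameterTuningJob request for the
# built-in XGBoost container. All names/values are illustrative assumptions.
training_job_definition = {
    "StaticHyperParameters": {
        "objective": "binary:logistic",
        # The built-in container emits a validation:<name> CloudWatch metric
        # for each metric it evaluates; eval_metric controls which ones.
        "eval_metric": "auc",
        "num_round": "100",
    },
    # For built-in algorithms the metric definitions are predefined, so no
    # custom AlgorithmSpecification.MetricDefinitions entry is needed here.
}

tuning_job_config = {
    "HyperParameterTuningJobObjective": {
        "Type": "Maximize",
        # Only ONE metric can be the tuning objective; the others still
        # appear in CloudWatch under the same training jobs.
        "MetricName": "validation:auc",
    },
}

# Sanity check: the objective must be among the metrics the container emits.
emitted = {
    "validation:" + m
    for m in training_job_definition["StaticHyperParameters"]["eval_metric"].split(",")
}
assert tuning_job_config["HyperParameterTuningJobObjective"]["MetricName"] in emitted
```

Whether the built-in container accepts a comma-separated `eval_metric` list is something to check against the SageMaker XGBoost documentation; the sanity check above is just to illustrate the objective/emitted-metric relationship.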
Upvotes: 0
Views: 552
Reputation: 484
You can monitor a set of metrics for each problem type by using SageMaker Model Monitor [ Reference ]. If you need to monitor any additional metrics beyond the default ones, you can use the BYOC (bring your own container) approach.
Some examples of building a BYOC can be found here.
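If you do go the BYOC route, the training container only needs to print its metrics to stdout/stderr; the job's `MetricDefinitions` regexes then pull them into CloudWatch. A minimal sketch of that mechanism follows, where the log format and regexes are illustrative assumptions that you would adapt to whatever your container actually prints:

```python
import re

# Hedged sketch of custom MetricDefinitions for a BYOC training image.
# The log line format below is an assumption; write regexes that match
# your container's real output.
metric_definitions = [
    {"Name": "validation:accuracy", "Regex": r"val_accuracy=([0-9\.]+)"},
    {"Name": "validation:logloss", "Regex": r"val_logloss=([0-9\.]+)"},
]

# Simulate one line of training output and check the regexes capture it,
# the same way SageMaker scrapes the job's logs for metric values.
log_line = "epoch=3 val_accuracy=0.912 val_logloss=0.233"
captured = {
    d["Name"]: float(re.search(d["Regex"], log_line).group(1))
    for d in metric_definitions
}
# captured -> {'validation:accuracy': 0.912, 'validation:logloss': 0.233}
```

Each regex must contain exactly one capture group for the numeric value; any metric defined this way can also be named as the tuning objective.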
Upvotes: 1