Reputation: 5745
I want to tune a hyperparameter in a slightly modified DNNClassifier. The tuning job runs and succeeds, but the output does not show the final metric for each trial. This is what the final output looks like:
{
  "completedTrialCount": "2",
  "trials": [
    {
      "trialId": "1",
      "hyperparameters": {
        "myparam": "0.003"
      }
    },
    {
      "trialId": "2",
      "hyperparameters": {
        "myparam": "0.07"
      }
    }
  ],
  "consumedMLUnits": 1.48,
  "isHyperparameterTuningJob": true
}
How do I get the final metric for each trial so that I can decide which value is best?
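For reference, from the docs I would expect each trial to also carry a finalMetric block once the metric is reported, roughly like this (values here are only illustrative):
{
  "trialId": "1",
  "hyperparameters": {
    "myparam": "0.003"
  },
  "finalMetric": {
    "trainingStep": "1000",
    "objectiveValue": 0.95
  }
}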
My code looks like this.
My DNNClassifier:
classifier = DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=hu,
    optimizer=tf.train.AdamOptimizer(learning_rate=lr),
    activation_fn=tf.nn.leaky_relu,
    dropout=dr,
    n_classes=2,
    config=self.get_run_config(),
    model_dir=self.model_dir,
    weight_column=weight_column
)
tf.contrib.estimator.add_metrics(classifier, compute_metrics)
def compute_metrics(labels, predictions):
    return {'my-roc-auc': tf.metrics.auc(labels, predictions)}
The hyperparameter spec is as follows.
trainingInput:
  hyperparameters:
    hyperparameterMetricTag: my-roc-auc
    maxTrials: 2
    enableTrialEarlyStopping: True
    params:
      - parameterName: myparam
        type: DISCRETE
        discreteValues:
          - 0.0001
          - 0.0005
          - 0.001
          - 0.003
          - 0.005
          - 0.007
          - 0.01
          - 0.03
          - 0.05
          - 0.07
          - 0.1
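For completeness, the tuning service passes each trial's value to the trainer as a --myparam command-line flag, which the trainer parses along these lines (a sketch; wiring the value to lr is just an illustration, the actual mapping is specific to my trainer code):
import argparse

parser = argparse.ArgumentParser()
# Each trial's value arrives as --myparam=<value> on the command line.
parser.add_argument('--myparam', type=float, default=0.001)
args, _ = parser.parse_known_args()

lr = args.myparam  # e.g. fed into tf.train.AdamOptimizer(learning_rate=lr)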
I mostly followed the instructions here.
Upvotes: 1
Views: 141
Reputation: 5745
Fixed it. The issue was this line:
tf.contrib.estimator.add_metrics(classifier, compute_metrics)
It should have been:
classifier = tf.contrib.estimator.add_metrics(classifier, compute_metrics)
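add_metrics does not modify the estimator in place; it returns a new estimator that wraps the original one, so the result has to be assigned back before training. For anyone hitting the same thing, a minimal sketch of the corrected wiring (feature_columns, model_dir and the train/eval specs are assumed to exist as in the question; the 'logistic' key is what a binary DNNClassifier exposes for the positive-class probability):
import tensorflow as tf

def compute_metrics(labels, predictions):
    # predictions is the estimator's full predictions dict; for a binary
    # DNNClassifier the positive-class probability sits under 'logistic'.
    return {'my-roc-auc': tf.metrics.auc(labels, predictions['logistic'])}

classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[64, 32],
    n_classes=2,
    model_dir=model_dir)

# add_metrics returns a NEW estimator; assign it back, otherwise the extra
# metric is silently dropped and never reaches the tuning service.
classifier = tf.contrib.estimator.add_metrics(classifier, compute_metrics)

# The metric is written to the evaluation summaries here, which is where the
# tuning job picks up the value named by hyperparameterMetricTag.
tf.estimator.train_and_evaluate(classifier, train_spec, eval_spec)
With that in place, each trial in the job output gets a finalMetric entry and the trials can be ranked by my-roc-auc.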
Upvotes: 1