Reputation: 391
Does anyone know how a hyperparameter tuning job works in AWS SageMaker?
Specifically, I am trying to understand the following:
My question is: when we define the hyperparameters in the HyperParameterTuner class, do they get copied into /opt/ml/input/config/hyperparameters.json?
If so, should one adjust the training image so that it uses the hyperparameters from /opt/ml/input/config/hyperparameters.json?
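For reference, a tuner is typically defined roughly like this with the SageMaker Python SDK (a minimal sketch; the image URI, role, parameter names, and metric regex are placeholders, and argument names vary between SDK versions):

from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

# Generic estimator pointing at a custom training image (placeholders below).
estimator = Estimator(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/<your-training-image>",
    role="<your-sagemaker-execution-role>",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    hyperparameters={"batch-size": "32"},  # static hyperparameters
)

# The tuner samples values from the ranges below and passes them to each training job.
tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",
    metric_definitions=[{"Name": "validation:accuracy",
                         "Regex": "accuracy=([0-9\\.]+)"}],
    hyperparameter_ranges={"learning-rate": ContinuousParameter(0.01, 0.3)},
    max_jobs=10,
    max_parallel_jobs=2,
)

tuner.fit({"train": "s3://<bucket>/<prefix>/train"})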
Edit: I've looked into some of the sample HPO notebooks that AWS provides, and they only confuse me more. Sometimes they use argparse to pass in the hyperparameters. How are those passed into the training code?
Upvotes: 1
Views: 1031
Reputation: 266
So I finally figured it out; I had it wrong all along.
The file /opt/ml/input/config/hyperparameters.json is there. It just has slightly different content compared to a regular training job: the parameters being tuned, the static parameters, and the tuning metric name are all contained in it.
Here is the structure, I hope it helps:
{
    "_tuning_objective_metric": "your-metric",
    "dynamic-param1": "0.3",
    "dynamic-param2": "1",
    "static-param1": "some-value",
    "static-paramN": "another-value"
}
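For completeness, here is a minimal sketch of reading that file from inside the training container (the parameter names are the placeholder ones from the structure above; note that SageMaker stores every value as a string, so cast them yourself):

import json

# Path where SageMaker mounts the hyperparameters inside the container;
# it is the same for regular training jobs and tuning jobs.
HYPERPARAM_PATH = "/opt/ml/input/config/hyperparameters.json"

with open(HYPERPARAM_PATH) as f:
    hyperparams = json.load(f)

# All values arrive as strings, so cast them to the types your code expects.
dynamic_param1 = float(hyperparams["dynamic-param1"])
dynamic_param2 = int(hyperparams["dynamic-param2"])
static_param1 = hyperparams["static-param1"]

# Injected by the tuning job; a regular training job does not add this key.
objective_metric = hyperparams.get("_tuning_objective_metric")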
Upvotes: 2
Reputation: 5568
If you bring your own container, you should consider pip installing the SageMaker Training Toolkit. This allows your training script to receive hyperparameters as command-line arguments (to be processed with argparse), which saves you from having to read and parse /opt/ml/input/config/hyperparameters.json yourself.
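For example, with the toolkit installed the entry point is invoked with the hyperparameters as --key value arguments, so a plain argparse parser is enough (a minimal sketch; the parameter names are placeholders):

import argparse

parser = argparse.ArgumentParser()
# Hyperparameter names map to command-line flags (e.g. learning-rate -> --learning-rate).
parser.add_argument("--learning-rate", type=float, default=0.1)
parser.add_argument("--batch-size", type=int, default=32)
# parse_known_args keeps the script tolerant of any extra arguments the toolkit may pass.
args, _ = parser.parse_known_args()

print(f"learning_rate={args.learning_rate}, batch_size={args.batch_size}")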
Upvotes: 0