How to include hyperparameter tuning in a TFX pipeline?

A TFX pipeline is a really good tool for quick end-to-end model development. However, I'd also like to include hyperparameter tuning before the final model training and evaluation.

My question is whether there is a best practice for including tuning in the pipeline and, if so, whether it is publicly available.

Upvotes: 0

Views: 671

Answers (1)

user11530462

There is no built-in component available in TFMA or TFX yet for hyperparameter tuning. However, there are built-in libraries available in TensorFlow. To my knowledge, there are two ways to do it.

  1. Hyperparameter tuning and its visualization in TensorBoard for TF 2.0, as mentioned by greeness above.

A partial code snippet is shown below; a sketch of the full trial loop follows the link.

# Imports needed for the snippet
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Define the hyperparameters to sweep over
HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.2))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))

METRIC_ACCURACY = 'accuracy'

# Register the hyperparameters and metric with TensorBoard's HParams dashboard
with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
  hp.hparams_config(
      hparams=[HP_NUM_UNITS, HP_DROPOUT, HP_OPTIMIZER],
      metrics=[hp.Metric(METRIC_ACCURACY, display_name='Accuracy')])

Refer to this link for more details: https://www.tensorflow.org/tensorboard/r2/hyperparameter_tuning_with_hparams
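To actually log per-trial results into that dashboard, each run records its hyperparameter values with hp.hparams() plus the resulting metric. Here is a minimal trial-loop sketch following the linked tutorial, assuming a hypothetical train_test_model(hparams) function that builds, trains, and evaluates a model and returns its accuracy:

def run(run_dir, hparams):
  # Log one trial: the hyperparameter values and the resulting accuracy
  with tf.summary.create_file_writer(run_dir).as_default():
    hp.hparams(hparams)                    # record the values used in this trial
    accuracy = train_test_model(hparams)   # hypothetical train/eval function
    tf.summary.scalar(METRIC_ACCURACY, accuracy, step=1)

session_num = 0
for num_units in HP_NUM_UNITS.domain.values:
  for dropout_rate in (HP_DROPOUT.domain.min_value, HP_DROPOUT.domain.max_value):
    for optimizer in HP_OPTIMIZER.domain.values:
      hparams = {HP_NUM_UNITS: num_units,
                 HP_DROPOUT: dropout_rate,
                 HP_OPTIMIZER: optimizer}
      run('logs/hparam_tuning/run-%d' % session_num, hparams)
      session_num += 1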

  2. Hyperparameter tuning using tf.estimator.Estimator. We can set the params argument of the Estimator to a dictionary whose keys are the hyperparameter names and whose values are their respective values; the model_fn then reads them from params. Refer to the links below for more information (a minimal sketch follows them).

https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator#init

and

https://github.com/tensorflow/tensorflow/blob/1bf6646b871d0ce601715f8ed2f50430ca504da7/tensorflow/contrib/training/python/training/hparam.py#L310
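As an illustration, here is a minimal sketch of a custom Estimator whose model_fn reads its hyperparameters from the params dictionary. The model_fn, layer sizes, and hyperparameter values are placeholders of my own, the input_fn is left out, and this assumes an older TF version where tf.estimator is still available:

import tensorflow as tf

def model_fn(features, labels, mode, params):
  # Read hyperparameters from the `params` dict passed to the Estimator
  hidden = tf.keras.layers.Dense(params['num_units'], activation='relu')(features['x'])
  logits = tf.keras.layers.Dense(1)(hidden)
  predictions = tf.squeeze(logits, axis=-1)

  if mode == tf.estimator.ModeKeys.PREDICT:
    return tf.estimator.EstimatorSpec(mode, predictions=predictions)

  loss = tf.compat.v1.losses.mean_squared_error(labels, predictions)
  optimizer = tf.compat.v1.train.GradientDescentOptimizer(params['learning_rate'])
  train_op = optimizer.minimize(
      loss, global_step=tf.compat.v1.train.get_global_step())
  return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

# Each trial gets its own hyperparameter dictionary through `params`
for learning_rate in [0.01, 0.1]:
  estimator = tf.estimator.Estimator(
      model_fn=model_fn,
      params={'num_units': 32, 'learning_rate': learning_rate})
  # estimator.train(input_fn=..., steps=...)  # train and evaluate per trial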

Upvotes: 1
