From cb9cd299af91303ec340f7bbee76f5897cdb3732 Mon Sep 17 00:00:00 2001
From: c-bata
Date: Thu, 23 Apr 2020 13:58:17 +0900
Subject: [PATCH] Update Katib document to describe algorithms and suggestions

---
 .../components/hyperparameter-tuning/experiment.md   | 11 ++++++-----
 .../components/hyperparameter-tuning/katib-config.md |  8 +++++++-
 2 files changed, 13 insertions(+), 6 deletions(-)

diff --git a/content/en/docs/components/hyperparameter-tuning/experiment.md b/content/en/docs/components/hyperparameter-tuning/experiment.md
index d4fc497a15..528122bfce 100644
--- a/content/en/docs/components/hyperparameter-tuning/experiment.md
+++ b/content/en/docs/components/hyperparameter-tuning/experiment.md
@@ -148,7 +148,7 @@ descriptions on this page:
 * [Random search](#random-search)
 * [Bayesian optimization](#bayesian)
 * [HYPERBAND](#hyperband)
-* [Hyperopt TPE](#tpe-search)
+* [Tree of Parzen Estimators (TPE)](#tpe-search)
 * [NAS based on reinforcement learning](#nas)
 
 More algorithms are under development. You can add an algorithm to Katib
@@ -181,7 +181,8 @@ sampling without replacement. Random search is therefore the best algorithm to
 use when combinatorial exploration is not possible. If the number of continuous
 variables is high, you should use quasi random sampling instead.
 
-Katib uses the [hyperopt](http://hyperopt.github.io/hyperopt/) optimization
+Katib uses the [hyperopt](http://hyperopt.github.io/hyperopt/) or
+[Goptuna](https://github.com/c-bata/goptuna) optimization
 framework for its random search.
 
 Katib supports the following algorithm settings:
@@ -303,13 +304,13 @@ thus for maximizing the number of configurations that it can
 evaluate. HYPERBAND also focuses on the speed of the search.
 
-#### Hyperopt TPE
+#### Tree of Parzen Estimators (TPE)
 
 The algorithm name in Katib is `tpe`.
 
 Katib uses the Tree of Parzen Estimators (TPE) algorithm in
-[hyperopt](http://hyperopt.github.io/hyperopt/). This method provides a
-[forward and reverse gradient-based](https://arxiv.org/pdf/1703.01785.pdf)
+[hyperopt](http://hyperopt.github.io/hyperopt/) or [goptuna](https://github.com/c-bata/goptuna).
+This method provides a [forward and reverse gradient-based](https://arxiv.org/pdf/1703.01785.pdf)
 search.
diff --git a/content/en/docs/components/hyperparameter-tuning/katib-config.md b/content/en/docs/components/hyperparameter-tuning/katib-config.md
index 819ab8383f..e583feaff4 100644
--- a/content/en/docs/components/hyperparameter-tuning/katib-config.md
+++ b/content/en/docs/components/hyperparameter-tuning/katib-config.md
@@ -114,7 +114,13 @@ All of these settings except **`image`** can be omitted. If you don't specify an
 
 1. `image` - Docker image name for the `random` suggestion.
 
-   **Must be specified**.
+   **Must be specified**. You can specify one of the following images:
+
+   - `suggestion-chocolate`: [chocolate](https://github.com/AIworx-Labs/chocolate) based suggestion service which supports `grid`, `chocolate-random`, `chocolate-quasirandom`, `chocolate-bayesian-optimization` and `chocolate-mocmaes`.
+   - `suggestion-goptuna`: [Goptuna](https://github.com/c-bata/goptuna) based suggestion service which supports `cmaes`, `tpe` and `random`.
+   - `suggestion-hyperband`: [HpBandSter](https://github.com/automl/HpBandSter) based suggestion service which supports `hyperband`.
+   - `suggestion-hyperopt`: [hyperopt](https://github.com/hyperopt/hyperopt) based suggestion service which supports `tpe` and `random`.
+   - `suggestion-skopt`: [scikit-optimize](https://github.com/scikit-optimize/scikit-optimize) based suggestion service which supports `bayesianoptimization`.
 
 1. `imagePullPolicy` - `Random` suggestion container
    [image pull policy](https://kubernetes.io/docs/concepts/configuration/overview/#container-images).