# Detailed Settings for Learning
PHYSBO performs hyperparameter learning (“learning”) for the Gaussian process, but the learning method itself has hyperparameters such as the learning rate for Adam.
Use physbo.misc.SetConfig to change these hyperparameters.
In most cases, you do not need to change the default values.
## Usage of SetConfig
First, create an instance with `config = physbo.misc.SetConfig()`. All parameters have default values, so modify only the parameters you want to change.
Pass the created `config` to the `config` argument of the `Policy` constructor.
For example, to set Adam’s learning rate to 0.01:
```python
config = physbo.misc.SetConfig()
config.learning.alpha = 0.01
policy = physbo.search.discrete.Policy(test_X, config=config)
```
You can also save parameters to an INI file and load them with `config.load("config.ini")`.
For example, create an INI file like the following:
```ini
[learning.adam]
alpha = 0.01
```
## Configurable Parameters
The INI file has a hierarchical structure of sections and keys.
Accordingly, SetConfig also has a hierarchical structure.
For example, the alpha key in the [learning.adam] section corresponds to the config.learning.alpha member variable (note that the variable name is learning rather than learning.adam).
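As a minimal illustration of this section/key hierarchy, the following sketch reads such an INI file with the standard library's `configparser` (PHYSBO's own loader may differ in details; this only shows how the `[learning.adam]` section and `alpha` key are organized):

```python
import configparser

# Sketch only: parse a minimal INI file like the one shown above.
# The section name [learning.adam] and the key alpha together
# identify one setting, which SetConfig exposes as the nested
# member variable config.learning.alpha.
ini_text = """\
[learning.adam]
alpha = 0.01
"""

parser = configparser.ConfigParser()
parser.read_string(ini_text)

alpha = parser.getfloat("learning.adam", "alpha")
print(alpha)  # 0.01
```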
Details of sections and parameters are as follows.
### [learning] – Common learning settings

Common settings for learning the Gaussian process regression model.
The learning method `method` currently supports three choices: `adam`, `bfgs`, and `batch`.
`adam` performs online learning; `bfgs` and `batch` perform batch learning.

| Key | Type | Default | Description |
|---|---|---|---|
| `method` | str | adam | Learning method. One of `adam`, `bfgs`, `batch` |
| `is_disp` | bool | True | Whether to display progress during learning |
| `interval` | int | 10 | Display interval (a report every `interval` iterations) |
| `num_init_params_search` | int | 20 | Number of iterations for the initial hyperparameter search |
### [learning.online] – Common online learning settings

Common settings for online learning.
Used when `method=adam`.

| Key | Type | Default | Description |
|---|---|---|---|
| `max_epoch` | int | 500 | Maximum number of epochs |
| `max_epoch_init_params_search` | int | 50 | Maximum number of epochs during the initial hyperparameter search |
| `batch_size` | int | 64 | Batch size |
| `eval_size` | int | 5000 | Number of samples used for evaluation |
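As a rough aid to intuition for how `max_epoch` and `batch_size` interact, the sketch below (a hypothetical helper, not part of PHYSBO) counts the gradient updates implied by the usual convention that one epoch visits every training point once in minibatches of `batch_size`:

```python
import math

# Hypothetical helper: the number of minibatch updates implied by
# max_epoch sweeps over n_data points in chunks of batch_size.
def num_updates(n_data, max_epoch=500, batch_size=64):
    return max_epoch * math.ceil(n_data / batch_size)

print(num_updates(1000))  # 500 epochs * 16 minibatches = 8000
```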
### [learning.adam] – Adam learning

Settings for Adam.
Used when `method=adam`.

| Key | Type | Default | Description |
|---|---|---|---|
| `alpha` | float | 0.001 | Learning rate |
| `beta` | float | 0.9 | Decay rate of the first-moment estimate |
| `gamma` | float | 0.999 | Decay rate of the second-moment estimate |
| `epsilon` | float | 1e-6 | Small constant added for numerical stability |
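These four keys correspond to the standard Adam update rule. Below is a minimal sketch of one Adam step for a single scalar parameter, assuming `beta` and `gamma` play the roles of Adam's first- and second-moment decay rates (often written β₁ and β₂); this is the textbook rule, not PHYSBO's internal implementation:

```python
# One textbook Adam update using the [learning.adam] defaults.
def adam_step(theta, grad, m, v, t, alpha=0.001, beta=0.9,
              gamma=0.999, epsilon=1e-6):
    m = beta * m + (1 - beta) * grad        # first-moment estimate
    v = gamma * v + (1 - gamma) * grad**2   # second-moment estimate
    m_hat = m / (1 - beta**t)               # bias corrections
    v_hat = v / (1 - gamma**t)
    return theta - alpha * m_hat / (v_hat**0.5 + epsilon), m, v

# Minimize f(x) = x^2 (gradient 2x) starting from x = 1.0.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
# x is now close to the minimum at 0
```

Note how a larger `alpha` speeds up each step while `epsilon` only guards against division by a near-zero second moment.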
### [learning.batch] – Batch / BFGS learning

Settings for `bfgs` and `batch`.
Used when `method=bfgs` or `method=batch`.

| Key | Type | Default | Description |
|---|---|---|---|
| `max_iter` | int | 200 | Maximum number of iterations |
| `max_iter_init_params_search` | int | 20 | Maximum number of iterations during the initial hyperparameter search |
| `batch_size` | int | 5000 | Batch size |
### [search] – Parameters used in Bayesian optimization

| Key | Type | Default | Description |
|---|---|---|---|
| `multi_probe_num_sampling` | int | 20 | Number of samples used when multiple candidates are proposed in one step (`num_search_each_probe` > 1) |
| `alpha` | float | 1.0 | Thompson sampling hyperparameter. Coefficient that rescales the standard deviation of the posterior distribution during sampling |
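The role of `alpha` can be illustrated with a hypothetical sketch (not PHYSBO's internal code): in Thompson sampling, each candidate's value is drawn from its posterior with the standard deviation scaled by `alpha`, and the candidate with the best draw is selected. Larger `alpha` therefore means more exploration; `alpha = 0` degenerates to pure exploitation of the posterior mean.

```python
import random

# Sketch: draw each candidate from N(mean, (alpha * std)^2) and
# return the index of the largest draw.
def thompson_pick(means, stds, alpha=1.0, rng=random):
    draws = [rng.gauss(mu, alpha * sd) for mu, sd in zip(means, stds)]
    return max(range(len(draws)), key=draws.__getitem__)

means = [0.2, 0.5, 0.4]
stds = [0.3, 0.1, 0.2]

# With alpha = 0 each draw collapses to the posterior mean, so the
# candidate with the largest mean is always chosen.
print(thompson_pick(means, stds, alpha=0.0))  # 1
```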