gpp_hyperparameter_optimization_demo
Contents:
gpp_hyperparameter_optimization_demo.cpp
moe/optimal_learning/cpp/gpp_hyperparameter_optimization_demo.cpp
This is a demo of the model selection (via hyperparameter optimization) capabilities present in this project. These capabilities live in gpp_model_selection.
In gpp_expected_improvement_demo, we chose the hyperparameters arbitrarily. Here, we walk through an example of how one would select hyperparameters for a given class of covariance function; SquareExponential will do. This demo supports:
- User-specified training data
- Randomly generated training data (more automatic)
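The choice between these two modes is a compile-time switch on the OL_USER_INPUTS define (see the footnote below). A minimal, hypothetical sketch of the pattern (everything except the OL_USER_INPUTS name is illustrative, and the demo's actual layout may differ)::

    // Hypothetical sketch of the OL_USER_INPUTS compile-time switch; all
    // names other than OL_USER_INPUTS itself are illustrative.
    #include <vector>

    #define OL_USER_INPUTS 0  // set to 1 to supply your own training data

    std::vector<double> GetSampleLocations() {
    #if OL_USER_INPUTS == 1
      // User-specified mode: hand-picked sample locations.
      return {0.0, 0.25, 0.5, 0.75, 1.0};
    #else
      // Automatic mode: the demo instead draws locations (and function values)
      // randomly, using a GP built from randomly chosen "source" hyperparameters.
      return {};  // placeholder; random generation elided here
    #endif
    }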
More details on the second case:
- Choose a set of hyperparameters randomly; these define the "source" covariance
- Build a fake* training set by drawing from a GP with the source covariance, at randomly chosen locations
  (*) By defining OL_USER_INPUTS to 1, you can specify your own input data.
- Choose a new random set of hyperparameters and run hyperparameter optimization
- Show the log likelihood under the optimized hyperparameters AND under the source hyperparameters (see the sketch after this list)
- Observe that with larger training sets, the optimized hyperparameters converge to the source values; with smaller training sets, other optima may exist
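The demo itself uses the evaluators in gpp_model_selection; the following standalone sketch only illustrates the quantity being compared in the steps above: the log marginal likelihood log p(y | X, theta) of the training data under a 1D squared exponential covariance, evaluated via a Cholesky factorization. All names here are hypothetical::

    // Standalone illustration only; not the demo's code.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Squared exponential kernel in 1D: k(x, x') = alpha * exp(-0.5*(x-x')^2/L^2).
    double SquareExponentialKernel(double x1, double x2, double alpha, double length) {
      const double d = (x1 - x2) / length;
      return alpha * std::exp(-0.5 * d * d);
    }

    // log p(y | X, theta) = -0.5*y^T K^{-1} y - 0.5*log|K| - 0.5*n*log(2*pi),
    // computed via the Cholesky factorization K = L*L^T.
    double LogMarginalLikelihood(const std::vector<double>& x,
                                 const std::vector<double>& y,
                                 double alpha, double length, double noise) {
      const int n = static_cast<int>(x.size());
      std::vector<double> K(n * n);
      for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
          K[i * n + j] = SquareExponentialKernel(x[i], x[j], alpha, length);
        }
        K[i * n + i] += noise;  // observation noise on the diagonal
      }
      // In-place Cholesky: lower triangle of K becomes L.
      for (int j = 0; j < n; ++j) {
        for (int k = 0; k < j; ++k) K[j * n + j] -= K[j * n + k] * K[j * n + k];
        K[j * n + j] = std::sqrt(K[j * n + j]);
        for (int i = j + 1; i < n; ++i) {
          for (int k = 0; k < j; ++k) K[i * n + j] -= K[i * n + k] * K[j * n + k];
          K[i * n + j] /= K[j * n + j];
        }
      }
      // Forward-substitute L*z = y; then y^T K^{-1} y = z^T z.
      std::vector<double> z(y);
      double log_det = 0.0;
      for (int i = 0; i < n; ++i) {
        for (int k = 0; k < i; ++k) z[i] -= K[i * n + k] * z[k];
        z[i] /= K[i * n + i];
        log_det += 2.0 * std::log(K[i * n + i]);
      }
      double quad = 0.0;
      for (int i = 0; i < n; ++i) quad += z[i] * z[i];
      const double kLog2Pi = std::log(2.0 * 3.14159265358979323846);
      return -0.5 * quad - 0.5 * log_det - 0.5 * n * kLog2Pi;
    }

    int main() {
      // Tiny training set; compare the log likelihood under two hyperparameter
      // guesses, standing in for "optimized" vs. "source" hyperparameters.
      const std::vector<double> x = {0.0, 0.5, 1.0, 1.5};
      const std::vector<double> y = {0.1, 0.9, 0.2, -0.7};
      std::printf("guess A: %f\n", LogMarginalLikelihood(x, y, 1.0, 0.4, 1.0e-3));
      std::printf("guess B: %f\n", LogMarginalLikelihood(x, y, 2.0, 1.5, 1.0e-3));
      return 0;
    }

Hyperparameter optimization (Newton, in this demo) searches over hyperparameters like alpha and the length scales to maximize this quantity.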
Further notes about (Newton) optimization performance and robustness are spread throughout the demo code, placed near the function calls and object constructions to which they are relevant.
Please read and understand gpp_expected_improvement_demo.cpp before going through this example. In addition, understanding the file comments of gpp_model_selection.hpp (and, for developers, the corresponding .cpp) is a prerequisite.
Defines:
- OL_USER_INPUTS
Functions:
- int main()