moe.tests.optimal_learning.python.cpp_wrappers package
Submodules
moe.tests.optimal_learning.python.cpp_wrappers.exception_test module
Tests for the C++-defined Python exception type objects.
moe.tests.optimal_learning.python.cpp_wrappers.expected_improvement_test module
Test the C++ implementation of expected improvement against the Python implementation.
- class moe.tests.optimal_learning.python.cpp_wrappers.expected_improvement_test.TestExpectedImprovement
Bases: moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestCase
Test C++ vs Python implementations of Expected Improvement.
Currently this only checks that the 1D, analytic EI and its gradient match. Checking the Monte Carlo path would be either very expensive (because the MC estimates must converge) or very difficult (Python and C++ would have to consume the exact same sequence of random numbers). The analytic formula being compared is sketched after this class.
- classmethod base_setup()
Run the standard setup but seed the RNG first (for repeatability).
It is easy to stumble into test cases where EI is very small (e.g., < 1.e-20), which makes it difficult to set meaningful tolerances for the checks.
- dim = 3
- gp_test_environment_input = <moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironmentInput object at 0x11d6199d0>
- noise_variance_base = 0.0002
- num_hyperparameters = 4
- num_sampled_list = (1, 2, 5, 10, 16, 20, 42, 50)
- precompute_gaussian_process_data = True
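For reference, the closed-form quantity these tests compare is the standard analytic (single-point) expected improvement under the GP posterior. A minimal, self-contained numpy/scipy sketch of that formula (the example numbers are illustrative only; the tests obtain the posterior mean, standard deviation, and incumbent best from the precomputed GP environments above)::

    import numpy
    from scipy.stats import norm

    def analytic_ei(mu, sigma, best_so_far):
        """Closed-form EI of one point whose GP posterior is N(mu, sigma**2).

        MOE minimizes, so improvement is max(best_so_far - y, 0).
        """
        if sigma == 0.0:
            return max(best_so_far - mu, 0.0)
        z = (best_so_far - mu) / sigma
        return (best_so_far - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Example: posterior mean 0.1, stddev 0.5, incumbent best 0.0
    print(analytic_ei(0.1, 0.5, 0.0))  # approximately 0.1534

The tests then assert that the Python and C++ values (and the analytic gradients) agree to tight tolerances at several random points, after seeding the RNG as in base_setup().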
moe.tests.optimal_learning.python.cpp_wrappers.gaussian_process_test module
Test the C++ implementation of Gaussian Process properties (mean, var, gradients thereof) against the Python version.
- class moe.tests.optimal_learning.python.cpp_wrappers.gaussian_process_test.TestGaussianProcess
Bases: moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestCase
Test C++ vs Python implementations of Gaussian Process properties (mean, variance, Cholesky variance, and their gradients).
- classmethod base_setup()
Run the standard setup but seed the RNG first (for repeatability).
It is easy to stumble into test cases where the mean and variance terms are very small (e.g., < 1.e-20), which makes it difficult to set meaningful tolerances for the checks.
- precompute_gaussian_process_data = True
- test_gp_add_sampled_points_singular_covariance_matrix()
Test that GaussianProcess.add_sampled_points indicates a singular covariance matrix when points_sampled contains duplicates (0 noise); see the sketch following this class.
- test_gp_construction_singular_covariance_matrix()
Test that the GaussianProcess constructor indicates a singular covariance matrix when points_sampled contains duplicates (0 noise).
- test_python_and_cpp_return_same_cholesky_variance_and_gradient()
Compare chol_var/grad chol_var results from Python and C++, checking several random points per test case.
- test_python_and_cpp_return_same_mu_and_gradient()
Compare mu/grad mu results from Python and C++, checking several random points per test case.
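The two singular-covariance tests rest on a simple linear-algebra fact: duplicated sampled points with zero noise produce identical rows in the covariance matrix, so it cannot be Cholesky-factored. A self-contained numpy sketch of that failure mode (the kernel hyperparameters here are illustrative, not the values the test environments use)::

    import numpy

    def square_exponential(x1, x2, alpha=1.0, length=0.5):
        """Square exponential covariance between two points."""
        r = numpy.asarray(x1) - numpy.asarray(x2)
        return alpha * numpy.exp(-0.5 * numpy.dot(r, r) / length ** 2)

    # Duplicate points_sampled with zero noise => two identical rows/columns.
    points = numpy.array([[0.3, 0.3, 0.3],
                          [0.3, 0.3, 0.3],
                          [0.9, 0.1, 0.5]])
    K = numpy.array([[square_exponential(a, b) for b in points] for a in points])

    # Cholesky factorization fails on the singular matrix; this is the condition
    # that GaussianProcess construction/add_sampled_points must surface.
    try:
        numpy.linalg.cholesky(K)
    except numpy.linalg.LinAlgError as err:
        print('singular covariance detected:', err)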
moe.tests.optimal_learning.python.cpp_wrappers.log_likelihood_test module
Test cases to check that C++ and Python implementations of moe.optimal_learning.python.interfaces.log_likelihood_interface match.
- class moe.tests.optimal_learning.python.cpp_wrappers.log_likelihood_test.TestLogLikelihood
Bases: moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestCase
Test that the C++ and Python implementations of the Log Marginal Likelihood match (value and gradient); the quantity being compared is sketched after this class.
- dim = 3
- gp_test_environment_input = <moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironmentInput object at 0x11d4c4910>
- noise_variance_base = 0.0002
- num_hyperparameters = 4
- num_sampled_list = (1, 2, 5, 10, 16, 20, 42)
- precompute_gaussian_process_data = False
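For reference, the quantity both implementations compute is the standard log marginal likelihood of the sampled values under the GP prior. A minimal, self-contained numpy sketch (the toy covariance matrix and observations are illustrative only; note the noise term on the diagonal, analogous to noise_variance_base above)::

    import numpy

    def log_marginal_likelihood(K, y):
        """log p(y | X) = -0.5*y'*K^{-1}*y - 0.5*log|K| - 0.5*n*log(2*pi).

        Computed via Cholesky for numerical stability, as GP code typically does.
        """
        n = y.shape[0]
        L = numpy.linalg.cholesky(K)
        alpha = numpy.linalg.solve(L.T, numpy.linalg.solve(L, y))
        log_det = 2.0 * numpy.sum(numpy.log(numpy.diag(L)))
        return -0.5 * y.dot(alpha) - 0.5 * log_det - 0.5 * n * numpy.log(2.0 * numpy.pi)

    K = numpy.array([[1.0002, 0.6],
                     [0.6, 1.0002]])  # prior covariance + noise on the diagonal
    y = numpy.array([0.3, -0.1])
    print(log_marginal_likelihood(K, y))

The tests check that this value and its gradient with respect to the hyperparameters match between the two implementations.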
moe.tests.optimal_learning.python.cpp_wrappers.optimization_test module
Tests for the cpp_wrappers.optimization module.
Currently these objects either inherit from Boost Python objects or are very thin data containers, so the tests just verify that objects are created and behave as expected. MOE does not yet support optimizing arbitrary Python functions through the C++-coded optimizers.
- class moe.tests.optimal_learning.python.cpp_wrappers.optimization_test.TestOptimizerParameters
Bases: object
Test that the various optimizer parameter classes (wrapping C++ objects) work.
- classmethod base_setup()
Set up dummy parameters for testing optimization parameter structs.
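A minimal sketch of the round-trip pattern such a test can follow, using a namedtuple as a hypothetical stand-in for a thin C++-backed parameter struct (the class and field names here are illustrative; the actual signatures live in cpp_wrappers.optimization)::

    from collections import namedtuple

    # Hypothetical stand-in for a wrapped C++ parameter struct such as
    # gradient-descent parameters; field names are illustrative only.
    GradientDescentParameters = namedtuple(
        'GradientDescentParameters',
        ['num_multistarts', 'max_num_steps', 'gamma', 'tolerance'],
    )

    def test_parameters_round_trip():
        params = GradientDescentParameters(
            num_multistarts=40, max_num_steps=250, gamma=0.7, tolerance=1.0e-7)
        # The real tests assert the wrapped struct echoes back what was set.
        assert params.num_multistarts == 40
        assert params.gamma == 0.7

    test_parameters_round_trip()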
Module contents
Tests checking that results (e.g., log likelihood, expected improvement, gradients thereof) computed by Python and C++ are the same.
These tests are just meant as an extra check for equivalence since we have two “identical” implementations of optimal_learning. The C++ is independently tested (see moe/optimal_learning/cpp/*test.cpp files) as is the Python* (see moe/tests/optimal_learning/python/python_version), so the tests in this package generally are not very exhaustive.
* The C++ tests are much more extensive than the Python tests, which still need substantial development.