moe.tests.optimal_learning.python package¶
Subpackages¶
- moe.tests.optimal_learning.python.cpp_unit_tests package
- moe.tests.optimal_learning.python.cpp_wrappers package
- Submodules
- moe.tests.optimal_learning.python.cpp_wrappers.exception_test module
- moe.tests.optimal_learning.python.cpp_wrappers.expected_improvement_test module
- moe.tests.optimal_learning.python.cpp_wrappers.gaussian_process_test module
- moe.tests.optimal_learning.python.cpp_wrappers.log_likelihood_test module
- moe.tests.optimal_learning.python.cpp_wrappers.optimization_test module
- Module contents
- moe.tests.optimal_learning.python.python_version package
- Submodules
- moe.tests.optimal_learning.python.python_version.covariance_test module
- moe.tests.optimal_learning.python.python_version.expected_improvement_test module
- moe.tests.optimal_learning.python.python_version.log_likelihood_test module
- moe.tests.optimal_learning.python.python_version.optimization_test module
- Module contents
Submodules¶
moe.tests.optimal_learning.python.comparison_test module¶
Tests for the functions/classes in comparison.py.
- class moe.tests.optimal_learning.python.comparison_test.ComparableTestObject(args, property_offset=0, function_offset=0)[source]¶
Bases: moe.optimal_learning.python.comparison.EqualityComparisonMixin
Object for testing equality comparisons.
- class moe.tests.optimal_learning.python.comparison_test.NotComparableObject[source]¶
Bases: object
Object with == and != disabled.
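For context, a minimal sketch of the pattern NotComparableObject exercises; the class below is a hypothetical stand-in, not the module's actual code:

```python
class NotComparable(object):

    """Hypothetical stand-in: both equality operators raise instead of comparing."""

    def __eq__(self, other):
        raise NotImplementedError('== is disabled for this type')

    def __ne__(self, other):
        raise NotImplementedError('!= is disabled for this type')
```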
moe.tests.optimal_learning.python.gaussian_process_test_case module¶
Base test case for tests that manipulate Gaussian Process data and supporting structures.
- class moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestCase[source]¶
Bases: moe.tests.optimal_learning.python.optimal_learning_test_case.OptimalLearningTestCase
Base test case for tests that want to use random data generated from Gaussian Process(es).
Users are required to set the precompute_gaussian_process_data class variable (a flag) and define the class variables gp_test_environment_input and num_sampled_list (see the base_setup() docstring).
Using that info, base_setup will create the required test cases (in gp_test_environments) for use in testing.
The idea is that base_setup is run once per test class, so the (expensive) work of building GPs can be shared across numerous individual tests. A usage sketch follows this class entry.
- classmethod base_setup()[source]¶
Build a Gaussian Process prior for each problem size in cls.num_sampled_list if precomputation is desired.
Requires
- cls.num_sampled_list: (list of int) problem sizes to consider
- cls.gp_test_environment_input: (GaussianProcessTestEnvironmentInput) specification of how to build the gaussian process prior
Outputs
- cls.gp_test_environments: (list of GaussianProcessTestEnvironment) gaussian process data for each of the specified problem sizes (cls.num_sampled_list)
- dim = 3¶
- gp_test_environment_input = <moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironmentInput object>¶
- noise_variance_base = 0.0002¶
- num_hyperparameters = 4¶
- num_sampled_list = (1, 2, 3, 5, 10, 16, 20, 42)¶
- num_to_sample_list = (1, 2, 3, 8)¶
- precompute_gaussian_process_data = False¶
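As a hedged illustration of the documented contract (the class names are real; every value below is a placeholder, and the constructor arguments mirror the GaussianProcessTestEnvironmentInput sketch further down), a subclass opting in to precomputation might look like:

```python
from moe.tests.optimal_learning.python.gaussian_process_test_case import (
    GaussianProcessTestCase,
    GaussianProcessTestEnvironmentInput,
)


class ExampleGaussianProcessTest(GaussianProcessTestCase):

    """Illustrative subclass; all values below are made up."""

    # Opt in to base_setup's once-per-class GP construction.
    precompute_gaussian_process_data = True

    # Problem sizes to build a GP prior for.
    num_sampled_list = (2, 5, 10)

    # Specification of how to build each gaussian process prior;
    # the remaining constructor arguments keep their documented defaults.
    gp_test_environment_input = GaussianProcessTestEnvironmentInput(
        dim=3,
        num_hyperparameters=4,
        num_sampled=10,
    )

    def test_example(self):
        # After base_setup() has run (once per class), gp_test_environments
        # holds one GaussianProcessTestEnvironment (domain, gaussian_process)
        # per entry of num_sampled_list.
        for domain, gaussian_process in self.gp_test_environments:
            pass  # exercise the GP here
```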
- class moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironment[source]¶
Bases: moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironment (the class subclasses a namedtuple of the same name)
An object for representing a (randomly generated) Gaussian Process.
Variables: - domain – (interfaces.domain_interface.DomainInterface subclass) domain the GP was built on
- gaussian_process – (interfaces.gaussian_process_interface.GaussianProcessInterface subclass) the constructed GP
- class moe.tests.optimal_learning.python.gaussian_process_test_case.GaussianProcessTestEnvironmentInput(dim, num_hyperparameters, num_sampled, noise_variance_base=0.0, hyperparameter_interval=ClosedInterval(min=0.2, max=1.3), lower_bound_interval=ClosedInterval(min=-2.0, max=0.5), upper_bound_interval=ClosedInterval(min=2.0, max=3.5), covariance_class=<class 'moe.optimal_learning.python.python_version.covariance.SquareExponential'>, spatial_domain_class=<class 'moe.optimal_learning.python.python_version.domain.TensorProductDomain'>, hyperparameter_domain_class=<class 'moe.optimal_learning.python.python_version.domain.TensorProductDomain'>, gaussian_process_class=<class 'moe.optimal_learning.python.python_version.gaussian_process.GaussianProcess'>)[source]¶
Bases: object
A test environment for constructing randomly generated Gaussian Process priors within GaussianProcessTestCase.
This is used only in testing. The intended use-case is that subclasses of GaussianProcessTestCase (above) will define one of these objects, and then GaussianProcessTestCase has some simple logic to precompute the requested GaussianProcess-derived test case(s).
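For reference, a construction sketch spelling out the documented defaults from the signature above (all values illustrative):

```python
from moe.optimal_learning.python.geometry_utils import ClosedInterval
from moe.tests.optimal_learning.python.gaussian_process_test_case import (
    GaussianProcessTestEnvironmentInput,
)

# Each keyword mirrors a default shown in the constructor signature above.
environment_input = GaussianProcessTestEnvironmentInput(
    dim=3,
    num_hyperparameters=4,
    num_sampled=20,
    noise_variance_base=0.0002,
    hyperparameter_interval=ClosedInterval(min=0.2, max=1.3),
    lower_bound_interval=ClosedInterval(min=-2.0, max=0.5),
    upper_bound_interval=ClosedInterval(min=2.0, max=3.5),
)
```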
moe.tests.optimal_learning.python.gaussian_process_test_utils module¶
Utilities for generating domains, covariance hyperparameters, and gaussian processes; useful primarily for testing.
By default, the functions in this file use the Python optimal_learning library (python_version package). Users can override this behavior with any implementation of the ABCs in the interfaces package.
- moe.tests.optimal_learning.python.gaussian_process_test_utils.build_random_gaussian_process(points_sampled, covariance, noise_variance=None, gaussian_process_type=<class 'moe.optimal_learning.python.python_version.gaussian_process.GaussianProcess'>)[source]¶
Utility to draw points_sampled.shape[0] points from a GaussianProcess prior, add those values to the GP, and return the GP.
This is mainly useful for testing or when “random” data is needed that will produce reasonably well-behaved GPs.
Parameters: - points_sampled (array of float64 with shape (num_sampled, dim)) – points at which to draw from the GP
- covariance (interfaces.covariance_interface.CovarianceInterface subclass composable with gaussian_process_type) – covariance function backing the GP
- noise_variance (array of float64 with shape (num_sampled)) – the \sigma_n^2 (noise variance) associated with the new observations, points_sampled_value
- gaussian_process_type (interfaces.gaussian_process_interface.GaussianProcessInterface subclass) – gaussian process whose historical data is being set
Returns: a gaussian process with the generated prior data
Return type: gaussian_process_type object
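A hedged sketch of typical use, building the default Python-implementation GP from random sample locations; the SquareExponential hyperparameter layout assumed below (signal variance followed by one length scale per dimension) is an assumption, not confirmed by this page:

```python
import numpy

from moe.optimal_learning.python.python_version.covariance import SquareExponential
from moe.tests.optimal_learning.python.gaussian_process_test_utils import (
    build_random_gaussian_process,
)

dim = 3
num_sampled = 10

# Hypothetical sample locations; any (num_sampled, dim) float array works.
points_sampled = numpy.random.uniform(-1.0, 1.0, size=(num_sampled, dim))

# Assumed hyperparameter layout: [signal variance, one length scale per dim].
covariance = SquareExponential(numpy.ones(dim + 1))

gaussian_process = build_random_gaussian_process(
    points_sampled,
    covariance,
    noise_variance=0.0002 * numpy.ones(num_sampled),  # optional, per the signature
)
```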
- moe.tests.optimal_learning.python.gaussian_process_test_utils.fill_random_covariance_hyperparameters(hyperparameter_interval, num_hyperparameters, covariance_type=<class 'moe.optimal_learning.python.python_version.covariance.SquareExponential'>)[source]¶
Generate random hyperparameters (drawn uniformly from the input interval) and return a covariance object with those hyperparameters.
This is mainly useful for testing or when “random” data is needed so that we get more varied cases than hyperparameters = 1.0.
Parameters: - hyperparameter_interval (ClosedInterval) – range, [min, max], from which to draw the hyperparameters
- num_hyperparameters (int > 0) – number of hyperparameters
- covariance_type (interfaces.covariance_interface.CovarianceInterface subclass) – covariance function whose hyperparameters are being set
Returns: covariance_type instantiated with the generated hyperparameters
Return type: covariance_type object
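For example, drawing four hyperparameters uniformly from [0.2, 1.3] for the default (Python SquareExponential) covariance; the interval and count are illustrative:

```python
from moe.optimal_learning.python.geometry_utils import ClosedInterval
from moe.tests.optimal_learning.python.gaussian_process_test_utils import (
    fill_random_covariance_hyperparameters,
)

# 4 hyperparameters suits a SquareExponential over 3 spatial dimensions
# (signal variance plus one length scale per dimension; an assumption here).
covariance = fill_random_covariance_hyperparameters(
    ClosedInterval(min=0.2, max=1.3),
    4,
)
```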
- moe.tests.optimal_learning.python.gaussian_process_test_utils.fill_random_domain_bounds(lower_bound_interval, upper_bound_interval, dim)[source]¶
Generate a random list of dim [min_i, max_i] pairs.
The data is organized such that:
- min_i \in [lower_bound_interval.min, lower_bound_interval.max]
- max_i \in [upper_bound_interval.min, upper_bound_interval.max]
This is mainly useful for testing or when “random” data is needed so that we get more varied cases than the unit hypercube.
Parameters: - lower_bound_interval (ClosedInterval) – a uniform range, [min, max], from which to draw the domain lower bounds, min_i
- upper_bound_interval (ClosedInterval) – a uniform range, [min, max], from which to draw the domain upper bounds, max_i
- dim (int > 0) – the spatial dimension of a point (i.e., number of independent params in experiment)
Returns: ClosedInterval objects with their min, max members initialized as described
Return type: list of ClosedInterval
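A sketch of typical use, pairing the generated bounds with the Python TensorProductDomain; that TensorProductDomain accepts a list of ClosedInterval is an assumption (consistent with its role as the documented default spatial_domain_class above), and all interval values are illustrative:

```python
from moe.optimal_learning.python.geometry_utils import ClosedInterval
from moe.optimal_learning.python.python_version.domain import TensorProductDomain
from moe.tests.optimal_learning.python.gaussian_process_test_utils import (
    fill_random_domain_bounds,
)

dim = 3
domain_bounds = fill_random_domain_bounds(
    ClosedInterval(min=-2.0, max=0.5),  # range for each lower bound, min_i
    ClosedInterval(min=2.0, max=3.5),   # range for each upper bound, max_i
    dim,
)

# The resulting list of ClosedInterval can back a spatial domain.
domain = TensorProductDomain(domain_bounds)
```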
moe.tests.optimal_learning.python.geometry_utils_test module¶
Tests for the functions/classes in geometry_utils.
- class moe.tests.optimal_learning.python.geometry_utils_test.TestClosedInterval[source]¶
Bases: moe.tests.optimal_learning.python.optimal_learning_test_case.OptimalLearningTestCase
Tests for ClosedInterval’s member functions.
- class moe.tests.optimal_learning.python.geometry_utils_test.TestGridPointGeneration[source]¶
Bases: moe.tests.optimal_learning.python.optimal_learning_test_case.OptimalLearningTestCase
Test the generation of an evenly spaced, axis-aligned grid on a hypercube.
- class moe.tests.optimal_learning.python.geometry_utils_test.TestLatinHypercubeRandomPointGeneration[source]¶
Bases: moe.tests.optimal_learning.python.optimal_learning_test_case.OptimalLearningTestCase
Test moe.optimal_learning.python.geometry_utils.generate_latin_hypercube_points.
http://en.wikipedia.org/wiki/Latin_hypercube_sampling
- From Wikipedia:
In the context of statistical sampling, a square grid containing sample positions is a Latin square if (and only if) there is only one sample in each row and each column. A Latin hypercube is the generalisation of this concept to an arbitrary number of dimensions, whereby each sample is the only one in each axis-aligned hyperplane containing it.
When sampling a function of N variables, the range of each variable is divided into M equally probable intervals. M sample points are then placed to satisfy the Latin hypercube requirements; note that this forces the number of divisions, M, to be equal for each variable. Also note that this sampling scheme does not require more samples for more dimensions (variables); this independence is one of the main advantages of this sampling scheme. Another advantage is that random samples can be taken one at a time, remembering which samples were taken so far.
- test_latin_hypercube_equally_spaced()[source]¶
Test that generate_latin_hypercube_points returns properly spaced points.
Sampling from a Latin hypercube yields a set of points that, in each dimension, are drawn uniformly from sub-intervals of the domain; this test verifies that every sub-interval in each dimension contains exactly one point.
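A sketch of the property being verified, assuming generate_latin_hypercube_points(num_points, domain_bounds) returns an array of shape (num_points, dim); the domain and point count are illustrative:

```python
import numpy

from moe.optimal_learning.python.geometry_utils import (
    ClosedInterval,
    generate_latin_hypercube_points,
)

num_points = 10
domain_bounds = [ClosedInterval(min=0.0, max=1.0)] * 2
points = generate_latin_hypercube_points(num_points, domain_bounds)

# Split each dimension into num_points equal sub-intervals; the Latin
# hypercube property says each sub-interval holds exactly one point.
for dim, interval in enumerate(domain_bounds):
    width = (interval.max - interval.min) / num_points
    indexes = numpy.floor((points[:, dim] - interval.min) / width).astype(int)
    # (A point exactly on interval.max would overflow the last bin; that is
    # vanishingly unlikely with random sampling, so it is ignored here.)
    assert sorted(indexes.tolist()) == list(range(num_points))
```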
moe.tests.optimal_learning.python.linkers_test module¶
Tests that linkers contain all possible types defined in constants.
- class moe.tests.optimal_learning.python.linkers_test.TestLinkers[source]¶
Bases: object
Tests that linkers contain all possible types defined in constants.
- test_covariance_links_have_all_covariance_types()[source]¶
Test each covariance type is in a linker, and every linker key is a covariance type.
- test_domain_links_have_all_domain_types()[source]¶
Test each domain type is in a linker, and every linker key is a domain type.
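Schematically, each of these tests reduces to a set-equality check between a constants list and a linker mapping. The names below are illustrative stand-ins, not the library's actual identifiers:

```python
# Stand-ins for the real constants/linkers (illustrative values only).
COVARIANCE_TYPES = ['square_exponential']
COVARIANCE_LINKS = {'square_exponential': object()}


def test_covariance_links_have_all_covariance_types():
    # Every covariance type is a linker key, and every linker key is a type.
    assert set(COVARIANCE_TYPES) == set(COVARIANCE_LINKS)
```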
moe.tests.optimal_learning.python.optimal_learning_test_case module¶
Base test case class for optimal_learning tests; includes some additional asserts for numerical tests.
TODO(GH-175): Generalize ping testing code used in some derivative tests (e.g., covariance, log likelihood pinging) to be more DRY (model after C++ test cases). We can set up one ping tester and just pass it objective functions.
- class moe.tests.optimal_learning.python.optimal_learning_test_case.OptimalLearningTestCase[source]¶
Bases: object
Base test case for the optimal_learning library.
This includes extra asserts for checking relative differences of floating point scalars/vectors and a routine to check that points are distinct.
- static assert_points_distinct(point_list, tol)[source]¶
Check whether the distance between every pair of points is larger than tolerance.
Parameters: - point_list (array of float64 with shape (num_points, dim)) – points to check
- tol (float64) – the minimum allowed (absolute) distance between points
Raise: AssertionError if any pair of points is not more than tolerance distance apart
- static assert_scalar_within_absolute(value, truth, tol)[source]¶
Check whether a scalar value is equal to truth: |value - truth| <= tol.
Parameters: - value (float64) – scalar to check
- truth – exact/desired result
- tol (float64) – max permissible absolute difference
Raise: AssertionError if value and truth are not equal to within tolerance
- static assert_scalar_within_relative(value, truth, tol)[source]¶
Check whether a scalar value is relatively equal to truth: |value - truth|/|truth| <= tol.
Parameters: - value (float64) – scalar to check
- truth – exact/desired result
- tol (float64) – max permissible relative difference
Raise: AssertionError if value and truth are not relatively equal
- static assert_vector_within_relative(value, truth, tol)[source]¶
Check whether a vector is element-wise relatively equal to truth: |value[i] - truth[i]|/|truth[i]| <= tol.
Parameters: - value (array of float64 with shape matching truth) – vector to check
- truth – exact/desired vector result
- tol (float64) – max permissible relative difference
Raise: AssertionError if value[i] and truth[i] are not relatively equal for any i
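For illustration, the asserts are static methods and can be invoked directly on the class (values below are made up):

```python
import numpy

from moe.tests.optimal_learning.python.optimal_learning_test_case import (
    OptimalLearningTestCase,
)

# Passes when |value - truth| / |truth| <= tol.
OptimalLearningTestCase.assert_scalar_within_relative(1.0000001, 1.0, 1.0e-6)

# Element-wise version; value and truth must have matching shapes.
OptimalLearningTestCase.assert_vector_within_relative(
    numpy.array([1.0000001, 2.0000002]),
    numpy.array([1.0, 2.0]),
    1.0e-6,
)
```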
Module contents¶
Testing code for the (Python) optimal_learning library.
Testing is done via the Testify package: https://github.com/Yelp/Testify
This package includes:
- Test cases/test setup files
- Tests for classes and utils in moe.optimal_learning.python
- Tests for classes and functions in moe.optimal_learning.python.python_version
- Tests for classes and functions in moe.optimal_learning.python.cpp_wrappers
Files in this package
- moe.tests.optimal_learning.python.optimal_learning_test_case: base test case for optimal_learning tests with some extra asserts for checking relative differences of floats (scalar, vector)
- moe.tests.optimal_learning.python.gaussian_process_test_case: test case for tests that manipulate GPs; includes extra logic to construct random gaussian process priors, meant to provide a well-behaved source of random data to unit tests.
- moe.tests.optimal_learning.python.gaussian_process_test_utils: utilities for constructing a random domain, covariance, and GaussianProcess
- moe.tests.optimal_learning.python.geometry_utils_test: tests for moe.optimal_learning.python.geometry_utils
Subpackages
- moe.tests.optimal_learning.python.python_version: tests for the Python implementation of optimal_learning. These include some manual checks, ping tests, and some high level integration tests. Python testing is currently relatively sparse; we rely heavily on the C++ comparison.
- moe.tests.optimal_learning.python.cpp_wrappers: tests that check the equivalence of the C++ implementation and the Python implementation of optimal_learning (where applicable). Also runs the C++ unit tests.