MOE
  • Why Do We Need MOE?
    • Other Methods
  • Install
    • Install in Docker
    • Install from source
    • OS X Tips
    • Building Boost
    • Linux Tips
    • CMake Tips
    • Python Tips
  • How does MOE work?
    • Build a Gaussian Process (GP) with the historical data
    • Optimize the hyperparameters of the Gaussian Process
    • Find the point(s) of highest Expected Improvement (EI)
    • Return the point(s) to sample, then repeat
  • Demo Tutorial
    • The Interactive Demo
  • Pretty Endpoints
  • Objective Functions
    • What is an objective function?
    • Properties of an objective function
    • Parameters
    • Φ Objective Functions
    • Examples of Objective Functions
  • Multi-Armed Bandits
    • What is the multi-armed bandit problem?
    • Applications
    • Policies
    • Pointers
  • Examples
    • Minimizing an arbitrary function
    • Gaussian Process regression given historical data
    • Hyperparameter optimization of a Gaussian Process
    • All above examples combined
    • Setting thresholds for advertising units
  • Contributing
    • Making a pull request
    • Documentation
    • Testing
    • Style
    • Versioning
    • Releasing (For Maintainers)
  • Frequently Asked Questions
    • What license is MOE released under?
    • When should I use MOE?
    • What is the time complexity of MOE?
    • How do I cite MOE?
    • Why does MOE take so long to return the next points to sample for some inputs?
    • How do I bootstrap MOE? What initial data does it need?
    • How many function evaluations do I need before MOE is “done”?
    • How many function evaluations do I perform before I update the hyperparameters of the GP?
    • Will you accept my pull request?
  • moe package
    • Subpackages
    • Submodules
    • moe.resources module
    • Module contents
  • moe_examples package
    • Subpackages
    • Submodules
    • moe_examples.bandit_example module
    • moe_examples.blog_post_example_ab_testing module
    • moe_examples.combined_example module
    • moe_examples.hyper_opt_of_gp_from_historical_data module
    • moe_examples.mean_and_var_of_gp_from_historic_data module
    • moe_examples.next_point_via_simple_endpoint module
    • Module contents
  • C++ Files
    • gpp_common
    • gpp_covariance
    • gpp_covariance_test
    • gpp_cuda_math
    • gpp_domain
    • gpp_domain_test
    • gpp_exception
    • gpp_expected_improvement_demo
    • gpp_expected_improvement_gpu
    • gpp_expected_improvement_gpu_test
    • gpp_geometry
    • gpp_geometry_test
    • gpp_heuristic_expected_improvement_optimization
    • gpp_heuristic_expected_improvement_optimization_test
    • gpp_hyper_and_EI_demo
    • gpp_hyperparameter_optimization_demo
    • gpp_linear_algebra
    • gpp_linear_algebra-inl
    • gpp_linear_algebra_test
    • gpp_logging
    • gpp_math
    • gpp_math_test
    • gpp_mock_optimization_objective_functions
    • gpp_model_selection
    • gpp_model_selection_test
    • gpp_optimization
    • gpp_optimization_test
    • gpp_optimizer_parameters
    • gpp_python
    • gpp_python_common
    • gpp_python_expected_improvement
    • gpp_python_gaussian_process
    • gpp_python_model_selection
    • gpp_python_test
    • gpp_random
    • gpp_random_test
    • gpp_test_utils
    • gpp_test_utils_test
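As a quick supplement to the "How does MOE work?" outline and the "Minimizing an arbitrary function" example listed above, here is a minimal sketch of the full optimization loop through MOE's easy_interface. Experiment, gp_next_points, and SamplePoint are the interface's documented names; the noisy function_to_minimize, the domain bounds, and the noise variance of 0.05 are arbitrary stand-ins for illustration, and a MOE server is assumed to be running locally (e.g. via the Docker install):

    import math
    import random

    from moe.easy_interface.experiment import Experiment
    from moe.easy_interface.simple_endpoint import gp_next_points
    from moe.optimal_learning.python.data_containers import SamplePoint


    def function_to_minimize(x):
        """A stand-in objective: any expensive, noisy black box works here."""
        return math.sin(x[0]) * math.cos(x[1]) + math.cos(x[0] + x[1]) + random.uniform(-0.02, 0.02)


    if __name__ == '__main__':
        # Define the tensor-product domain of the experiment: x0 in [0, 2], x1 in [0, 4].
        exp = Experiment([[0, 2], [0, 4]])
        # Bootstrap the GP with at least one observed point: (point, value, noise variance).
        exp.historical_data.append_sample_points(
            [SamplePoint([0, 0], function_to_minimize([0, 0]), 0.05)])

        for _ in range(20):
            # Ask the MOE server for the point of highest Expected Improvement.
            next_point = gp_next_points(exp)[0]
            # Sample the objective there and feed the result back into the GP.
            value = function_to_minimize(next_point)
            exp.historical_data.append_sample_points(
                [SamplePoint(next_point, value, 0.05)])
            print('f({0}) = {1:.6f}'.format(next_point, value))

Each pass through the loop mirrors the four steps outlined above: the accumulated historical data defines the Gaussian Process, the server tunes its hyperparameters and maximizes Expected Improvement, and the newly sampled result is appended to inform the next iteration.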
 
© Copyright 2012-2014 Yelp. MOE is licensed under the Apache License, Version 2.0: http://www.apache.org/licenses/LICENSE-2.0.
