
Information theory criteria for design space exploration #267

Open
ArnoStrouwen opened this issue Apr 12, 2021 · 2 comments

Comments

@ArnoStrouwen
Member

Taken from a discussion with @ludoro on SciML slack:

As an alternative to acquisition functions such as expected improvement, information-theory-based criteria, such as Fisher information or Shannon information, could be used to pick new points at which to evaluate the objective. Some of these methods are described in:
https://link.springer.com/chapter/10.1007/978-1-4939-8847-1_6
These information-theoretic methods have mostly been used to reduce prediction variance over the entire design space, but recently they have also been used to find the maximum of the objective:
https://www.tandfonline.com/doi/abs/10.1080/16843703.2018.1542965
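
A minimal sketch of what a Shannon-information criterion could look like, assuming a 1-D design space and a small squared-exponential GP surrogate written from scratch; the kernel, the `gp_posterior` and `max_entropy_point` helpers, and all hyperparameters are illustrative assumptions, not Surrogates.jl API. For a Gaussian prediction the entropy is 0.5·log(2πe·σ²), so maximizing the information gained from one new observation reduces to picking the candidate with the largest posterior variance:

```julia
using LinearAlgebra

# Squared-exponential kernel (illustrative hyperparameters)
k(x, y; ℓ = 0.3, σf = 1.0) = σf^2 * exp(-(x - y)^2 / (2ℓ^2))

# Posterior mean and covariance of a zero-mean GP at candidate points `xs`,
# conditioned on observations `y` taken at design points `X`
function gp_posterior(X, y, xs; σn = 1e-6)
    K   = [k(a, b) for a in X, b in X] + σn * I
    Ks  = [k(a, b) for a in xs, b in X]
    Kss = [k(a, b) for a in xs, b in xs]
    F   = cholesky(Symmetric(K))
    μ   = Ks * (F \ y)
    V   = Symmetric(Kss - Ks * (F \ Ks'))
    return μ, V
end

# Shannon-information criterion: the entropy of a Gaussian prediction is
# 0.5 * log(2πe * σ²), so the most informative single candidate is the one
# with the largest posterior variance.
max_entropy_point(X, y, candidates) =
    candidates[argmax(diag(last(gp_posterior(X, y, candidates))))]
```

A Fisher-information criterion would instead score candidates through the information matrix of a parametric model, rather than through the surrogate's predictive variance, but the "pick the most informative next point" structure is the same.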

@atiyo

atiyo commented Aug 3, 2021

To take this idea a bit further: the whole notion of efficient data acquisition for surrogates is essentially the same problem as active learning. From this perspective, one might dip into the active-learning literature for a whole set of applicable techniques (including the information-theoretic ones).

Edit: somehow submitted comment before I finished typing!
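
To make the active-learning connection concrete: the criterion sketched above becomes an uncertainty-sampling loop by repeatedly evaluating the objective at the current maximum-entropy candidate. This reuses the hypothetical `max_entropy_point` helper from the earlier sketch; `f`, the initial design, and the candidate grid below are stand-ins for an expensive objective.

```julia
f(x) = sin(3x) + 0.5x                        # hypothetical cheap stand-in objective
X = [0.1, 0.5, 0.9]                          # initial design points
y = f.(X)
candidates = collect(range(0.0, 1.0; length = 101))

for _ in 1:10
    xnew = max_entropy_point(X, y, candidates)   # most informative candidate
    push!(X, xnew)                               # query the objective there
    push!(y, f(xnew))
end
```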

@ArnoStrouwen
Member Author

https://bayesoptbook.com/
An overview of many interesting techniques, such as entropy search.
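
For entropy search specifically, the central quantity is the entropy of p(x* | data), the posterior belief over the location of the optimum; the acquisition then favours candidates whose observation is expected to shrink that entropy the most. As a rough illustration only (not the book's algorithm and not anything in Surrogates.jl), the entropy itself can be estimated by Monte Carlo, drawing posterior function samples on the candidate grid with the hypothetical `gp_posterior` above and recording where each sample attains its optimum:

```julia
# Entropy of the empirical distribution of the maximiser location, estimated
# from posterior (Thompson-style) function draws on the candidate grid.
function popt_entropy(X, y, candidates; nsamples = 500, jitter = 1e-6)
    μ, V = gp_posterior(X, y, candidates)
    L = cholesky(Symmetric(Matrix(V) + jitter * I)).L   # jitter for numerical PSD-ness
    counts = zeros(Int, length(candidates))
    for _ in 1:nsamples
        fsample = μ + L * randn(length(candidates))      # one posterior draw
        counts[argmax(fsample)] += 1                     # where this draw is maximised
    end
    p = counts ./ nsamples
    return -sum(pᵢ * log(pᵢ) for pᵢ in p if pᵢ > 0)
end
```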
