
Simplify configurable optimizer API #518

Merged · 7 commits into master · Feb 17, 2020
Conversation

@jrapin (Contributor) commented Feb 17, 2020

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Motivation and Context / Related issue

The simplified process means:

  • writing an optimizer with extra parameters (more than just parametrization, budget, num_workers)
  • specifying a ConfiguredOptimizer class, with the appropriate init (the extra parameters) and the right base algorithm, plus a nice docstring, since that is what will be visible in the documentation
  • that's it! Optionally, you can pre-create many variants using the configured optimizer class (see the sketch after this list)
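A minimal sketch of that process, with hypothetical names (_MyAlgo, MyAlgoConfig, BigAlgo) and an assumed ConfiguredOptimizer constructor contract; this is an illustration, not code from the PR:

from typing import Optional
from nevergrad.optimization import base  # assumed import path

class _MyAlgo(base.Optimizer):
    """Underlying optimizer with one extra parameter (scale)."""

    def __init__(self, parametrization, budget: Optional[int] = None,
                 num_workers: int = 1, scale: float = 1.0) -> None:
        super().__init__(parametrization, budget=budget, num_workers=num_workers)
        self.scale = scale  # the extra parameter, beyond the standard three

class MyAlgoConfig(base.ConfiguredOptimizer):
    """My algorithm, with a configurable scale (this docstring is what shows up in the docs)."""

    def __init__(self, *, scale: float = 1.0) -> None:
        # Assumption: the configured optimizer receives the underlying
        # optimizer class plus the configuration values.
        super().__init__(_MyAlgo, locals())

# Optional: pre-create and register a named variant.
BigAlgo = MyAlgoConfig(scale=10.0).set_name("BigAlgo", register=True)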

How Has This Been Tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CONTRIBUTING).
  • All tests passed, and additional code has been covered with new tests.

@jrapin requested a review from @teytaud February 17, 2020 10:36
@facebook-github-bot added the "CLA Signed" label Feb 17, 2020
@jrapin (Contributor, Author) left a comment:

Would something like this be more natural to use?

 budget: Optional[int] = None,
 num_workers: int = 1,
+scale: float = 1.0,
+diagonal: bool = False
Specify extra configuration parameters with defaults here (as you usually do)
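For context, a hedged reconstruction of the full __init__ this fragment presumably belongs to (the parametrization parameter and the body are assumptions; only the four keyword parameters appear in the diff):

def __init__(
    self,
    parametrization,  # assumed, as in the base Optimizer signature
    budget: Optional[int] = None,
    num_workers: int = 1,
    scale: float = 1.0,
    diagonal: bool = False,
) -> None:
    super().__init__(parametrization, budget=budget, num_workers=num_workers)
    self.scale = scale
    self.diagonal = diagonal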

@@ -239,8 +247,8 @@ def _internal_provide_recommendation(self) -> ArrayLike:
         return self.es.result.xbest  # type: ignore


-class ParametrizedCMA(base.ParametrizedFamily):
-    """TODO
+class ParametrizedCMA(base.ConfiguredOptimizer):
Then just create a wrapper class

-MilliCMA = ParametrizedCMA(scale=1e-3).with_name("MilliCMA", register=True)
-MicroCMA = ParametrizedCMA(scale=1e-6).with_name("MicroCMA", register=True)
+CMA = ParametrizedCMA().set_name("CMA", register=True)
+DiagonalCMA = ParametrizedCMA(diagonal=True).set_name("DiagonalCMA", register=True)
And derive the different variants by varying the values of the extra parameters
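Once registered, a variant is reachable by name like any other optimizer; a usage sketch, assuming nevergrad's registry API and a toy 2-dimensional objective:

import nevergrad as ng

# Look up the registered variant and instantiate it (assumed registry access).
opt = ng.optimizers.registry["DiagonalCMA"](parametrization=2, budget=100)
recommendation = opt.minimize(lambda x: float(sum((x - 0.5) ** 2)))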

@jrapin (Contributor, Author) left a comment:

I also updated an example you are familiar with: SplitOptimizer3, 5, 9, 13 :D

 ) -> None:
     super().__init__(parametrization, budget=budget, num_workers=num_workers)
-    if num_vars:
-    if num_optims:
+    if num_vars is not None:
@teytaud: "if num_vars" can lead to unexpected behavior when what you actually want is to test against None (an empty list is also falsy), so always use "if num_vars is not None"
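A tiny illustration of the pitfall:

num_vars = []                # explicitly provided, but falsy
if num_vars:                 # wrong: skipped for an empty list
    print("truthy check")
if num_vars is not None:     # right: only skipped when the argument was omitted
    print("None check")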

        num_vars: Optional[List[Any]] = None
    ) -> None:
        super().__init__(parametrization, budget=budget, num_workers=num_workers, num_optims=num_optims, num_vars=num_vars)

SplitOptimizer3 = ConfSplitOptimizer(num_optims=3).set_name("SplitOptimizer3", register=True)
This is all it takes now to have all the variants
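Following the same one-liner pattern, the remaining variants mentioned above would presumably read (names taken from the comment; parameters assumed):

SplitOptimizer5 = ConfSplitOptimizer(num_optims=5).set_name("SplitOptimizer5", register=True)
SplitOptimizer9 = ConfSplitOptimizer(num_optims=9).set_name("SplitOptimizer9", register=True)
SplitOptimizer13 = ConfSplitOptimizer(num_optims=13).set_name("SplitOptimizer13", register=True)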

@jrapin (Contributor, Author) commented Feb 17, 2020

Agreed by @teytaud; merging, and following up with other PRs to propagate.

@jrapin merged commit 825a0bf into master Feb 17, 2020
@jrapin deleted the configured branch February 17, 2020 11:51
@jrapin mentioned this pull request Feb 18, 2020