
Use standard initialization for PSO instead of full real space sampling #467

Merged (3 commits) on Jan 23, 2020

Conversation

@jrapin (Contributor) commented on Jan 19, 2020

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Motivation and Context / Related issue

PSO's initialization was very different from the other algorithms': it sampled in [0, 1] and transformed the result to the real space with an arctan-based transform instead of using a normal distribution. With this diff, PSO uses the sampling of the parametrization, which by default is a normal distribution, like the other algorithms.
Closes #296
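
The following is a minimal sketch (not the actual nevergrad code) of the two initialization schemes. The exact arctan-based map is an assumption here, inverted as tan(pi * (u - 0.5)), i.e. the Cauchy inverse CDF; the sketch illustrates why the old scheme yields an unusually high population std (the linked issue) while the new one stays close to 1.

```python
import numpy as np

rng = np.random.RandomState(0)
num_particles, dim = 40, 10

# Old scheme: sample uniformly in [0, 1], then push to the real line with
# the inverse of an arctan-based squashing (assumed here to be the Cauchy
# inverse CDF). The resulting population is heavy-tailed.
uniform = rng.uniform(size=(num_particles, dim))
old_init = np.tan(np.pi * (uniform - 0.5))

# New scheme: sample directly from the parametrization's default, a
# standard normal distribution, like the other algorithms.
new_init = rng.normal(size=(num_particles, dim))

print("old population std:", old_init.std())  # typically much larger than 1
print("new population std:", new_init.std())  # close to 1
```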

How Has This Been Tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CONTRIBUTING).
  • All tests passed, and additional code has been covered with new tests.

@jrapin requested a review from teytaud on January 19, 2020 at 17:21
@facebook-github-bot added the CLA Signed label on January 19, 2020
@teytaud (Contributor) commented on Jan 19, 2020

PSO was performing great on the deceptive benchmark; the question is whether the new version will be as good. Maybe I can check this for you?

@jrapin (Contributor, Author) commented on Jan 19, 2020

> PSO was performing great on the deceptive benchmark; the question is whether the new version will be as good. Maybe I can check this for you?

Yes, it's important to check. I expect PSO will not be as good: if it performs well currently, that may be because its initialization was completely different from the other algorithms', so the comparison was not fair.

@teytaud (Contributor) commented on Jan 20, 2020

As I'm underwater right now, I propose that we merge and see in the next global runs whether PSO is still dominant in the categories of problems for which it is currently dominant. Or we really wait a little bit; I have papers to finish :-)

Regarding fair comparisons: if the results with the new initialization are worse, then maybe we should keep a version with the old initialization somewhere.

@jrapin (Contributor, Author) commented on Jan 23, 2020

@teytaud now:

  • PSO uses the standard initialization
  • WidePSO uses the old initialization
  • all chainPSO variants use the standard PSO with the new initialization (see the sketch below)

Should I modify all chainPSO to use the "old" initialization with WidePSO?
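
For reference, a hedged usage sketch of the resulting variants, assuming both names are exposed in the nevergrad optimizer registry as described above (parameter names may vary across nevergrad versions):

```python
import nevergrad as ng

def sphere(x):
    """Simple objective used only for illustration."""
    return float((x ** 2).sum())

for name in ["PSO", "WidePSO"]:
    if name not in ng.optimizers.registry:
        continue  # skip variants not registered in this nevergrad build
    # PSO now starts from the parametrization's standard (normal) sampling,
    # while WidePSO keeps the old wide, arctan-based initialization.
    optimizer = ng.optimizers.registry[name](parametrization=10, budget=400)
    recommendation = optimizer.minimize(sphere)
    print(name, recommendation.value)
```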

@jrapin (Contributor, Author) commented on Jan 23, 2020

> Should I modify all chainPSO to use the "old" initialization with WidePSO?

approved as it is by Olivier

Labels
CLA Signed, Difficulty: Low, Priority: Low, Status: On hold, Type: Bug
Development

Successfully merging this pull request may close this issue: PSO has unusually high population std