
Enh/optimizer #6

Open · wants to merge 2 commits into master
Conversation

jakobj (Owner) commented May 29, 2018

This PR introduces PyTorch optimizers to SNES for updating mu of the search distribution. Instead of doing plain stochastic gradient descent, this opens up the possibility of using any of the optimizers available in PyTorch (https://pytorch.org/docs/stable/optim.html).
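For illustration, a minimal sketch of the idea (not the code from this PR; the dimensionality, toy objective, and utility scheme are placeholder assumptions): mu is kept as a torch tensor, the estimated natural gradient is written into mu.grad by hand, and optimizer_mu.step() applies the update, so any optimizer from torch.optim can be swapped in.

import numpy as np
import torch

dim = 10                                             # assumed problem dimension
mu = torch.zeros(dim, requires_grad=True)            # mean of the search distribution
sigma = np.ones(dim)                                 # per-dimension step sizes
optimizer_mu = torch.optim.Adam([mu], lr=1e-2)       # any optimizer from torch.optim

def fitness(x):                                      # toy objective (to be maximized)
    return -np.sum(x ** 2)

population_size = 20
for _ in range(100):
    s = np.random.randn(population_size, dim)        # standard-normal perturbations
    z = mu.detach().numpy() + sigma * s              # candidate solutions
    fitnesses = np.array([fitness(zi) for zi in z])
    order = np.argsort(-fitnesses)                   # simple rank-based utilities
    utility = np.zeros(population_size)
    utility[order] = np.linspace(1.0, -1.0, population_size) / population_size

    optimizer_mu.zero_grad()
    # write the (negated) natural-gradient estimate for mu into .grad by hand,
    # then let the torch optimizer perform the parameter update
    mu.grad = torch.from_numpy(-sigma * np.dot(utility, s)).float()
    optimizer_mu.step()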

@mschmidt87 Please have a look and let me know your thoughts on this.

mschmidt87 (Contributor) left a comment

Very nice. Would be nice if torch.autograd could automatically compute natural gradients.

optimizer_mu.step()

# manually update sigma
sigma *= np.exp(learning_rate_sigma / 2. * np.dot(utility, s ** 2 - 1))
Contributor


Would it make sense here to also update log(sigma) with a PyTorch optimizer? Or is that a bad idea due to numerical stability issues?
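A rough sketch of what that suggestion could look like (assumed names, learning rate, and placeholder samples, not code from this PR): parameterizing the step sizes as log_sigma makes a plain torch.optim.SGD step equivalent to the multiplicative update quoted above, and sigma = exp(log_sigma) stays positive by construction.

import numpy as np
import torch

dim = 10                                              # assumed problem dimension
population_size = 20
learning_rate_sigma = 0.1                             # assumed value
log_sigma = torch.zeros(dim, requires_grad=True)      # sigma = exp(log_sigma) > 0
optimizer_sigma = torch.optim.SGD([log_sigma], lr=learning_rate_sigma / 2.)

# placeholder samples and utilities standing in for one generation
s = np.random.randn(population_size, dim)
utility = np.random.randn(population_size) / population_size

optimizer_sigma.zero_grad()
# an SGD step on this gradient performs
#   log_sigma += learning_rate_sigma / 2. * np.dot(utility, s ** 2 - 1),
# i.e. the same update as sigma *= np.exp(...) above, but through torch.optim
log_sigma.grad = torch.from_numpy(-np.dot(utility, s ** 2 - 1)).float()
optimizer_sigma.step()

sigma = torch.exp(log_sigma).detach().numpy()         # recover sigma for sampling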
