
Global-local auto tune algorithm #121

Open
kazewong opened this issue May 26, 2023 · 1 comment
Comments

@kazewong (Owner)

Currently, one of the biggest obstacles to a smoother user experience is the absence of an auto-tuning capability: the user has to spend a lot of time tuning configuration parameters such as the number of loops and the number of local/global steps.

There are no good rules of thumb or metrics to tell users how to tune the algorithm, beyond increasing the number of chains or running it for longer.

There are two steps toward a better user experience:

  1. Provide metrics that inform the user of the quality of the run, such as monitoring the local R_hat vs. the global R_hat. The user can then tune the configuration parameters accordingly.
  2. Lay down an auto-tune strategy that takes care of the tuning automatically.
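A diagnostic like the one in step 1 could be sketched as follows. This is a minimal, hypothetical implementation of the classic Gelman-Rubin R_hat, assuming samples are stored as a `(n_chains, n_steps)` array per parameter; it is not the API of this repository, just an illustration of the metric one could compute separately on the local-phase and global-phase samples.

```python
import numpy as np

def r_hat(chains):
    """Gelman-Rubin potential scale reduction factor for one parameter.

    chains: array of shape (n_chains, n_steps). Values near 1 suggest
    the chains have mixed; values well above 1 suggest poor mixing.
    """
    n_chains, n_steps = chains.shape
    chain_means = chains.mean(axis=1)
    chain_vars = chains.var(axis=1, ddof=1)
    # Between-chain variance (scaled by chain length)
    B = n_steps * chain_means.var(ddof=1)
    # Mean within-chain variance
    W = chain_vars.mean()
    # Pooled estimate of the marginal posterior variance
    var_hat = (n_steps - 1) / n_steps * W + B / n_steps
    return np.sqrt(var_hat / W)

# Hypothetical usage: compare mixing of the local vs. global phases
# (local_chains / global_chains are assumed names, one parameter each):
# r_local = r_hat(local_chains)
# r_global = r_hat(global_chains)
```

A gap between the two values (e.g. local R_hat near 1 while global R_hat is large) would then point the user toward adjusting the global-step budget.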
@kazewong (Owner, Author)

It seems the global acceptance rate does increase somewhat monotonically over time, which could be a great metric for tracking the quality of the run.

It could be useful to just set a target global acceptance rate and use it as an early-stopping criterion.
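The early-stopping idea above could be sketched like this. Everything here is an assumption for illustration: `target`, `window`, and the surrounding loop are hypothetical names, not part of the current sampler API.

```python
def should_stop(global_accept_history, target=0.5, window=5):
    """Hypothetical early-stopping rule: stop the tuning loop once the
    running mean of the global acceptance rate over the last `window`
    loops reaches the target.

    global_accept_history: list of per-loop global acceptance rates.
    target, window: assumed knobs, not existing configuration options.
    """
    if len(global_accept_history) < window:
        return False
    recent = sum(global_accept_history[-window:]) / window
    return recent >= target

# Hypothetical driver loop (run_one_loop is a placeholder):
# history = []
# for loop in range(n_loops):
#     accept_rate = run_one_loop(...)
#     history.append(accept_rate)
#     if should_stop(history):
#         break
```

Averaging over a window rather than checking a single loop would make the criterion robust to the noise in per-loop acceptance estimates.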
