
hyper-parameters used in algorithm 2 #8

Open

Tsingularity opened this issue Jun 29, 2023 · 1 comment

Comments

@Tsingularity

Hi, thanks for the great work!

Just curious, what are the hyper-parameters used in Algorithm 2 (image below)? For example, how do you set the learning rate and the repeat time for each timestep? I went through the paper but didn't find any details about this. Could you please share them? Thanks!

[Image: Algorithm 2 from the paper]
@vvictoryuki (Owner)

@Tsingularity Thank you for recognizing our work! The hyperparameters in the algorithm are crucial for achieving good results. Regarding the repeat time, in our experiments, a higher number of repetitions usually does not harm the generation quality. Therefore, it is possible to search for a value that balances the computational cost and the quality of the generated results. However, the specific repeat time may vary depending on the dataset and conditions. As for setting the learning rate, it is a key factor in obtaining stable and high-quality generation results. The empirical strategies for its setting are relatively complex, and we are currently working on open-sourcing and sharing this part of the strategy (before the end of July). Please stay tuned for our future updates;)
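As a rough illustration of where these two hyper-parameters enter the sampling loop, here is a minimal sketch. It is not the authors' code and does not use the paper's actual settings; it only assumes Algorithm 2 follows the common energy-guided DDPM pattern with a per-timestep guidance step size (`lr[t]`) and a per-timestep repeat count (`repeat_times[t]`, the time-travel trick). The toy `eps_model`, `energy`, and the schedules shown are illustrative placeholders.

```python
import torch

T = 50                                  # number of diffusion steps (kept small for the sketch)
betas = torch.linspace(1e-4, 0.02, T)   # standard linear beta schedule (assumed)
alphas = 1.0 - betas
alphas_cumprod = torch.cumprod(alphas, dim=0)

def eps_model(x, t):
    """Placeholder for the pretrained noise-prediction network."""
    return torch.zeros_like(x)

def energy(x):
    """Placeholder for the condition energy E(x); its gradient steers sampling."""
    return (x ** 2).sum()

# Hypothetical schedules for the two hyper-parameters discussed above:
# a constant repeat count and a learning rate scaled by the noise level.
repeat_times = [3] * T
lr = [0.1 * float(1.0 - alphas_cumprod[t]) for t in range(T)]

x = torch.randn(1, 3, 32, 32)
for t in reversed(range(T)):
    for r in range(repeat_times[t]):
        x = x.detach().requires_grad_(True)
        eps = eps_model(x, t)
        a_t, a_bar = alphas[t], alphas_cumprod[t]
        # Standard DDPM posterior mean for x_{t-1}.
        mean = (x - betas[t] / torch.sqrt(1.0 - a_bar) * eps) / torch.sqrt(a_t)
        # Guidance: descend the energy gradient with the per-step learning rate.
        grad = torch.autograd.grad(energy(x), x)[0]
        mean = mean - lr[t] * grad
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x_prev = mean + torch.sqrt(betas[t]) * noise
        if r < repeat_times[t] - 1 and t > 0:
            # "Time travel": re-noise x_{t-1} back to x_t and redo this step.
            x = torch.sqrt(a_t) * x_prev + torch.sqrt(1.0 - a_t) * torch.randn_like(x_prev)
        else:
            x = x_prev
```

In this kind of loop, a larger repeat count mainly trades compute for quality (as the reply above notes), while the learning-rate schedule is the sensitive part; the noise-level scaling used here is only one plausible choice, not the strategy the authors plan to release.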
