About "finetune_norm" #143

Open

unnamed1024 opened this issue Jan 18, 2024 · 1 comment

Comments

@unnamed1024

Thanks for your code! First, I trained for about 300,000 epochs, but the PSNR stayed between 15 and 17. I wanted to do some fine-tuning and noticed a section in the code about fine-tuning the transformer parameters. But when I set "finetune_norm": true in the configuration file, I didn't find any transformer parameters in the parameter list.
What should I do if I want to improve my PSNR? Looking forward to your answer.
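For context, here is a minimal sketch of how a "finetune_norm"-style flag is commonly wired up in PyTorch training loops: parameters are frozen unless their names match a substring, and only the unfrozen ones are handed to the optimizer. This is an illustrative pattern, not the repository's exact implementation; the function name `build_optim_params` and the `"transformer"` substring match are assumptions based on the question.

```python
import torch.nn as nn

# Hypothetical illustration (not this repo's exact code): with a
# "finetune_norm"-style flag enabled, only parameters whose names
# contain "transformer" stay trainable and reach the optimizer.
def build_optim_params(net: nn.Module, finetune_norm: bool):
    optim_params = []
    for name, param in net.named_parameters():
        if finetune_norm:
            # Freeze everything except layers matching the substring.
            param.requires_grad = "transformer" in name
        else:
            param.requires_grad = True
        if param.requires_grad:
            optim_params.append(param)
            print(f"optimizing param: {name}")
    return optim_params
```

Under this pattern, if no parameter name contains "transformer", the flag silently produces an empty trainable list, which would match the behavior described above: nothing transformer-related appears in the parameter list.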

@One1209

One1209 commented Apr 11, 2024

300,000 epochs? Or 300,000 iterations?
