Add some information for SDXL training #431
base: main
Conversation
https://github.com/kohya-ss/sd-scripts/blob/main/docs/train_README-zh.md#%E4%BC%98%E5%8C%96%E5%99%A8%E7%9B%B8%E5%85%B3优化器相关
Regarding optimizer settings: use the --optimizer_args option to pass optional parameters to the optimizer in key=value format. Multiple values can be given, separated by commas. For example, to pass parameters to the AdamW optimizer, … When specifying optional parameters, check each optimizer's specification. The D-Adaptation optimizer adjusts the learning rate automatically; the value given with the learning-rate option is not the learning rate itself but the rate at which the learning rate determined by D-Adaptation is applied, so 1.0 is normally specified. If you want the Text Encoder's learning rate to be half of the U-Net's, specify … The option for automatic adjustment is similar to … If you do not want the learning rate adjusted automatically, add the optional parameter … Any optimizer can be used via … (it is only loaded internally through importlib; operation is unconfirmed. Install the package yourself if needed.)
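The key=value format described above (with comma-separated multiple values, as in betas=0.9,0.999) can be illustrated with a small parser sketch. The helper name parse_optimizer_args is hypothetical and not part of sd-scripts; this only demonstrates the argument format, not the script's actual implementation.

```python
def parse_optimizer_args(args):
    """Turn a list like ["weight_decay=0.01", "betas=0.9,0.999"]
    into a kwargs dict for an optimizer constructor.

    Comma-separated values become tuples, mirroring the
    key=value,value format described in the docs.
    Hypothetical helper for illustration only.
    """
    kwargs = {}
    for arg in args:
        key, value = arg.split("=", 1)
        parts = [float(p) for p in value.split(",")]
        kwargs[key] = tuple(parts) if len(parts) > 1 else parts[0]
    return kwargs

print(parse_optimizer_args(["weight_decay=0.01", "betas=0.9,0.999"]))
# {'weight_decay': 0.01, 'betas': (0.9, 0.999)}
```

The resulting dict could then be expanded into an optimizer constructor, e.g. torch.optim.AdamW(params, **kwargs).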
Oops, I accidentally changed it just now.
Mainly from
I am not sure whether xformers increases or decreases VRAM usage compared to native torch, so I did not write that in; the other added notes I tested myself without problems.
When I trained with v1.8.3 myself, xformers actually increased VRAM consumption.