
Training settings for a single task (i.e., semantic segmentation on ADE20K) in Painter #45

Open
yixuan730 opened this issue Jun 28, 2023 · 6 comments

Comments

@yixuan730

Hi, what are the specific training settings for single-task semantic segmentation on ADE20K?
Our current settings are: batch size = 8 (per GPU), accumulate iterations = 16, nodes = 2, base lr = 1e-3 (we also tried an actual lr of 1e-3, which did not work), epochs = 300, warmup epochs = 20, layer decay = 0.8.
However, this hyper-parameter setting does not work. Is there something we're missing?
Thank you very much!
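For reference, here is a minimal sketch of the arithmetic that typically sits behind the base-lr vs. actual-lr distinction in MAE-derived codebases (Painter's training code appears to inherit this convention, but treat that as an assumption): the actual lr is blr × effective batch size / 256, where the effective batch size folds in the per-GPU batch size, the gradient-accumulation steps, and the total number of GPUs. The GPU count per node is assumed to be 8 below, since it is not stated above.

```python
# A minimal sketch of the lr/batch-size arithmetic, assuming Painter follows the
# MAE convention of a base lr (blr) scaled linearly with the effective batch size:
#     actual_lr = blr * effective_batch_size / 256
# gpus_per_node = 8 is an assumption; the comment above only states nodes = 2.

def effective_batch_size(batch_per_gpu: int, accum_iter: int, num_gpus: int) -> int:
    """Number of images that contribute to a single optimizer step."""
    return batch_per_gpu * accum_iter * num_gpus


def scaled_lr(blr: float, eff_bs: int, base_bs: int = 256) -> float:
    """MAE-style linear lr scaling rule."""
    return blr * eff_bs / base_bs


gpus_per_node = 8  # assumption, not stated in the comment above
eff_bs = effective_batch_size(batch_per_gpu=8, accum_iter=16, num_gpus=2 * gpus_per_node)
print(eff_bs)                   # 2048
print(scaled_lr(1e-3, eff_bs))  # 0.008
```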

@yixuan730
Author

PS: we use mae_pretrain_vit_base.pth instead of the large model. Does the base model need different hyper-parameters?

@essunny310

I also trained Painter for single-task semantic segmentation on ADE20K.
My setting is: batch size = 1 (per GPU), accumulate iterations = 16; the other parameters are unchanged. The model was trained on a single machine with 4 GPUs (RTX 3090).

  • After 15 epochs (which is the default setting), the mIoU I got is 9.3
  • After 100 epochs, the mIoU I got is 32.2

Hi @yixuan730, could you please report the value you got? Thanks!

Hi @WXinlong, I was wondering if I missed something or if the model is sensitive to the batch size. Thank you so much!

@TXH-mercury

@essunny310 Similar situation here:

  • batch size 2 × accum 16 × 8 GPUs, 15 epochs, lr=1e-3: mIoU 12.8
  • batch size 2 × accum 16 × 8 GPUs, 15 epochs, lr=1e-4: mIoU 32

The visualizations from these two settings look similar: the color within each patch is completely uniform and shows no details.
I am trying a bigger batch size and more epochs.
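To make the comparison across the runs reported in this thread concrete, the same accounting can be applied to each setup. This is just a sketch of the effective-batch-size arithmetic from above; the GPU count for @yixuan730's run again assumes 8 GPUs per node, which is not stated.

```python
# Effective batch size per optimizer step for the setups reported in this thread,
# assuming eff_bs = batch_per_gpu * accum_iter * num_gpus (MAE-style accounting).
# The GPU count for @yixuan730's run (2 nodes x 8 GPUs) is an assumption.

setups = {
    "@yixuan730":   dict(batch_per_gpu=8, accum_iter=16, num_gpus=2 * 8),
    "@essunny310":  dict(batch_per_gpu=1, accum_iter=16, num_gpus=4),
    "@TXH-mercury": dict(batch_per_gpu=2, accum_iter=16, num_gpus=8),
}

for name, s in setups.items():
    eff_bs = s["batch_per_gpu"] * s["accum_iter"] * s["num_gpus"]
    print(f"{name}: effective batch size = {eff_bs}")

# Output:
# @yixuan730:   effective batch size = 2048
# @essunny310:  effective batch size = 64
# @TXH-mercury: effective batch size = 256
```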

@LIUTAOGE

LIUTAOGE commented Aug 4, 2023

Training on ADE20K doesn't work for me either. Has this been solved? Thanks.

@GZ-YourZY

[screenshot attached: Snipaste_2023-12-12_13-26-24]
The .pth file I obtained from training cannot run inference correctly; it returns images like the one above instead. What could be the cause?

@mrwu-mac

mrwu-mac commented Mar 5, 2024

(Quoting @TXH-mercury's comment above comparing the lr=1e-3 and lr=1e-4 runs.)

I ran into the same problem. Do you have any new results?
