Some weights of the model checkpoint were not used when initializing MGMLlamaForCausalLM #103

Open
charlesCXK opened this issue Apr 30, 2024 · 2 comments


@charlesCXK

When loading the pretrained checkpoint after stage 2, I get the following warning:

Some weights of the model checkpoint at /train_logs/MGM/MGM-7B/mgm_v7b_336_hr_768_stage2 were not used when initializing MGMLlamaForCausalLM: ['model.vision_tower.vision_tower.vision_model.encoder.layers.9.mlp.fc2.bias', 'model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.out_proj.weight', 'model.vision_tower.vision_tower.vision_model.encoder.layers.9.layer_norm2.weight', 'model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.k_proj.bias', 'model.vision_tower_aux.vision_stages.1.blocks.1.norm.weight', 'model.vision_tower_aux.vision_stages.2.blocks.4.mlp.fc1.weight', 'model.vision_tower.vision_tower.vision_model.encoder.layers.11.layer_norm2.bias'

However, when loading the checkpoint provided in this GitHub repo, there is no such warning.
Could you give me some advice on this?
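
For reference, this is a minimal sketch of how I compare the key sets of my stage-2 output against a local copy of the released checkpoint (the helper `checkpoint_keys` and the `./MGM-7B` path are just placeholders for this example; it assumes both directories contain standard Hugging Face `.bin` or `.safetensors` shards):

```python
import glob
import os

import torch
from safetensors import safe_open


def checkpoint_keys(ckpt_dir):
    """Collect every parameter name stored in a Hugging Face style checkpoint dir
    (handles both sharded .bin and .safetensors layouts)."""
    keys = set()
    for path in glob.glob(os.path.join(ckpt_dir, "*.safetensors")):
        # safetensors lets us read the key names without loading the tensors
        with safe_open(path, framework="pt", device="cpu") as f:
            keys.update(f.keys())
    for path in glob.glob(os.path.join(ckpt_dir, "*.bin")):
        keys.update(torch.load(path, map_location="cpu").keys())
    return keys


# My stage-2 output vs. a local copy of the released checkpoint (placeholder path).
mine = checkpoint_keys("/train_logs/MGM/MGM-7B/mgm_v7b_336_hr_768_stage2")
released = checkpoint_keys("./MGM-7B")

print("keys only in my checkpoint:", sorted(mine - released)[:20])
print("keys only in the released checkpoint:", sorted(released - mine)[:20])
```

Every key listed in the warning belongs to `model.vision_tower` or `model.vision_tower_aux`, so the vision tower weights appear to be saved in my stage-2 checkpoint but are not used when `MGMLlamaForCausalLM` is initialized.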

Thanks!

@yanwei-li
Member

Hi, please provide more details so we can locate the error, such as the exact training script and settings you used.

@charlesCXK
Author

Hi, I am using the official script without modifications (https://github.com/dvlab-research/MGM/blob/main/scripts/llama/train/stage_1_2_full_v7b_336_hr_768.sh).
