
Releases: Coobiw/MiniGPT4Qwen

MPP-Qwen14B pretrain/sft data and pretrain ckpt(linear projection weight)

14 Mar 19:16

pretrain:

  • model.pth: the linear projection weight saved after the MPP-Qwen14B pretraining stage.
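A released projection-only checkpoint like model.pth can be inspected before plugging it into the full model. The sketch below is illustrative: the checkpoint key layout ("model" wrapping a state dict) and the layer sizes are assumptions based on typical LAVIS/BLIP-2-style checkpoints, not the file's documented format; here a stand-in file is created first so the snippet is self-contained.

```python
import torch
import torch.nn as nn

# Stand-in for the released model.pth: a single linear projection from a
# hypothetical visual-encoder hidden size (768) to a hypothetical LLM
# hidden size (4096). The real sizes/keys may differ -- check the file.
proj = nn.Linear(768, 4096)
torch.save({"model": proj.state_dict()}, "model.pth")

# Load on CPU and list the parameter names and shapes.
ckpt = torch.load("model.pth", map_location="cpu")
state = ckpt["model"]
for name, tensor in state.items():
    print(name, tuple(tensor.shape))
```

Listing the keys first is a cheap sanity check that the projection weight matches the dimensions of the vision tower and LLM you intend to pair it with.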

llava_instruction_100k:

  • The complex_reasoning_77k and details_23k subsets of the LLaVA instruction-tuning data, converted into the MPP-Qwen14B format.

llava_pretrain_558k:

  • The 558K LLaVA pretraining dataset, converted into the MPP-Qwen14B format.
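Since both data releases are conversions of LLaVA JSON, a conversion step can be sketched as follows. The source side follows the public LLaVA record layout (id / image / alternating "human"/"gpt" conversations); the target schema shown here is a hypothetical stand-in for the MPP-Qwen14B format, which the released files define authoritatively.

```python
import json

def convert_record(rec):
    """Flatten a LLaVA-style record into image + (question, answer) turns.

    Target keys ("image", "conversation", "question", "answer") are
    hypothetical -- the released MPP-Qwen14B files are the reference.
    """
    turns = rec["conversations"]
    qa = []
    # LLaVA conversations alternate "human" and "gpt" turns.
    for human, gpt in zip(turns[::2], turns[1::2]):
        qa.append({
            "question": human["value"].replace("<image>", "").strip(),
            "answer": gpt["value"],
        })
    return {"image": rec["image"], "conversation": qa}

# Minimal LLaVA-style sample record (paths/ids are illustrative).
sample = {
    "id": "000000001",
    "image": "coco/train2017/000000001.jpg",
    "conversations": [
        {"from": "human", "value": "<image>\nWhat is in the picture?"},
        {"from": "gpt", "value": "A dog on a beach."},
    ],
}
print(json.dumps(convert_record(sample), indent=2))
```

Stripping the `<image>` placeholder out of the text turn and carrying the image path separately is the usual shape of such conversions; multi-turn records are handled by the pairwise zip.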

Instruction data, checkpoint and logs (also the 14B model)

24 Oct 17:39

Releases the instruction data used to align MiniGPT4 with the Qwen-Chat LLM, together with my checkpoint (trained for all 10 epochs with lavis/projects/instruction_tuning/train.yaml).

Also releases the MiniGPT4Qwen14B model checkpoint and training logs (trained for 20 epochs with lavis/projects/pp_qwen14b/train_pp.yaml). The files are packaged in pp_14b_ckpt-logs.zip.
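A hedged sketch of using these assets: the zip name and the YAML paths come from this release, but the training entry point (`train.py --cfg-path …`, LAVIS's standard launcher) is an assumption; check the repository README for the exact command and any distributed-launch flags.

```shell
# Unpack the 14B checkpoint and training logs from this release.
unzip pp_14b_ckpt-logs.zip -d pp_14b_ckpt-logs

# Reproduce training with the configs named above (LAVIS-style entry
# point assumed; the real launcher/flags may differ -- see the README).
python train.py --cfg-path lavis/projects/instruction_tuning/train.yaml
python train.py --cfg-path lavis/projects/pp_qwen14b/train_pp.yaml
```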