
# Checkpoints

We provide links to download our checkpoints, including models pretrained and finetuned on different tasks. If you would like to use OFA with Hugging Face Transformers, please download the checkpoints at https://huggingface.co/OFA-Sys and check the code in the feature/add_transformers branch.

## Pretraining

## Finetuning (OFA-Huge)

## Finetuning (OFA-Large)

## Finetuning (OFA-Base)

## Pretrained Language Models

To follow our multimodal pretraining, we suggest initializing from pretrained language models. Note that for the base-size and large-size models, we directly use BART-base and BART-large; for the other sizes, we pretrained tiny-size, medium-size, and huge-size OFA-based language models ourselves.
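The size-to-initialization correspondence above can be sketched as a small lookup. This is a minimal illustrative sketch, not part of the OFA codebase: the helper name `init_checkpoint` is hypothetical, and only the BART identifiers (`facebook/bart-base`, `facebook/bart-large`) are real Hugging Face model ids; the other entries are descriptive placeholders for the authors' own pretrained language models.

```python
# Illustrative mapping from OFA model size to the language model used for
# initialization, as described above. Base/Large initialize from BART;
# Tiny/Medium/Huge use OFA-based language models pretrained by the authors.
LM_INIT = {
    "tiny": "OFA-based language model (tiny, pretrained by the authors)",
    "medium": "OFA-based language model (medium, pretrained by the authors)",
    "base": "facebook/bart-base",
    "large": "facebook/bart-large",
    "huge": "OFA-based language model (huge, pretrained by the authors)",
}

def init_checkpoint(size: str) -> str:
    """Return the language-model checkpoint used to initialize an OFA model
    of the given size (hypothetical helper for illustration only)."""
    return LM_INIT[size.lower()]

print(init_checkpoint("base"))  # facebook/bart-base
```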