Single label pretraining on in21k using ViT #45

Open
cissoidx opened this issue Aug 31, 2021 · 0 comments

Hi, I have seen that you have updated the single-label pretraining script on in21k. This is really great work. I have some questions about pretraining ViT:

  1. The default setting is for tresnet_m; do you have the configs for vit-b-16, or are they actually the same? (See the sketch below for what I have in mind.)
  2. What accuracy does single-label pretraining reach on the validation set? In the table in your readme file, I see that with semantic loss ViT reaches 77.6%, and further finetuning on in1k reaches 84.4%. But what about the single-label pretrained models?

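For reference, here is roughly what I mean for question 1: a minimal sketch of instantiating ViT-B/16 via timm's model registry. The model name `vit_base_patch16_224` and the `NUM_CLASSES` value are my assumptions, not your actual config.

```python
# Sketch only: instantiating ViT-B/16 via timm so it can be dropped into the
# single-label pretraining script in place of tresnet_m.
import timm

NUM_CLASSES = 21841  # full ImageNet-21K (fall11) class count; adjust for your 21k variant

model = timm.create_model(
    'vit_base_patch16_224',  # timm's ViT-B/16 at 224x224 input resolution
    pretrained=False,        # pretraining from scratch on in21k
    num_classes=NUM_CLASSES,
)
```
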
cheers,
