A bug in VLT5TokenizerFast #21

Open
Neo-Zhangjiajie opened this issue Apr 12, 2022 · 1 comment
Neo-Zhangjiajie commented Apr 12, 2022

When I use VLT5TokenizerFast to encode a sentence, a token id 3 ('▁') appears before the id of each <extra_id_i> token. For example:

from tokenization import VLT5TokenizerFast
from transformers import T5TokenizerFast  # used for the comparison run below

# Load the VL-T5 fast tokenizer on top of the t5-base checkpoint
tokenizer = VLT5TokenizerFast.from_pretrained(
    't5-base',
    max_length=20,
    do_lower_case=False,
)

text = "I <extra_id_0> you."
input_ids = tokenizer.encode(text)
decoded_text = tokenizer.decode(input_ids)
print(text)
print(input_ids)
print(decoded_text)
print(tokenizer.convert_ids_to_tokens([3]))  # id 3 is the SentencePiece '▁' (space) token


(base) zhangjiajie@node2:~/VL-T5-Incontext/VL-T5-Incontext/src$ python test.py
The tokenizer class you load from this checkpoint is not the same type as the class this function is called from. It may result in unexpected tokenization. 
The tokenizer class you load from this checkpoint is 'T5Tokenizer'. 
The class this function is called from is 'VLT5TokenizerFast'.
I <extra_id_0> you.
[27, 3, 32099, 25, 5, 1]
I <extra_id_0> you.</s>
['▁']


If I use T5TokenizerFast instead, it works fine, and the output is:

(base) zhangjiajie@node2:~/VL-T5-Incontext/VL-T5-Incontext/src$ python test.py
I <extra_id_0> you.
[27, 32099, 25, 5, 1]
I<extra_id_0> you.</s>
['▁']

Is there any solution? Thanks!
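
A possible post-processing workaround (a sketch, not from the VL-T5 code; it assumes the sentinel <extra_id_*> tokens occupy ids 32000–32099, as they do for t5-base) is to drop the stray id 3 that immediately precedes a sentinel id:

# Workaround sketch (assumption, not part of the repo): remove the lone '▁'
# (id 3) that the fast tokenizer emits right before a sentinel <extra_id_*> id.
# For t5-base, the 100 sentinel tokens occupy ids 32000..32099.
SPACE_ID = 3
SENTINEL_IDS = set(range(32000, 32100))

def drop_space_before_sentinels(ids):
    cleaned = []
    for i, tok_id in enumerate(ids):
        if tok_id == SPACE_ID and i + 1 < len(ids) and ids[i + 1] in SENTINEL_IDS:
            continue  # skip the lone '▁' inserted before the sentinel
        cleaned.append(tok_id)
    return cleaned

print(drop_space_before_sentinels([27, 3, 32099, 25, 5, 1]))  # -> [27, 32099, 25, 5, 1]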

j-min (Owner) commented Aug 19, 2022

Could you please check the version of the transformers package? With transformers==4.2.1 (as specified in requirements.txt), both tokenizers yield the same results:

I <extra_id_0> you.
[27, 32099, 25, 5, 1]
I<extra_id_0> you.</s>
['']
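
For reference, a quick way to check which version is installed before re-running the snippet (the explicit pin below simply mirrors the requirements.txt entry mentioned above):

import transformers

# Print the installed transformers version; per the comment above, both tokenizers
# behave identically on 4.2.1, the version pinned in this repo's requirements.txt.
print(transformers.__version__)

# If it differs, reinstalling the pinned version should reproduce the expected output:
#   pip install transformers==4.2.1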
