Fine-tuned T5 detoxifier


  • We use T5-base as the base model.
  • It is fine-tuned on the ParaDetox dataset from Hugging Face.
  • The fine-tuned model is available for download from Hugging Face: link1 or link2.
  • Training in a Kaggle environment took about 5 hours and achieved good results.
  • This is instruction-based fine-tuning; PEFT is not applied here.
  • During data preparation, adding the instruction as a prefix to each example is good practice (see the sketches after this list).
    • Example: input: "Toxic version: i didnt vote for the liar", output: "Non-toxic version: I didn't vote for him"
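The sketch below illustrates the data-preparation and training steps described above with the Hugging Face datasets and transformers libraries. The instruction prefixes are taken from the example above; the dataset ID "s-nlp/paradetox", its column names, and all hyperparameters are assumptions for illustration, not the repository's exact configuration.

```python
# Minimal sketch: instruction-prefixed fine-tuning of T5-base on ParaDetox.
# Assumed (not stated in the README): dataset ID "s-nlp/paradetox",
# column names "en_toxic_comment" / "en_neutral_comment", hyperparameters.
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

MODEL_NAME = "t5-base"
INPUT_PREFIX = "Toxic version: "       # prefixes from the example above
TARGET_PREFIX = "Non-toxic version: "

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
dataset = load_dataset("s-nlp/paradetox")  # assumed Hub dataset ID

def preprocess(batch):
    # Prepend the instruction prefixes, then tokenize sources and targets.
    inputs = [INPUT_PREFIX + t for t in batch["en_toxic_comment"]]
    targets = [TARGET_PREFIX + n for n in batch["en_neutral_comment"]]
    model_inputs = tokenizer(inputs, max_length=128, truncation=True)
    labels = tokenizer(text_target=targets, max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
collator = DataCollatorForSeq2Seq(tokenizer, model=model)

args = Seq2SeqTrainingArguments(
    output_dir="t5-base-detox",
    learning_rate=2e-5,                # illustrative values only
    per_device_train_batch_size=16,
    num_train_epochs=3,
    fp16=True,                         # assumes a GPU (e.g. Kaggle); drop on CPU
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
    tokenizer=tokenizer,
)
trainer.train()
```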

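Once fine-tuned, the model is used like any other T5 seq2seq checkpoint: prefix the input with the same instruction used during training and generate. The model ID below simply mirrors the repository name and is an assumption; substitute the actual Hugging Face ID from the download links above.

```python
# Minimal inference sketch; the Hub ID is assumed, not confirmed by the README.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Ribin-Baby/detoxify_text_t5"  # assumed Hub ID (mirrors the repo name)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Use the same instruction prefix that was added during data preparation.
text = "Toxic version: i didnt vote for the liar"
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# Expected output along the lines of: "Non-toxic version: I didn't vote for him"
```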