# distilling-the-knowledge

Here are 6 public repositories matching this topic...


We begin with a company-name recognition task that has only small-scale, low-quality training data, then apply techniques to speed up model training and improve prediction performance with minimal manual effort. The methods we use involve lite pre-trained models such as Albert-small or Electra-small on a financial corpus, knowledge distillation an…

  • Updated Aug 10, 2020
  • Python
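The description above names knowledge distillation as its core technique. As a minimal sketch of what that objective typically looks like (this is not code from the repository; the function name, temperature `T`, and blend weight `alpha` are illustrative assumptions following Hinton et al., 2015):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target loss: KL divergence between temperature-softened
    # teacher and student distributions; scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target loss: ordinary cross-entropy against gold labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # Blend the two objectives; alpha weights the distillation term.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In a setup like the one described, the teacher would be the larger pre-trained model and the student a lite model such as Albert-small or Electra-small; the student is trained on this blended loss so it mimics the teacher's output distribution while still fitting the labels.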
