Pretraining on the 2015, 2019, and IDRiD datasets with ResNet-101 and ResNet-152, and fine-tuning on the 2019 dataset only
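As a rough illustration of the pretrain-then-fine-tune recipe this repository describes, the sketch below loads a ResNet-101 checkpoint produced by a pretraining stage and continues training on a single target dataset. The checkpoint path, class count, and training step are hypothetical placeholders, not the repository's code.

```python
# Minimal sketch of fine-tuning a pretrained ResNet-101; names and paths are
# illustrative placeholders, not taken from the repository.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # assumption: number of target classes in the fine-tuning dataset

model = models.resnet101()                                # architecture only
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)   # task-specific head
# Load weights saved by the pretraining stage (hypothetical checkpoint file).
model.load_state_dict(torch.load("pretrained_checkpoint.pth", map_location="cpu"))

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def fine_tune_step(images, labels):
    """One optimisation step on a batch drawn from the fine-tuning dataset."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```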
Dynamic Transfer Learning for Low-Resource Neural Machine Translation
Emergent Communication Pretraining for Few-Shot Machine Translation
A flexible class for training specific layers of deep neural networks in an online manner; supports Keras models. A minimal sketch of the idea follows.
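The description amounts to freezing most of a Keras model and updating only chosen layers as data arrives. The sketch below is my own illustration of that idea, with made-up layer names and synthetic batches, not the repository's class.

```python
# Minimal sketch: update only selected layers of a Keras model, one batch at a time.
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(20,), name="frozen_dense"),
    keras.layers.Dense(32, activation="relu", name="online_dense"),
    keras.layers.Dense(1, activation="sigmoid", name="head"),
])

# Freeze everything except the layers we want to keep training online.
for layer in model.layers:
    layer.trainable = layer.name in {"online_dense", "head"}

model.compile(optimizer="adam", loss="binary_crossentropy")

# Online training: one gradient step per incoming mini-batch (synthetic data here).
for _ in range(10):
    x_batch = np.random.rand(8, 20).astype("float32")
    y_batch = np.random.randint(0, 2, size=(8, 1)).astype("float32")
    model.train_on_batch(x_batch, y_batch)
```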
Research Code for NeurIPS 2020 Spotlight paper "Large-Scale Adversarial Training for Vision-and-Language Representation Learning": UNITER adversarial training part
An overview of German transformer models related to language modeling in the field of Natural Language Processing.
N-Gram Graph: Simple Unsupervised Representation for Graphs, NeurIPS'19 (https://arxiv.org/abs/1806.09206)
Useful for preparing machine learning datasets.
Code for generating a single image pretraining dataset
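As a hedged sketch of what generating a single-image pretraining dataset usually involves (not the repository's implementation; file names and crop counts are hypothetical), one can expand a single source image into many augmented crops:

```python
# Sketch: expand one source image into many augmented crops to serve as a
# small pretraining corpus. Paths and the number of crops are placeholders.
import os
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.08, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.4, 0.4, 0.4, 0.1),
])

source = Image.open("source_image.jpg").convert("RGB")
os.makedirs("single_image_dataset", exist_ok=True)
for i in range(1000):
    augment(source).save(f"single_image_dataset/{i:05d}.jpg")
```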
AAAI-20 paper: Cross-Lingual Natural Language Generation via Pre-Training
Research code for EMNLP 2020 paper "HERO: Hierarchical Encoder for Video+Language Omni-representation Pre-training"
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT. A sketch of the idea appears below.
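The core idea of untied positional encoding is to score word-to-word and position-to-position correlations with separate projections instead of adding position embeddings to the token embeddings. The numpy sketch below illustrates that scoring under my own arbitrary dimensions; it is not the paper's code and omits TUPE's special treatment of the [CLS] token.

```python
# Illustrative sketch of untied positional attention scores (not the paper's code).
import numpy as np

seq_len, d = 16, 64
x = np.random.randn(seq_len, d)   # token (content) representations
p = np.random.randn(seq_len, d)   # absolute position embeddings

def proj(dim_in, dim_out):
    return np.random.randn(dim_in, dim_out) / np.sqrt(dim_in)

Wq, Wk, Wv = proj(d, d), proj(d, d), proj(d, d)   # content projections
Uq, Uk = proj(d, d), proj(d, d)                   # separate positional projections

# Content-content and position-position terms are computed independently and
# summed; scaling by sqrt(2d) keeps the variance comparable to standard attention.
scores = ((x @ Wq) @ (x @ Wk).T + (p @ Uq) @ (p @ Uk).T) / np.sqrt(2 * d)

attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
output = attn @ (x @ Wv)          # attention-weighted values
```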
Official code for "Self-Supervised Learning by Estimating Twin Class Distribution".
Code for reproducing the paper "Improved Multilingual Language Model Pretraining for Social Media Text via Translation Pair Prediction", to appear at the 7th Workshop on Noisy User-generated Text (W-NUT) at EMNLP 2021.
XSRL (eXploratory State Representation Learning)
Python source code for EMNLP 2020 paper "Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT".