Repository for my master's thesis on adaptive pretraining for fact-checking tasks.
Predicting mutation effects based on deep mutational scanning data
Code for the paper: "Using Pre-training and Interaction Modeling for ancestry-specific disease prediction using multiomics data from the UK Biobank"
An overview of German transformer models related to language modeling in the field of Natural Language Processing.
An end-to-end masked contrastive video-and-language pre-training framework
🎓 Automatically updates CV papers daily using GitHub Actions (refreshed every 24 hours)
Overview of self-supervised video representation learning methods.
Pretraining GPT2 model on Basque language
Official repo for Directional Self-supervised Learning for Heavy Image Augmentations [CVPR2022]
StackMathQA: A Curated Collection of 2 Million Mathematical Questions and Answers Sourced from Stack Exchange
Collected corpus for named entity recognition pre-training
Code for reproducing the paper Improved Multilingual Language Model Pretraining for Social Media Text via Translation Pair Prediction to appear at The 7th Workshop on Noisy User-generated Text (W-NUT) organized at EMNLP 2021.
This repository illustrates object detection with a pretrained YOLOv3 model, using the COCO dataset labels to identify objects.
Official implementation of DPFM @ ICLR 2024 paper "Augmenting Math Word Problems via Iterative Question Composing"(https://arxiv.org/abs/2401.09003)
Fork of Official Implementation of Meta-Learning to Improve Pre-Training, NeurIPS'21 Poster. (https://arxiv.org/abs/2111.01754)
[NeurIPS 2022] DRAGON 🐲: Deep Bidirectional Language-Knowledge Graph Pretraining
Pretraining Techniques for Graph Transformers
Very incomplete right now: a pretrained ARGVAET system for generating, classifying, and predicting the properties of molecules. The dataset and checkpoints could not be uploaded due to size constraints.