ICLR 2024 Papers and Open-Source Projects Collection
Updated May 19, 2024
This project analyzes and classifies the BoolQ dataset from the SuperGLUE benchmark. We implemented several classifiers and techniques, including rule-based logic, BERT, an RNN, and GPT-3/4 data augmentation, achieving performance improvements.
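As a hedged illustration of what a rule-based baseline for BoolQ-style yes/no questions might look like (this is a hypothetical sketch, not the project's actual rules), one simple heuristic predicts "yes" when enough of the question's content words appear in the passage:

```python
# Hypothetical rule-based baseline for BoolQ-style yes/no questions:
# predict True when enough content words from the question occur in the
# passage. This is an illustrative heuristic, not the project's code.
import re

STOPWORDS = {"is", "the", "a", "an", "of", "in", "to",
             "does", "do", "did", "can", "was", "are"}

def content_words(text):
    """Lowercase alphabetic tokens with stopwords removed."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS}

def rule_based_boolq(question, passage, threshold=0.5):
    """Predict True if at least `threshold` of the question's content
    words occur in the passage; a crude lexical-overlap heuristic."""
    q = content_words(question)
    if not q:
        return True  # degenerate question: fall back to the majority class
    overlap = len(q & content_words(passage)) / len(q)
    return overlap >= threshold
```

Baselines like this are useful as a floor: BERT- or GPT-augmented classifiers should clearly beat simple lexical overlap before their gains are credited to the model.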
SkyPilot: Run LLMs, AI, and Batch jobs on any cloud. Get maximum savings, highest GPU availability, and managed execution—all with a simple interface.
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/
Corpus2GPT: A project that lets users train their own GPT models on diverse datasets, including local languages and various corpus types, using Keras with a TensorFlow, PyTorch, or JAX backend, and then store or share the trained models.
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inference.
Low-code framework for building custom LLMs, neural networks, and other AI models
The Arcee client for executing domain-adapted language model routines
Collection of text2cypher datasets, evaluations, and fine-tuning instructions
Examples for using the SiLLM framework for training and running Large Language Models (LLMs) on Apple Silicon
An efficient, flexible and full-featured toolkit for fine-tuning large models (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
Nvidia GPU exporter for prometheus using nvidia-smi binary
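One way such an exporter can work (an assumed design for illustration, not the linked project's actual code) is to invoke `nvidia-smi` with its `--query-gpu` CSV output and render the rows in the Prometheus text exposition format:

```python
# Sketch of the core of an nvidia-smi-based Prometheus exporter (assumed
# design for illustration): query GPU metrics as CSV and render them in
# the Prometheus text exposition format.
import subprocess

QUERY = "index,utilization.gpu,memory.used"  # real nvidia-smi query fields

def read_nvidia_smi():
    """Run nvidia-smi and return its raw CSV output (needs an NVIDIA GPU)."""
    return subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        text=True,
    )

def to_prometheus(csv_text):
    """Convert 'index, util, mem' CSV rows into exposition-format lines,
    one gauge sample per GPU, labeled by GPU index."""
    lines = []
    for row in csv_text.strip().splitlines():
        idx, util, mem = [field.strip() for field in row.split(",")]
        lines.append(f'nvidia_gpu_utilization{{gpu="{idx}"}} {util}')
        lines.append(f'nvidia_gpu_memory_used_mib{{gpu="{idx}"}} {mem}')
    return "\n".join(lines)
```

A real exporter would serve this text on an HTTP endpoint (conventionally `/metrics`) for Prometheus to scrape on a schedule.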
DLRover: An Automatic Distributed Deep Learning System
Linux LiveCD for offline AI training and inference.
Collection of best practices, reference architectures, model training examples and utilities to train large models on AWS.
Fast modular code to create and train cutting edge LLMs
Backend for the AI-copilot
Sequence Parallel Attention for Long Context LLM Model Training and Inference
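The core idea behind sequence parallelism, sketched below in NumPy purely for illustration (this is not the project's implementation), is that the query sequence can be split into chunks processed independently, each attending over the full key/value sequence, and the chunk outputs concatenated without changing the result:

```python
# Illustration (not the project's code) of the idea behind sequence-
# parallel attention: split the query sequence into chunks, compute
# attention for each chunk against the full keys/values on a separate
# worker, then concatenate. The result matches full attention exactly.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Standard scaled dot-product attention over one sequence."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def chunked_attention(q, k, v, n_chunks):
    """Process query chunks separately, as parallel workers would,
    each attending over the full key/value sequence."""
    outs = [attention(qc, k, v) for qc in np.array_split(q, n_chunks)]
    return np.concatenate(outs)
```

Because each query row's softmax depends only on that row's scores, splitting along the query dimension is exact; practical systems add communication (e.g. ring-passing K/V blocks) so no worker has to hold the full key/value sequence either.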