This is the repository holding code and data for "FrugalML: How to Use ML Prediction APIs More Accurately and Cheaply".
Bringing local LLMs to a Minecraft front-end through commands.
LLM Kit - Python Large Language Model Kit for generating data of your choice
Large Multi-Language Models for News Translation
AccIo - Enterprise LLM: Unifying intelligence at your command!
Python-based WebSocket server for CLI LLaVA inference.
Effortlessly create and manage your own AI infrastructure with Radiantloom AI. Privacy, security, and flexibility meet ease-of-use in this innovative open-source platform.
Mamba for Vision, Perception and Action
A detailed code explanation of Google's Gemini LLM.
In this workshop, we demonstrate how to choose the right container and instance types, optimize container parameters, set up the right autoscaling policies, and use APIs to get recommendations with Amazon SageMaker.
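As a rough illustration of the autoscaling step, the sketch below wires target-tracking scaling to a SageMaker endpoint variant through the Application Auto Scaling API. The endpoint and variant names are placeholders, and the target of 70 invocations per instance is an assumed example value, not a recommendation from the workshop.

```python
# Hypothetical sketch: target-tracking autoscaling for a SageMaker endpoint
# variant via the Application Auto Scaling API. Endpoint/variant names and
# the target value are placeholders to adapt to your workload.
def scaling_policy_config(invocations_per_instance: float,
                          scale_in_cooldown: int = 300,
                          scale_out_cooldown: int = 60) -> dict:
    """Build a target-tracking config keyed to invocations per instance."""
    return {
        "TargetValue": invocations_per_instance,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance",
        },
        "ScaleInCooldown": scale_in_cooldown,
        "ScaleOutCooldown": scale_out_cooldown,
    }

def register_autoscaling(endpoint: str, variant: str,
                         min_instances: int, max_instances: int) -> None:
    import boto3  # deferred so the module imports without the AWS SDK
    client = boto3.client("application-autoscaling")
    resource_id = f"endpoint/{endpoint}/variant/{variant}"
    # Declare the variant's instance count as a scalable resource.
    client.register_scalable_target(
        ServiceNamespace="sagemaker",
        ResourceId=resource_id,
        ScalableDimension="sagemaker:variant:DesiredInstanceCount",
        MinCapacity=min_instances,
        MaxCapacity=max_instances,
    )
    # Attach a target-tracking policy so SageMaker scales on traffic.
    client.put_scaling_policy(
        PolicyName=f"{endpoint}-target-tracking",
        ServiceNamespace="sagemaker",
        ResourceId=resource_id,
        ScalableDimension="sagemaker:variant:DesiredInstanceCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration=scaling_policy_config(70.0),
    )
```

Target tracking is the simplest of the policy types covered in the workshop; step scaling follows the same register/put pattern with a different policy body.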
How to stream LLM responses using AWS API Gateway Websockets and Lambda
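The core of that pattern can be sketched as a Lambda handler that pushes the model's reply to the client in chunks through the API Gateway Management API. The `reply` placeholder stands in for the actual LLM call, and the chunk size is an arbitrary example value.

```python
# Hypothetical sketch: a Lambda handler relaying LLM output to a WebSocket
# client in chunks via API Gateway's Management API. The generate step is a
# placeholder; only the routing and chunking are shown.
import json

def chunk_text(text: str, size: int = 64):
    """Split a reply into fixed-size chunks for incremental delivery."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def handler(event, context):
    import boto3  # deferred so the module imports without the AWS SDK
    ctx = event["requestContext"]
    gw = boto3.client(
        "apigatewaymanagementapi",
        endpoint_url=f"https://{ctx['domainName']}/{ctx['stage']}",
    )
    reply = "..."  # placeholder: invoke your LLM here
    # Push each chunk back down the caller's WebSocket connection.
    for chunk in chunk_text(reply):
        gw.post_to_connection(ConnectionId=ctx["connectionId"],
                              Data=chunk.encode("utf-8"))
    return {"statusCode": 200, "body": json.dumps({"sent": True})}
```

For token-level streaming, the same `post_to_connection` call would sit inside the model's token iterator instead of a post-hoc chunker.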
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent
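A minimal local chat loop along those lines might look like the sketch below, using llama-cpp-python's `create_chat_completion` message schema; the model path, context size, and system prompt are placeholder assumptions.

```python
# Hypothetical sketch of a minimal local chat loop with llama-cpp-python.
# The model path is a placeholder; messages follow the library's
# create_chat_completion schema ({"role": ..., "content": ...}).
def build_messages(history, user_input,
                   system="You are a helpful assistant."):
    """Assemble the message list expected by create_chat_completion."""
    return [{"role": "system", "content": system},
            *history,
            {"role": "user", "content": user_input}]

def chat_loop(model_path: str):
    from llama_cpp import Llama  # deferred; requires llama-cpp-python
    llm = Llama(model_path=model_path, n_ctx=2048)
    history = []
    while True:
        user_input = input("> ")
        out = llm.create_chat_completion(build_messages(history, user_input))
        reply = out["choices"][0]["message"]["content"]
        print(reply)
        # Keep the running transcript so the model sees prior turns.
        history += [{"role": "user", "content": user_input},
                    {"role": "assistant", "content": reply}]
```

llama-cpp-agent layers tool use and structured output on top of this same loop.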
This Python app generates a NIST 800-53 control implementation for each control and outputs a CSV file.
👻 Experimental library for scraping websites using OpenAI's GPT API.
A ChatGPT clone using HTML, CSS, JS, and Flask.
Super easy to use library for doing LLaMA/GPT-J stuff! - Mirror of: https://gitlab.com/niansa/libjustlm
A collection of examples for training or fine-tuning LLMs.
Code to benchmark APIs available from LLM vendors and demonstrate how they work.
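A vendor-agnostic latency benchmark reduces to timing a callable and summarizing the samples, as in this sketch; the commented-out `requests.post` line is an assumed example of what `call` might wrap, not any vendor's actual endpoint.

```python
# Hypothetical sketch: per-request latency benchmarking for an LLM HTTP API.
# The request itself is abstracted behind `call`; plug in any vendor client.
import time
import statistics

def summarize(latencies):
    """Reduce raw latencies (in seconds) to simple summary statistics."""
    return {
        "n": len(latencies),
        "mean": statistics.mean(latencies),
        "p50": statistics.median(latencies),
        "max": max(latencies),
    }

def benchmark(call, n: int = 10) -> dict:
    """Time n invocations of call() and summarize the results."""
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        call()  # e.g. requests.post(url, json=payload, timeout=30)
        latencies.append(time.perf_counter() - start)
    return summarize(latencies)
```

Comparing vendors fairly also means holding the prompt, max tokens, and region constant across runs; the timing harness is the easy part.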