
This repo is for v1 only and supports Bilibili and YouTube!

🤖 BibiGPT: One-Click AI Summary for Audio/Video & Chat with Learning Content https://bibigpt.co

🎉 Effortlessly summarize YouTube and Bilibili videos with our AI-driven video summarizer. It also works for podcasts, Twitter, meetings, lectures, TikTok videos, and more. Discover a smarter way to learn with ChatGPT, your AI-powered study companion! (Formerly BiliGPT, the "stream-saving tool & AI class representative".)

Alternate address: https://b.jimmylv.cn · Browser extension: https://bibigpt.co/extension


🤖 BibiGPT · One-Click AI Summary for Audio/Video Content & Chat https://bibigpt.co

🎉 One-click ChatGPT AI summaries of audio and video, for effortless learning from Bilibili, YouTube, local video files, local audio files, podcasts, Xiaohongshu, Douyin, meetings, lectures, web pages, and any other content. BibiGPT aims to be the best AI learning assistant, and a free trial is available. (Formerly BiliGPT, the "stream-saving tool & AI class representative"; supports iOS Shortcuts and a WeChat service account.)


🎬 This project uses AI to summarize YouTube/Bilibili/Twitter/TikTok/podcast/lecture/meeting/... videos and audio for you.

🤯 Inspired by Nutlope/news-summarizer & zhengbangbo/chat-simplifier & lxfater/BilibiliSummary

BibiGPT, the audio/video summarization tool

🚀 First launch: [BibiGPT] AI automatically summarizes Bilibili video content, with GPT-3 extracting and summarizing the subtitles

How it works

This project uses the OpenAI ChatGPT API (specifically gpt-3.5-turbo), Vercel Edge Functions with streaming, and Upstash Redis for caching and rate limiting. It fetches the content of a Bilibili or YouTube video, sends it in a prompt to the OpenAI API to be summarized inside a Vercel Edge Function, and then streams the response back to the application.
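A minimal sketch of what such an edge function might look like, written with the openai Node SDK. The route path, prompt wording, and request shape are illustrative assumptions, not the repository's actual code:

// app/api/summarize/route.ts (illustrative path)
import OpenAI from "openai";

export const runtime = "edge"; // run on Vercel Edge rather than a Node server

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  // `subtitles` is assumed to be the transcript fetched from Bilibili/YouTube
  const { subtitles } = await req.json();

  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    stream: true, // stream tokens back instead of waiting for the full summary
    messages: [
      { role: "user", content: `Summarize the following video transcript:\n${subtitles}` },
    ],
  });

  // Re-emit the streamed tokens as the HTTP response body
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      for await (const chunk of completion) {
        controller.enqueue(encoder.encode(chunk.choices[0]?.delta?.content ?? ""));
      }
      controller.close();
    },
  });

  return new Response(stream, { headers: { "Content-Type": "text/plain; charset=utf-8" } });
}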

Saving costs

Projects like this can get expensive, so if you want to build your own version and share it publicly, I recommend three things to keep costs down (a rough sketch follows the list):

  1. Implement rate limiting so people can't abuse your site.
  2. Implement caching to avoid expensive AI re-generations.
  3. Use text-curie-001 instead of text-davinci-003 in the summarize edge function.
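The first two points can be wired up with Upstash directly from an edge function. In this sketch the key names, limits, and helper function are illustrative assumptions rather than the project's actual configuration:

import { Redis } from "@upstash/redis";
import { Ratelimit } from "@upstash/ratelimit";

const redis = Redis.fromEnv(); // reads UPSTASH_REDIS_REST_URL / UPSTASH_REDIS_REST_TOKEN

// Allow e.g. 10 summaries per IP per day (the exact limit is up to you)
const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.fixedWindow(10, "1 d"),
});

export async function summarizeWithGuards(
  ip: string,
  videoId: string,
  summarize: () => Promise<string>
) {
  // 1. Rate limit by caller IP
  const { success } = await ratelimit.limit(ip);
  if (!success) throw new Error("Rate limit exceeded");

  // 2. Serve a cached summary if this video has already been paid for
  const cached = await redis.get<string>(`summary:${videoId}`);
  if (cached) return cached;

  const summary = await summarize();
  await redis.set(`summary:${videoId}`, summary, { ex: 60 * 60 * 24 * 7 }); // keep for a week
  return summary;
}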

Running Locally

After cloning the repo, create an account at OpenAI and put your API key in a file called .env (see ./example.env for the expected variable names).

Then run the application from the command line; it will be available at http://localhost:3000.

The detailed setup procedure is described in this document (Chinese version).

npm run dev
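If the app fails with an authentication error, first check that the key from .env is actually being loaded. A tiny sketch of such a guard in any server-side module, assuming the variable is named OPENAI_API_KEY (check ./example.env for the exact names the project expects):

// Fail fast if the OpenAI key was not loaded from .env
if (!process.env.OPENAI_API_KEY) {
  throw new Error("Missing OPENAI_API_KEY: add it to your .env file");
}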

Deployment

Deploy the example using Vercel

Set up the environment variables by following the ./example.env file.

Docker Support

#133

# make sure the .env file is set up first
docker compose up -d

Support -> Contact Me

Star History

Star History Chart

Contributors

This project exists thanks to all the people who contribute.