Releases: OpenInterpreter/open-interpreter
Vision I (Quick Fixes II)
- An issue with UNIX files has been resolved (#748)
- Experimental support for Python in `--vision` mode has been added
Full Changelog: v0.1.13...v0.1.14
Vision I (Quick Fixes I)
Quick fix for `--vision` support on Windows. File paths should now be properly recognized and loaded into the model.
Full Changelog: v0.1.12...v0.1.13
Vision I
A quick one, a fun one. Added experimental vision support for OpenAI users:

`interpreter --vision`
Drag files / screenshots into your terminal to use it. Also supports reflection for HTML. (It can see the designs it produces!)
Vision II will introduce support for reflective vision in many more languages.
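Dragging a file into a terminal pastes its path into the prompt, so the client just has to spot image-looking paths in the message. A minimal sketch of that idea (a hypothetical helper, not the project's actual implementation; the extension list is an assumption):

```python
import re

# Hypothetical sketch (not Open Interpreter's actual code): scan a chat
# message for tokens that look like image-file paths, as they appear
# after dragging a file into the terminal. Extension list is assumed.
IMAGE_PATH = re.compile(r"\S+\.(?:png|jpe?g|gif|webp)\b", re.IGNORECASE)

def extract_image_paths(message: str) -> list[str]:
    """Return image-looking paths found anywhere in the message."""
    return IMAGE_PATH.findall(message)
```

A real implementation would also need to handle quoted paths containing spaces, which terminals produce when dragging such files.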
What's Changed
- Get more system info in debug mode by @Notnaton in #701
- Quickfix .lower() by @Notnaton in #744
- Unsupported lang fix by @CyanideByte in #745
New Contributors
- @CyanideByte made their first contribution in #745
Full Changelog: v0.1.11...v0.1.12
Local II Update
- Local mode is now powered by LM Studio. Running `--local` will tell you how to set up LM Studio and connect to it automatically.
- It's way smaller. Removed the massive local embedding model, chromadb, oobabooga, and a bunch of other packages we didn't really need. Semgrep is now optional.
- The system message is tighter, so it's cheaper + faster on any LLM.
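LM Studio serves local models behind an OpenAI-compatible HTTP endpoint, so a client can talk to it with ordinary chat-completion payloads. A minimal sketch of building such a request (the `localhost:1234` address is an assumption based on LM Studio's default; nothing is actually sent here):

```python
import json
from urllib import request

# Sketch of a chat-completion request against LM Studio's
# OpenAI-compatible local server. The default address is assumed,
# and this snippet only builds the request without sending it.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_request(messages: list[dict], temperature: float = 0.0) -> request.Request:
    payload = {
        "messages": messages,
        "temperature": temperature,
        "stream": False,
    }
    return request.Request(
        BASE_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request([{"role": "user", "content": "Hello"}])
```

Because the wire format matches OpenAI's, the same client code works whether the model is hosted or running locally.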
Several crashes have also been resolved, temperature is now properly set to 0 (which should improve performance on OpenAI models), PowerShell is now supported on Linux, an ugly print statement was removed, we're now enforcing a consistent code style (`black`, `isort`), and much more:
What's Changed
- Fix a crash in get_relevant_procedures_string by @Notnaton in #676
- Fixed issue with 3.5 Models hallucinating a function called "python", causing empty returns by @Cobular in #683
- Fixed Issue #661 by @shubhexists in #687
- Consider hiding the api-key by @Notnaton in #682
- Remove space from language by @Notnaton in #680
- Update README.md by @Shivam250702 in #677
- docs: fix typo in docs/WINDOWS.md by @suravshresth in #658
- Update MACOS.md by @ihgalis in #471
- Update README.md with ref to MACOS.md by @ihgalis in #472
- Clarity updates to README.md by @shruti222patel in #674
- fix: make sure safe_mode is disabled by default by @ericrallen in #705
- Update README.md to say Mistral 7B by @Maclean-D in #690
- fix: default to 0 if temperature is unset by @ericrallen in #710
- chore: add better docs for safe_mode; load semgrep if available by @ericrallen in #709
- chore: remove extraneous chroma package; clean up dependency list by @ericrallen in #704
- fix: make all local models run through the html entity buffer by @ericrallen in #703
- Fix GNU readline support for user input by @cowile in #522
- docs: fix markdown formatting; better explain %tokens by @ericrallen in #695
- fix: clarify that %tokens estimates token usage for next request by @ericrallen in #694
- Proposal: Enforce consistent code style with black, isort, and pre-commit by @ericrallen in #699
- Fix for failing Powershell code execution on Linux by @thefazzer in #729
- Update README.md for sample FastAPI by @AndrewNgo-ini in #727
- Update MACOS.md: fix command by @pjpjq in #725
- update Readme_ZH.md by @HashCookie in #712
New Contributors
- @Notnaton made their first contribution in #676
- @Cobular made their first contribution in #683
- @shubhexists made their first contribution in #687
- @Shivam250702 made their first contribution in #677
- @suravshresth made their first contribution in #658
- @shruti222patel made their first contribution in #674
- @cowile made their first contribution in #522
- @thefazzer made their first contribution in #729
- @AndrewNgo-ini made their first contribution in #727
- @pjpjq made their first contribution in #725
- @HashCookie made their first contribution in #712
Full Changelog: v0.1.10...v0.1.11
Great work everyone!
v0.1.10
Bug fixes, pinned LiteLLM to prevent printed stream issue.
What's Changed
- Fix "depracated" typo by @jamiew in #642
- Fix issue #635 by @leifktaylor in #643
- Fix typo in setup_text_llm.py by @eltociear in #632
- Fix indentation in language_map.py by @smwyzi in #648
New Contributors
- @jamiew made their first contribution in #642
- @leifktaylor made their first contribution in #643
- @smwyzi made their first contribution in #648
Full Changelog: v0.1.9...v0.1.10
v0.1.9
The (Mini) Hackathon Update
The Open Interpreter Hackathon is on. To make OI easier to build on, we decided to add some developer features, such as exposing Open Procedures via `interpreter.procedures`.
This lets you use RAG (retrieval augmented generation) to teach Open Interpreter new things.
Learn more about these new developer features via this Colab Notebook.
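The retrieval step in RAG boils down to scoring stored procedure strings against the query and keeping the best matches. A toy sketch using keyword overlap (purely illustrative; this is not Open Interpreter's actual retrieval, and the procedure strings are invented):

```python
# Toy sketch of RAG's retrieval step: rank stored procedure strings by
# word overlap with the query and keep the top k. Illustrative only;
# not Open Interpreter's implementation.
def relevant_procedures(query: str, procedures: list[str], k: int = 2) -> list[str]:
    q = set(query.lower().split())
    scored = sorted(
        procedures,
        key=lambda p: len(q & set(p.lower().split())),
        reverse=True,
    )
    return scored[:k]

docs = [
    "To send an email on macOS, use AppleScript via osascript.",
    "To resize an image, use the Pillow library's Image.resize.",
    "To schedule a task on Linux, edit the crontab with crontab -e.",
]
top = relevant_procedures("resize an image with python", docs, k=1)
```

The retrieved strings are then prepended to the model's context, which is how new "procedures" teach the model without retraining.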
Full Changelog: v0.1.8...v0.1.9
The Local Update (Part I)
Open Interpreter's `--local` mode is now powered by Mistral 7B.

Significantly more architectures are supported locally via `ooba`, a headless Oobabooga wrapper.
What's Changed
- Fix bug when trying to use local non-CodeLlama model by @alexweberk in #571
- Update README_ZH.md by @orangeZSCB in #563
- chore: update test suite by @ericrallen in #594
- Fixed a bug in setup_text_llm.py by @kylehh in #560
- feat: add %tokens magic command that counts tokens via tiktoken by @ericrallen in #607
- feat: add support for loading different config.yaml files by @ericrallen in #609
- feat: add optional prompt token/cost estimate to %tokens by @ericrallen in #614
- Added powershell language by @DaveChini in #620
- Local Update by @KillianLucas in #625
New Contributors
- @alexweberk made their first contribution in #571
- @orangeZSCB made their first contribution in #563
- @kylehh made their first contribution in #560
- @DaveChini made their first contribution in #620
Full Changelog: v0.1.7...v0.1.8
v0.1.7
Generator Update (Quick Fixes II)
Particularly for Windows users and the new `--config` flag.

We also added @ericrallen's `--scan` flag, but this is not the official release for that. We'll direct attention to it in a subsequent release.
What's Changed
- Skip wrap_in_trap on Windows by @goalkeepr in #548
- fix: allow args to have choices and defaults by @ericrallen in #511
- feat: add semgrep code scanning via -safe argument by @ericrallen in #484
- fix: stop overwriting safe_mode config.yaml setting with default in args by @ericrallen in #554
New Contributors
- @goalkeepr made their first contribution in #548
Full Changelog: v0.1.6...v0.1.7
v0.1.6
Generator Update (Quick Fixes I)
What's Changed
- fix: stop overwriting boolean config values by @ericrallen in #508
- Update WINDOWS.md by @rsfutch77 in #523
- Fix ARM64 llama-cpp-python Install on Apple Silicon by @gavinmclelland in #505
- Broken empty message response by @blujus in #501
- fix crash on unknwon command on call to display help message by @mocy in #493
- Update get_relevant_procedures.py by @kubla in #492
New Contributors
- @ericrallen made their first contribution in #508
- @rsfutch77 made their first contribution in #523
- @gavinmclelland made their first contribution in #505
- @blujus made their first contribution in #501
- @mocy made their first contribution in #493
- @kubla made their first contribution in #492
Full Changelog: v0.1.5...v0.1.6
The Generator Update
Features
- Modular, generator-based foundation (rewrote entire codebase)
- Significantly easier to build Open Interpreter into your applications via `interpreter.chat(message)` (see JARVIS for an example implementation)
- Run `interpreter --config` to configure `interpreter` to run with any settings by default (set your default language model, system message, etc.)
- Run `interpreter --conversations` to resume conversations
- Budget manager (thank you LiteLLM!) via `interpreter --max_budget 0.1` (sets max budget per session in USD)
- Change the system message, temperature, max_tokens, etc. from the command line
- Central `/conversations` folder for persistent memory
- New hosted language models (thank you LiteLLM!) like Claude, Google PaLM, Cohere, and more.
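A budget manager's core job is simple bookkeeping: accumulate per-call cost and refuse further calls once the session cap is hit. A hypothetical stand-in (this is not LiteLLM's actual budget manager, and the cost numbers are invented):

```python
# Hypothetical sketch of a per-session budget cap like --max_budget 0.1.
# Not LiteLLM's BudgetManager; the per-call costs below are invented.
class SessionBudget:
    def __init__(self, max_budget_usd: float):
        self.max_budget = max_budget_usd
        self.spent = 0.0

    def record(self, cost_usd: float) -> None:
        """Add one model call's cost to the running total."""
        self.spent += cost_usd

    def exceeded(self) -> bool:
        """True once the session has spent its whole budget."""
        return self.spent >= self.max_budget

budget = SessionBudget(0.1)
budget.record(0.04)  # e.g. cost of one model call
budget.record(0.04)
```

In practice the per-call cost comes from the provider's token pricing; the cap just decides whether the next call is allowed.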
What's Changed
- Fix typo 'recieved'> 'received' by @merlinfrombelgium in #361
- Pull request template created by @TanmayDoesAI in #365
- docs: move pr template to .github folder by @jordanbtucker in #373
- chore: enhance .gitignore by @jordanbtucker in #374
- chore: add vscode debug support by @jordanbtucker in #375
- discard the / as command as it will block the Mac/Linux to load the file by @moming2k in #378
- Update interpreter.py for a typo error by @YUFEIFUT in #397
- Translated Open Interpreter README into Hindi by @zeelsheladiya in #417
- Add models to pull request template by @mak448a in #423
- Retry connecting to openai after hitting rate limit to fix #442 by @mathiasrw in #452
- Handle %load_message failure in interpreter.py by @richawo in #431
- add budget manager for api calls by @krrishdholakia in #316
- The Generator Update by @KillianLucas in #482
New Contributors
- @YUFEIFUT made their first contribution in #397
- @zeelsheladiya made their first contribution in #417
- @mak448a made their first contribution in #423
- @mathiasrw made their first contribution in #452
- @richawo made their first contribution in #431
- @krrishdholakia made their first contribution in #316
Full Changelog: v0.1.4...v0.1.5