Releases: OpenInterpreter/open-interpreter

Vision I (Quick Fixes II)

11 Nov 10:34
Pre-release
  • An issue with UNIX files has been resolved (#748)
  • Experimental support for Python in --vision mode has been added

Full Changelog: v0.1.13...v0.1.14

Vision I (Quick Fixes I)

11 Nov 06:16
Pre-release

Quick fix for --vision support on Windows. File paths should now be properly recognized and loaded into the model.

Full Changelog: v0.1.12...v0.1.13

Vision I

10 Nov 22:35
Pre-release

A quick one, a fun one. Added experimental vision support for OpenAI users.

interpreter --vision

Drag files or screenshots into your terminal to use it. It also supports reflection for HTML (it can see the designs it produces!).
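Under the hood, OpenAI's vision models accept images as base64-encoded data URLs inside a chat message. A minimal sketch of the general technique — turning a dragged-in file path into a vision message — assuming the standard chat-completions message shape (this is illustrative, not Open Interpreter's actual code):

```python
import base64
from pathlib import Path


def build_vision_message(prompt: str, image_path: str) -> dict:
    """Package a local image file into an OpenAI-style vision chat message.

    Illustrative sketch of the underlying technique, not Open
    Interpreter's implementation.
    """
    image_bytes = Path(image_path).read_bytes()
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {
                "type": "image_url",
                "image_url": {"url": f"data:image/png;base64,{b64}"},
            },
        ],
    }
```

The resulting dict can be appended to the `messages` list of a chat-completions request against a vision-capable model.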

Vision II will introduce support for reflective vision in many more languages.


Full Changelog: v0.1.11...v0.1.12

Local II Update

09 Nov 16:52
Pre-release
  • Local mode is now powered by LM Studio. Running --local will tell you how to set up LM Studio + connect to it automatically.
  • It's way smaller: we removed the MASSIVE local embedding model, chromadb, oobabooga, and a bunch of other packages we didn't really need. Semgrep is now optional.
  • The system message is tighter, so it's cheaper + faster on any LLM.
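LM Studio exposes an OpenAI-compatible local server (by default at http://localhost:1234/v1 — check your LM Studio settings). Open Interpreter wires this up for you with --local, but as a sketch of what that connection looks like, here is building (without sending) a chat request aimed at it, using only the standard library; the model name is a placeholder:

```python
import json
from urllib import request

# LM Studio's default local server address; verify in the LM Studio UI.
LMSTUDIO_BASE = "http://localhost:1234/v1"


def local_chat_request(messages: list, model: str = "local-model") -> request.Request:
    """Build (but do not send) an OpenAI-style chat request for LM Studio.

    Illustrative only; `interpreter --local` handles this for you.
    """
    payload = {"model": model, "messages": messages, "temperature": 0}
    return request.Request(
        f"{LMSTUDIO_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Because the server speaks the OpenAI wire format, any OpenAI client can be pointed at it by swapping the base URL.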

Several crashes have also been resolved, temperature is now properly set to 0 (which should increase performance on OpenAI models), PowerShell on Linux is now supported, an ugly print statement was removed, we're now enforcing a consistent code style (black, isort), and much more.


Full Changelog: v0.1.10...v0.1.11

Great work everyone!

v0.1.10

19 Oct 01:13
Pre-release

Bug fixes; pinned LiteLLM to prevent a printed-stream issue.


Full Changelog: v0.1.9...v0.1.10

v0.1.9

12 Oct 08:08
Pre-release

The (Mini) Hackathon Update


The Open Interpreter Hackathon is on. To make OI easier to build on, we decided to add some developer features, such as exposing Open Procedures via interpreter.procedures.

This lets you use RAG (retrieval augmented generation) to teach Open Interpreter new things.
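As a rough sketch of the idea (not the actual `interpreter.procedures` API): RAG here just means searching a collection of procedure texts for the ones relevant to the user's query and prepending the matches to the prompt. A toy keyword-overlap version, standing in for real embedding-based retrieval:

```python
def retrieve_procedures(query: str, procedures: list, top_k: int = 2) -> list:
    """Return the `top_k` procedures sharing the most words with the query.

    A toy stand-in for embedding-based retrieval, to illustrate how RAG
    can teach a model new skills at prompt time.
    """
    query_words = set(query.lower().split())
    scored = [
        (len(query_words & set(proc.lower().split())), proc)
        for proc in procedures
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    # Keep only procedures that matched at least one query word.
    return [proc for score, proc in scored[:top_k] if score > 0]
```

The retrieved procedures would then be injected into the system message before the model runs, which is how custom procedures "teach" it new things.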

Learn more about these new developer features via this Colab Notebook.


Full Changelog: v0.1.8...v0.1.9

The Local Update (Part I)

11 Oct 23:20
Pre-release

Open Interpreter's --local mode is now powered by Mistral 7B.

Significantly more architectures are now supported locally via ooba, a headless Oobabooga wrapper.


Full Changelog: v0.1.7...v0.1.8

v0.1.7

29 Sep 04:30
Pre-release

Generator Update (Quick Fixes II)

Particularly for Windows users and the new --config flag.

We also added @ericrallen's --scan flag, but this is not the official release for that; we'll direct attention to it in a subsequent release.


Full Changelog: v0.1.6...v0.1.7

v0.1.6

27 Sep 02:58
Pre-release

Generator Update (Quick Fixes I)


Full Changelog: v0.1.5...v0.1.6

The Generator Update

24 Sep 02:33
Pre-release


Features

  • Modular, generator-based foundation (rewrote entire codebase)
  • Significantly easier to build Open Interpreter into your applications via interpreter.chat(message) (see JARVIS for an example implementation)
  • Run interpreter --config to configure interpreter to run with any settings by default (set your default language model, system message, etc.)
  • Run interpreter --conversations to resume conversations
  • Budget manager (thank you LiteLLM!) via interpreter --max_budget 0.1 (sets max budget per session in USD)
  • Change the system message, temperature, max_tokens, etc. from the command line
  • Central /conversations folder for persistent memory
  • New hosted language models (thank you LiteLLM!) like Claude, Google PaLM, Cohere, and more.
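The budget manager is simple to sketch. This is an illustration of the idea behind `--max_budget 0.1` — accumulate the estimated cost of each call and refuse new calls once the session budget is exhausted — not LiteLLM's actual implementation:

```python
class SessionBudget:
    """Track per-session spend and raise once a USD budget would be exceeded.

    Illustrative sketch of the `--max_budget` concept; the real cost
    accounting is handled by LiteLLM.
    """

    def __init__(self, max_budget: float):
        self.max_budget = max_budget
        self.spent = 0.0

    def charge(self, cost: float) -> None:
        """Record a call's cost, or raise if it would blow the budget."""
        if self.spent + cost > self.max_budget:
            raise RuntimeError(
                f"Budget exceeded: ${self.spent + cost:.4f} > ${self.max_budget:.2f}"
            )
        self.spent += cost
```

Each model call's estimated cost would be passed to `charge` before the request is made, so an over-budget session fails fast instead of silently overspending.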


Full Changelog: v0.1.4...v0.1.5