Repo accompanying our paper "Do Llamas Work in English? On the Latent Language of Multilingual Transformers".

Logit lens plot Colab

Try out your own prompts here.
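If you prefer to experiment locally, here is a minimal logit-lens sketch using transformers, assuming a Llama-2-style checkpoint (the Colab implements the paper's actual plotting pipeline; the model name and prompt below are illustrative, not taken from this repo):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # illustrative; requires Llama-2 access
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, output_hidden_states=True)
model.eval()

prompt = 'Français: "fleur" - 中文: "'  # illustrative translation-style prompt
inputs = tok(prompt, return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# Logit lens: project each layer's last-position hidden state through the
# model's final RMSNorm and unembedding matrix, then print the top token.
# (model.model.norm and model.lm_head are the transformers Llama attributes.)
for layer, h in enumerate(out.hidden_states):
    logits = model.lm_head(model.model.norm(h[:, -1]))
    print(layer, tok.decode(logits.argmax(-1).tolist()))
```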

Installation

Set up a Python environment and install the dependencies:

pip install -r requirements.txt

Usage

Translation

Run the translation notebook with papermill, passing the input and target languages as parameters (for example, French to Chinese):

papermill Translation.ipynb out.ipynb -p input_lang fr -p target_lang zh

Cloze

Run the cloze notebook with the target language as a parameter:

papermill Cloze.ipynb out.ipynb -p target_lang fr
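Both notebooks can also be run programmatically via papermill's Python API; a minimal sketch, equivalent to the translation command above:

```python
import papermill as pm

# Execute the translation notebook for French -> Chinese; the executed
# notebook, including cell outputs, is written to out.ipynb.
pm.execute_notebook(
    "Translation.ipynb",
    "out.ipynb",
    parameters={"input_lang": "fr", "target_lang": "zh"},
)
```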

Precomputed latents

For your convenience, we also provide some precomputed latents on Hugging Face, along with some preliminary steering experiments that use them.
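A minimal sketch of fetching a latent file from the Hub with huggingface_hub; the repo id and filename below are placeholders, not the actual dataset coordinates:

```python
import torch
from huggingface_hub import hf_hub_download

# Placeholder repo id and filename -- substitute the dataset linked above.
path = hf_hub_download(
    repo_id="epfl-dlab/llm-latent-language",
    filename="latents.pt",
    repo_type="dataset",
)
latents = torch.load(path)  # assumes a PyTorch-serialized file
```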

Acknowledgements

The starting point of this repo was Nina Rimsky's Llama-2 wrapper.

Citation

@article{wendler2024llamas,
  title={Do Llamas Work in English? On the Latent Language of Multilingual Transformers},
  author={Wendler, Chris and Veselovsky, Veniamin and Monea, Giovanni and West, Robert},
  journal={arXiv preprint arXiv:2402.10588},
  year={2024}
}
