Changing the text in the Azure notebook from Llama 2 to Llama. #478

Draft · wants to merge 1 commit into base: main
@@ -4,11 +4,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"# Use Azure API with Llama 2\n",
+"# Use Azure API with Llama\n",
"\n",
-"This notebook shows examples of how to use Llama 2 APIs offered by Microsoft Azure. We will cover: \n",
-"* HTTP requests API usage for Llama 2 pretrained and chat models in CLI\n",
-"* HTTP requests API usage for Llama 2 pretrained and chat models in Python\n",
+"This notebook shows examples of how to use Llama APIs offered by Microsoft Azure. We will cover: \n",
+"* HTTP requests API usage for Llama pretrained and chat models in CLI\n",
+"* HTTP requests API usage for Llama pretrained and chat models in Python\n",
"* Plug the APIs into LangChain\n",
"* Wire the model with Gradio to build a simple chatbot with memory\n",
"\n"
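The HTTP-requests usage the notebook's intro promises can be sketched in plain Python before the diff continues. This is a minimal illustration only: the endpoint URL, API key, and payload field names below are placeholders, not the notebook's actual values, and the request is assembled but never sent.

```python
import json
import urllib.request

# Hypothetical endpoint and key; real values come from your Azure deployment.
ENDPOINT_URL = "https://example-endpoint.inference.ai.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

def build_chat_request(messages, max_tokens=128, temperature=0.7):
    """Assemble (but do not send) the HTTP request for a chat completion."""
    body = json.dumps({
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )

req = build_chat_request([{"role": "user", "content": "Hello"}])
print(req.get_method())  # POST, because a data payload is attached
```

Sending it would be `urllib.request.urlopen(req)`; the CLI variant in the notebook does the same thing with `curl`.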
@@ -20,7 +20,7 @@
"source": [
"## Prerequisite\n",
"\n",
-"Before we start building with Azure Llama 2 APIs, there are certain steps we need to take to deploy the models:\n",
+"Before we start building with Azure Llama APIs, there are certain steps we need to take to deploy the models:\n",
"\n",
"* Register for a valid Azure account with subscription [here](https://azure.microsoft.com/en-us/free/search/?ef_id=_k_CjwKCAiA-P-rBhBEEiwAQEXhH5OHAJLhzzcNsuxwpa5c9EJFcuAjeh6EvZw4afirjbWXXWkiZXmU2hoC5GoQAvD_BwE_k_&OCID=AIDcmm5edswduu_SEM__k_CjwKCAiA-P-rBhBEEiwAQEXhH5OHAJLhzzcNsuxwpa5c9EJFcuAjeh6EvZw4afirjbWXXWkiZXmU2hoC5GoQAvD_BwE_k_&gad_source=1&gclid=CjwKCAiA-P-rBhBEEiwAQEXhH5OHAJLhzzcNsuxwpa5c9EJFcuAjeh6EvZw4afirjbWXXWkiZXmU2hoC5GoQAvD_BwE)\n",
"* Take a quick look on what is the [Azure AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/what-is-ai-studio?tabs=home) and navigate to the website from the link in the article\n",
@@ -147,7 +147,7 @@
"source": [
"### Content Safety Filtering\n",
"\n",
-"All Azure Llama 2 API endpoints have content safety feature turned on. Both input prompt and output tokens are filtered by this service automatically. \n",
+"All Azure Llama API endpoints have content safety feature turned on. Both input prompt and output tokens are filtered by this service automatically. \n",
"To know more about the impact to the request/response payload, please refer to official guide [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/content-filter?tabs=python). \n",
"\n",
"For model input and output, if the filter detects there is harmful content, the generation will error out with a response payload containing the reasoning, along with information on the type of content violation and its severity. \n",
@@ -323,9 +323,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"## Use Llama 2 API with LangChain\n",
+"## Use Llama API with LangChain\n",
"\n",
-"In this section, we will demonstrate how to use Llama 2 APIs with LangChain, one of the most popular framework to accelerate building your AI product. \n",
+"In this section, we will demonstrate how to use Llama APIs with LangChain, one of the most popular framework to accelerate building your AI product. \n",
"One common solution here is to create your customized LLM instance, so you can add it to various chains to complete different tasks. \n",
"In this example, we will use the `AzureMLOnlineEndpoint` class LangChain provides to build a customized LLM instance. This particular class is designed to take in Azure endpoint and API keys as inputs and wire it with HTTP calls. So the underlying of it is very similar to how we used `urllib.request` library to send RESTful calls in previous examples to the Azure Endpoint. \n",
"\n",
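The content-formatter idea described above can be sketched without LangChain installed: a plain class exposing the two methods that `AzureMLOnlineEndpoint`'s `ContentFormatterBase` interface is built around. The payload field names (`prompt`, `parameters`, `output`) are illustrative assumptions, not the Azure API's authoritative schema.

```python
import json
from typing import Dict

class LlamaContentFormatter:
    """Sketch of the formatter shape LangChain's AzureMLOnlineEndpoint
    expects; payload field names here are illustrative, not authoritative."""

    def format_request_payload(self, prompt: str, model_kwargs: Dict) -> bytes:
        # Wrap the prompt and generation parameters into the JSON request body.
        return json.dumps({"prompt": prompt, "parameters": model_kwargs}).encode("utf-8")

    def format_response_payload(self, output: bytes) -> str:
        # Pull the generated text back out of the JSON response body.
        return json.loads(output)["output"]

formatter = LlamaContentFormatter()
body = formatter.format_request_payload("Hello", {"temperature": 0.7})
text = formatter.format_response_payload(b'{"output": "Hi there"}')
```

In the notebook itself, an equivalent class subclasses `ContentFormatterBase` and is passed to `AzureMLOnlineEndpoint` along with the endpoint URL and API key.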
@@ -363,7 +363,7 @@
"\n",
"\n",
"class AzureLlamaAPIContentFormatter(ContentFormatterBase):\n",
-"#Content formatter for Llama 2 API for Azure MaaS\n",
+"#Content formatter for Llama API for Azure MaaS\n",
"\n",
" def format_request_payload(self, prompt: str, model_kwargs: Dict) -> bytes:\n",
" #Formats the request according to the chosen api\n",
@@ -450,18 +450,18 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"At the time of writing this sample notebook, LangChain doesn't support streaming with `AzureMLOnlineEndpoint` for Llama 2. We are working with LangChain and Azure team to implement that."
+"At the time of writing this sample notebook, LangChain doesn't support streaming with `AzureMLOnlineEndpoint` for Llama. We are working with LangChain and Azure team to implement that."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
-"## Build a chatbot with Llama 2 API\n",
+"## Build a chatbot with Llama API\n",
"\n",
-"In this section, we will build a simple chatbot using Azure Llama 2 API, LangChain and [Gradio](https://www.gradio.app/)'s `ChatInterface` with memory capability.\n",
+"In this section, we will build a simple chatbot using Azure Llama API, LangChain and [Gradio](https://www.gradio.app/)'s `ChatInterface` with memory capability.\n",
"\n",
-"Gradio is a framework to help demo your machine learning model with a web interface. We also have a dedicated Gradio chatbot [example](https://github.com/meta-llama/llama-recipes/blob/main/recipes/use_cases/chatbots/RAG_chatbot/RAG_Chatbot_Example.ipynb) built with Llama 2 on-premises with RAG. \n",
+"Gradio is a framework to help demo your machine learning model with a web interface. We also have a dedicated Gradio chatbot [example](https://github.com/meta-llama/llama-recipes/blob/main/recipes/use_cases/chatbots/RAG_chatbot/RAG_Chatbot_Example.ipynb) built with Llama on-premises with RAG. \n",
"\n",
"First, let's install Gradio dependencies.\n"
]
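The memory capability mentioned above can be sketched independently of Gradio as a function with the `(message, history)` signature that `ChatInterface` calls, folding prior turns back into the prompt. This assumes the older tuple-style history format (a list of user/assistant pairs), and the `User:`/`Assistant:` turn labels are illustrative, not the notebook's exact prompt template.

```python
def predict(message, history):
    """Build a single prompt that carries the conversation memory.

    `history` is a list of (user, assistant) pairs, as older Gradio
    ChatInterface versions pass it; each pair becomes two prompt lines.
    """
    lines = []
    for user_turn, bot_turn in history:
        lines.append(f"User: {user_turn}")
        lines.append(f"Assistant: {bot_turn}")
    lines.append(f"User: {message}")
    lines.append("Assistant:")
    return "\n".join(lines)

prompt = predict("What about Tuesday?", [("Is Monday free?", "Yes.")])
```

In the notebook, the body of this function sends the assembled prompt to the Azure endpoint via the LangChain LLM instance and returns the reply; `gr.ChatInterface(predict).launch()` then serves the chatbot.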
@@ -508,7 +508,7 @@
"langchain.debug=True\n",
"\n",
"class AzureLlamaAPIContentFormatter(ContentFormatterBase):\n",
-"#Content formatter for Llama 2 API for Azure MaaS\n",
+"#Content formatter for Llama API for Azure MaaS\n",
"\n",
" def format_request_payload(self, prompt: str, model_kwargs: Dict) -> bytes:\n",
" #Formats the request according to the chosen api\n",