Use Google Gemma with Gradio Chat
Updated Feb 24, 2024 - Python
Bangla News Summarization with Gemma-7b (Instruct)
Gemma is a family of lightweight, state-of-the-art open models built from the same research and technology used to create the Gemini models.
The Gemma-2b-it LLM has been fine-tuned on a dataset of Python code, enabling it to handle Python syntax proficiently and assist with debugging tasks, offering valuable guidance to programmers.
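Gemma's instruction-tuned checkpoints (such as gemma-2b-it) expect prompts wrapped in Gemma's chat-turn markup. As a minimal sketch, independent of any one repository's code, a single-turn debugging prompt can be assembled like this:

```python
def build_gemma_prompt(user_message: str) -> str:
    """Format a single-turn prompt using Gemma's instruction-tuned
    chat markup (<start_of_turn>/<end_of_turn> control tokens)."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# Example: a debugging question formatted for the model.
prompt = build_gemma_prompt("Why does range(5) stop at 4?")
print(prompt)
```

In practice you would pass this string (or use the tokenizer's built-in chat template, which produces the same markup) to the model for generation; the helper name here is illustrative.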
Generative AI take on the classic Hangman game
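That repository's implementation isn't reproduced here, but the core turn logic a generative-AI Hangman wraps an LLM around can be sketched in a few lines (the function name and signature are illustrative):

```python
def apply_guess(secret: str, revealed: set, lives: int, guess: str):
    """Apply one Hangman guess: reveal matching letters,
    or deduct a life on a miss."""
    if guess in secret:
        revealed.add(guess)
    else:
        lives -= 1
    masked = " ".join(c if c in revealed else "_" for c in secret)
    won = all(c in revealed for c in secret)
    return masked, lives, won

# One correct guess reveals every matching position at once.
masked, lives, won = apply_guess("gemma", set(), 6, "m")
print(masked, lives, won)  # → "_ _ m m _" 6 False
```

A generative model can then slot in on either side of this loop: picking the secret word, or playing the guesser by proposing the next letter from the masked state.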
[KGC '24] This application visualises knowledge graphs. We employ a novel technique that uses an LLM-based agent for triple extraction from unstructured text. The approach was also accepted at Text2KG 2024 (ESWC), where it carries an improved prompting strategy; this tool's backend can be considered an extension of that work.
Explore the power of the Gemma model with GemGPT, a project leveraging AI for innovative solutions. Join us in shaping the future of AI!
Nano versions of generative models, for fun. No SOTA here; nano first.
API specifications for Dutch municipalities, for registering and exposing the processing activities a municipality carries out when handling personal data.
Neuron is a conversational AI model using the Gemma LLM by Google from Hugging Face. It is designed to engage in a variety of topics and provide information on a wide range of subjects. With its ability to learn and adapt, this chatbot can provide a unique and engaging experience.