Chat with AI large language models running natively in your browser. Enjoy private, server-free, seamless AI conversations.
Updated Jun 11, 2024 - TypeScript
Unify Efficient Fine-Tuning of 100+ LLMs
Chat locally using leading open models built by the community, optimized and accelerated by NVIDIA's enterprise-ready inference runtime
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers, and many more model architectures. Generates text, audio, video, and images, with voice-cloning capabilities.
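Drop-in compatibility means a client can target such a server with the same request shape the OpenAI Chat Completions API uses. A minimal sketch of that request body follows; the model name "mistral" is an assumption (any locally loaded model name would work), and the function name is illustrative, not part of any listed project.

```python
# Sketch: build the JSON body an OpenAI-compatible server
# (e.g. a self-hosted drop-in replacement) expects at
# POST /v1/chat/completions. The model name "mistral" is an
# assumption; substitute whatever model the server has loaded.
import json


def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble a chat-completion request body in the OpenAI wire format."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


# Serialize for an HTTP POST to the local endpoint.
payload = json.dumps(build_chat_request("mistral", "Hello!"))
```

Because only the base URL differs from the hosted OpenAI API, existing OpenAI client libraries can usually be pointed at the local server unchanged.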
On-device LLM Inference Powered by X-Bit Quantization
🤘 TT-NN operator library and TT-Metalium low-level kernel programming model.
Python SDK for agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks like CrewAI, Langchain, and Autogen
A NodeJS RAG framework to easily work with LLMs and embeddings
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Using Langchain, HuggingFace, and Mistral to create a RAG that can analyze PDFs from the Enedis documentation website and answer questions about them. Ongoing project.
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
🤯 Lobe Chat - an open-source, modern-design ChatGPT/LLMs UI/Chat Framework. Supports speech-synthesis, multi-modal, and extensible plugin system. One-click FREE deployment of your private ChatGPT/Gemini/Ollama chat application.
All-in-one AI CLI tool that integrates 20+ AI platforms, including OpenAI, Azure-OpenAI, Gemini, Claude, Mistral, Cohere, VertexAI, Bedrock, Ollama, Ernie, Qianwen, Deepseek...
[ACL 2024] An Easy-to-use Knowledge Editing Framework for LLMs.
Deploy serverless LLM from Azure marketplace using bicep.