JanAI 17.8k
Open-source ChatGPT alternative that runs 100% offline on your computer.
ExLlama 3.0k
A fast inference library for running LLMs locally on modern consumer-class GPUs.
ShellGPT 8.3k
A command-line productivity tool powered by AI large language models like GPT-4 that helps you accomplish your tasks faster and more efficiently.
Anything LLM 12.5k
Open-source ChatGPT-equivalent experience for both open- and closed-source LLMs, embedders, and vector databases.
Dialoqbase 1.4k
Dialoqbase is an open-source application designed to facilitate the creation of custom chatbots using a personalized knowledge base.
Docs GPT 14.2k
DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in project documentation.
gpt4all 64.8k
Open-source large language models that run locally on your CPU and nearly any GPU.
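For a quick local test, gpt4all ships Python bindings; the sketch below is a minimal example assuming the gpt4all package is installed, with the model filename only a placeholder (it is downloaded on first use if not already cached).

    from gpt4all import GPT4All

    # Model filename is an example; gpt4all fetches it on first use if it is not cached locally.
    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

    with model.chat_session():
        # Generation runs entirely on local hardware (CPU by default).
        print(model.generate("Explain what a GGUF model file is.", max_tokens=200))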
Kobold cpp Rocm 276
AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI and AMD ROCm offloading.
Kobold cpp 3.8k
KoboldCpp is an easy-to-use AI text-generation software for GGML and GGUF models.
Llama2 WebUI 1.9k
Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac).
LM Studio 741
LM Studio is a desktop application for discovering and running Large Language Models locally.
Local AI 524
Local AI is a desktop app for local, private, secure AI experimentation.
Maid 68
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama models remotely.
Mudler LocalAGI 310
LocalAGI is a small virtual assistant that you can run locally, made by the LocalAI author and powered by it.
Mudler LocalAI 19.9k
Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required.
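Because LocalAI exposes an OpenAI-compatible API, existing OpenAI client code can usually be pointed at it unchanged. A minimal sketch, assuming LocalAI is running on localhost:8080 and serving a model under the name shown (both are placeholders for illustration):

    from openai import OpenAI

    # Endpoint and model name are assumptions; adjust them to your LocalAI setup.
    client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",  # LocalAI maps this name to whatever local model it is configured with
        messages=[{"role": "user", "content": "Summarize what LocalAI does."}],
    )
    print(reply.choices[0].message.content)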
Ollama 63.3k
Ollama is an LLM backend that lets you get up and running with large language models locally.
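Ollama also serves a small local REST API. A minimal sketch, assuming the Ollama daemon is running on its default port 11434 and a model tagged llama3 has already been pulled:

    import requests

    # Port, endpoint, and model tag are the common defaults; adjust for your installation.
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": "Why run LLMs locally?", "stream": False},
    )
    print(response.json()["response"])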
Serge 5.5k
Serge is a chat interface crafted with llama.cpp for running GGUF models. No API keys, entirely self-hosted.
Text Gen WebUI 36.5k
Oobabooga Text Generation WebUI is a Gradio browser interface for Large Language Models. It supports Transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
Llama.cpp 57.4k
The main goal of llama.cpp is to run the LLaMA model using 4-bit integer quantization on a MacBook.
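For driving a quantized GGUF model from Python, one option is the llama-cpp-python bindings (a separate wrapper project around llama.cpp). A minimal sketch, with the model path as a placeholder:

    from llama_cpp import Llama

    # The path points at a 4-bit quantized GGUF file; this filename is only an example.
    llm = Llama(model_path="./models/llama-2-7b.Q4_K_M.gguf", n_ctx=2048)

    output = llm("Q: What does 4-bit quantization trade off? A:", max_tokens=64)
    print(output["choices"][0]["text"])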