Local & Open Source AI Tools
Run AI models locally on your hardware. Privacy-first, offline-capable, open-source tools for self-hosted inference.
Ollama
Run LLMs locally with one command — the easiest way to get AI running on your machine.
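Beyond the CLI, Ollama exposes a small REST API on localhost, so scripts can talk to it without any SDK. A minimal sketch using only the Python standard library, assuming the default port (11434) and a model you have already pulled — the name `llama3` is just an example:

```python
import json
import urllib.request

# Default Ollama endpoint; the daemon listens on port 11434.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming completion request for Ollama's REST API."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

With a model pulled (`ollama pull llama3`), `generate("llama3", "Why is the sky blue?")` returns the completion as one string; omitting `"stream": False` makes the server stream newline-delimited JSON chunks instead.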
llama.cpp
The C/C++ engine powering local AI — lightning-fast inference that Ollama and LM Studio build on.
LM Studio
Beautiful desktop app for running LLMs locally — discover, download, and chat with AI models.
Open WebUI
Self-hosted ChatGPT-style interface for Ollama and OpenAI-compatible APIs.
vLLM
High-throughput LLM serving engine — the production standard for GPU inference at scale.
Jan
Open-source ChatGPT alternative that runs 100% offline on your computer.
text-generation-webui
The Swiss Army knife of local AI — Gradio interface supporting a wide range of model formats and backends, including Transformers, GGUF via llama.cpp, and ExLlamaV2.
GPT4All
Free, local, privacy-aware AI — run chatbots on consumer hardware with no GPU required.
LocalAI
Self-hosted OpenAI-compatible API — drop-in replacement for cloud AI in your infrastructure.
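Because LocalAI implements the OpenAI chat-completions API, existing client code can be repointed at it by swapping the base URL. A sketch using only the Python standard library, assuming LocalAI's default port (8080); the model names are placeholders for whatever you have installed:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, message: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for any compatible server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def chat(base_url: str, model: str, message: str) -> str:
    """POST the request and return the assistant's reply."""
    with urllib.request.urlopen(build_chat_request(base_url, model, message)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# The same client code works against any OpenAI-compatible backend:
#   chat("http://localhost:8080/v1", "llama-3-8b-instruct", "Hello")   # LocalAI
#   chat("http://localhost:11434/v1", "llama3", "Hello")               # Ollama
```

This base-URL swap is the whole "drop-in replacement" story: the request and response shapes are identical, so nothing else in the client changes.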
KoboldAI
AI-powered creative writing suite — the go-to tool for interactive fiction and storytelling.