Self-Hosted Apps
3 apps · local AI compatibility & hardware requirements
Self-hosted AI apps run on your own infrastructure — your hardware, your data, your control. These range from simple single-user desktop apps to multi-user Docker deployments with admin panels and SSO. Running self-hosted means you are not sending data to third-party servers, which is important for privacy-sensitive use cases.
- **AnythingLLM**: All-in-one local AI workspace with best-in-class RAG. Upload documents, chat with your files, build no-code AI agents, and connect 30+ LLM providers. (Runs locally · OpenRouter)
- **LibreChat**: Enterprise self-hosted ChatGPT clone with 30+ AI providers. Multi-user admin panel, OAuth2 SSO, artifacts, code interpreter, and MCP support. (OpenRouter)
- **Open WebUI**: Self-hosted ChatGPT-like web UI for LLMs. Native Ollama integration, RAG document Q&A, multi-user support, and OpenRouter compatibility. (Runs locally · OpenRouter)
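As a concrete example of the Docker-based deployment pattern these apps share, here is a minimal Compose sketch for Open WebUI. The image name and data path follow Open WebUI's published Docker instructions; the host port (3000) and volume name are illustrative choices you can change:

```yaml
# Minimal sketch: single-container Open WebUI deployment.
# Chat history and uploaded documents persist in the named volume.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"          # host port 3000 -> container port 8080
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui:
```

After `docker compose up -d`, the UI is reachable at `http://localhost:3000`; connecting it to a local Ollama instance or an OpenRouter API key is done from the app's settings.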
Want to check if your GPU can run the models these apps need? Use the homepage calculator to see which models fit your hardware with estimated tokens per second.
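If you want a quick back-of-the-envelope check before using the calculator, a common rule of thumb is: VRAM for weights is parameter count times bytes per weight (so a 4-bit quantized model needs about half a gigabyte per billion parameters), plus overhead for the KV cache and activations. This sketch assumes a flat ~20% overhead, which is a simplification — real usage varies with context length and runtime:

```python
def estimate_vram_gb(params_billions: float,
                     bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate for running an LLM.

    params_billions: model size in billions of parameters
    bits_per_weight: quantization level (16 = fp16, 4 = typical GGUF quant)
    overhead: multiplier for KV cache / activations (assumed ~20%)
    """
    weights_gb = params_billions * bits_per_weight / 8  # GB for the weights alone
    return round(weights_gb * overhead, 1)

# A 7B model at 4-bit quantization fits comfortably in 8 GB of VRAM:
print(estimate_vram_gb(7))        # ~4.2 GB
# The same model at fp16 needs much more:
print(estimate_vram_gb(7, 16))    # ~16.8 GB
```

Estimates like this only tell you whether a model *fits*; actual tokens per second depend on memory bandwidth and the inference backend, which is what the homepage calculator accounts for.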