CanItRun

AI Apps & Agents Compatibility

Find AI apps, coding agents, chat frontends, and local LLM tools that work with your hardware. Compare local support, OpenRouter compatibility, recommended models, VRAM requirements, and setup difficulty.

This directory covers 20 AI applications — from coding agents like Cline and Aider to chat frontends like Open WebUI and SillyTavern, to local inference engines like Ollama and llama.cpp. For each app, we tell you whether it runs locally, which models work best, what GPU you need, and whether it supports OpenRouter, Ollama, or OpenAI-compatible APIs.

Unlike OpenRouter's token-usage rankings, we focus on hardware compatibility: can your GPU run this app with the models it needs? Use the homepage calculator to check specific GPU-model combinations, or browse by category below.
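As a rough sketch of the kind of arithmetic such a calculator does, here is a back-of-the-envelope VRAM estimate: quantized weights plus KV cache plus runtime overhead. The defaults are assumptions, not measured values: ~4.5 bits/weight for a Q4_K_M quant, ~0.5 MB of KV cache per token for a 7B-class model at fp16, and 1 GB of fixed overhead.

```python
def estimate_vram_gb(params_billions: float,
                     bits_per_weight: float = 4.5,   # assumed: Q4_K_M averages ~4.5 bpw
                     context_tokens: int = 8192,
                     kv_mb_per_token: float = 0.5,   # assumed: ~0.5 MB/token, 7B-class, fp16 KV
                     overhead_gb: float = 1.0) -> float:
    """Back-of-the-envelope VRAM estimate: weights + KV cache + overhead."""
    weights_gb = params_billions * bits_per_weight / 8
    kv_gb = context_tokens * kv_mb_per_token / 1024
    return round(weights_gb + kv_gb + overhead_gb, 1)

# A 7B model at Q4_K_M with an 8K context lands just under 9 GB,
# which is why 8 GB cards often force a shorter context or lower quant.
print(estimate_vram_gb(7))
```

Real requirements vary by architecture (GQA models use far less KV cache per token), so treat this as a sanity check, not a guarantee.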

Aider

AI pair programming in your terminal. The most local-model-friendly coding agent with a tiny ~2K token system prompt and deep git integration.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

AnythingLLM

All-in-one local AI workspace with best-in-class RAG. Upload documents, chat with your files, build no-code AI agents, and connect 30+ LLM providers.

Hybrid (local + cloud) · Runs locally · Self-hosted · OpenRouter · Ollama

View compatibility →

Claude Code

Anthropic's official agentic coding CLI. Reads your entire codebase, plans and executes changes across files, runs tests, and iterates on failures.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

Cline

Open-source AI coding agent for VS Code. Autonomously explores your codebase, edits files, runs terminal commands, and uses browser automation.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

Continue

Open-source AI code assistant for VS Code and JetBrains. Tab autocomplete, chat, and agent mode with separate models per role — like a local Copilot.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

Cursor

AI-native IDE built on VS Code. Integrated tab completion, inline editing, chat, and agent mode — all in one editor. Cloud-only, no local model support.

Cloud/API only

View compatibility →

HuggingChat

Free web chat interface from Hugging Face. Access open-weight models instantly with no setup — runs entirely in the cloud.

Cloud/API only

View compatibility →

Janitor AI

Cloud-based AI character roleplay platform. Largest character library, OpenRouter proxy support, and a massive 500K+ Discord community.

Cloud/API only · Bring your own key · OpenRouter

View compatibility →

Kilo Code

Open-source AI coding agent for VS Code, JetBrains, and CLI. The only Cline-family agent with JetBrains support. Claims to be the highest-volume consumer app on OpenRouter.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

KoboldCPP

Single-binary local inference for roleplay and storytelling. GGUF models, zero install, bundled KoboldAI Lite UI. The community go-to for AI storytelling.

Runs locally

View compatibility →

LibreChat

Enterprise self-hosted ChatGPT clone with 30+ AI providers. Multi-user admin panel, OAuth2 SSO, artifacts, code interpreter, and MCP support.

Hybrid (local + cloud) · Self-hosted · OpenRouter · Ollama

View compatibility →

llama.cpp

The engine underneath Ollama — but faster. Full control over quants, context, and grammars. Grammar file support enables GPT-OSS tool calling in Cline.

Runs locally

View compatibility →

LM Studio

Desktop app for running local LLMs with zero setup. In-app model browser, visual GPU fit indicator, and one-click GGUF downloads from Hugging Face.

Runs locally

View compatibility →

Ollama

The industry standard for running LLMs locally. Simple CLI, massive model library (100K+), OpenAI-compatible API on port 11434. Powers Open WebUI, Continue, and more.

Runs locally · Ollama

View compatibility →
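Since Ollama serves an OpenAI-compatible API on port 11434 (as noted above), any OpenAI-style client can talk to it. A minimal sketch of building such a request, assuming a model named "llama3.1" is installed (substitute whatever `ollama list` shows on your machine):

```python
import json

# Ollama's OpenAI-compatible base URL; port 11434 is its default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Construct an OpenAI-style chat-completions payload for Ollama."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # set True for token-by-token streaming
    }

payload = build_chat_request("llama3.1", "Hello!")
print(json.dumps(payload, indent=2))
# POST this to OLLAMA_BASE_URL + "/chat/completions" with any OpenAI client.
```

This same base URL is what frontends like Open WebUI and Continue point at when you select Ollama as a provider.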

Open WebUI

Self-hosted ChatGPT-like web UI for LLMs. Native Ollama integration, RAG document Q&A, multi-user support, and OpenRouter compatibility.

Runs locally · Self-hosted · Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

Roo Code

Open-source multi-agent AI coding extension for VS Code. Customizable modes for coding, architecture, debugging, and review — with a smaller system prompt than Cline.

Hybrid (local + cloud) · OpenRouter · Ollama

View compatibility →

SillyTavern

Self-hosted chat interface for AI roleplay and creative writing. Deep character creation, lorebooks, group chats, and first-class OpenRouter support.

Hybrid (local + cloud) · Runs locally · Cloud/API only · OpenRouter · Ollama

View compatibility →

text-generation-webui

Power-user local LLM frontend with maximum backend flexibility. Transformers, ExLlamaV2/V3, llama.cpp, GPTQ, AWQ — all in one web UI.

Runs locally

View compatibility →

vLLM

Production-grade LLM serving engine. PagedAttention for efficient KV cache, high throughput, multi-user API serving. For deployments, not single-user chat.

Runs locally · Self-hosted

View compatibility →

Zed

High-performance code editor with AI features via API keys. Bring your own Anthropic or OpenAI key for inline code generation and chat.

Bring your own key · Cloud/API only

View compatibility →