CanItRun

Chat Frontends

7 apps · local AI compatibility & hardware requirements

Chat frontends are the user interface layer for LLMs. They do not run models themselves — they connect to local backends (Ollama, LM Studio) or cloud APIs (OpenRouter, Anthropic, OpenAI). This means most chat frontends have zero GPU requirements of their own. The GPU requirement comes entirely from the model you choose to connect to.
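To illustrate the split between frontend and backend, here is a minimal sketch of what a chat frontend actually does: it packages your message as an HTTP request and hands it to a backend that performs the inference. This assumes Ollama's default local endpoint and `/api/chat` route; the model name "llama3" is a placeholder.

```python
import json
import urllib.request

# Assumed: Ollama's default local endpoint and /api/chat route.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Package one chat turn as an HTTP request.

    The frontend does no inference itself; it only shapes this payload
    and renders whatever the backend returns.
    """
    payload = json.dumps({"model": model, "messages": messages, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", [{"role": "user", "content": "Hello!"}])
# Actually sending it (urllib.request.urlopen(req)) requires a running
# Ollama server — and that is where the GPU requirement lives.
```

Because the frontend only builds and displays requests like this, it runs comfortably on any machine; swapping the URL for a cloud API endpoint moves the hardware requirement off your machine entirely.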

Want to check if your GPU can run the models these apps need? Use the homepage calculator to see which models fit your hardware with estimated tokens per second.