
Frontier LLMs

24 models · local AI VRAM requirements & GPU compatibility

Frontier models push the boundary of open-weight AI capability. They typically require a multi-GPU server or extreme-tier workstation — but when they fit, they rival the best proprietary APIs. Check the compatible GPU list carefully; most of these require 80 GB+ of VRAM across multiple cards.
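The 80 GB+ figure follows from a common rule of thumb: VRAM needed is roughly parameter count times bytes per parameter (set by the quantization format), plus headroom for activations and the KV cache. A minimal sketch of that arithmetic, with illustrative byte counts and a hypothetical 20% overhead factor (these are assumptions, not figures from this site's calculator):

```python
# Rule-of-thumb VRAM estimate: params * bytes-per-param * overhead.
# Byte counts per quantization format and the 1.2x overhead factor
# are illustrative assumptions, not values used by canitrun.

BYTES_PER_PARAM = {
    "fp16": 2.0,  # half precision
    "q8": 1.0,    # 8-bit quantization
    "q4": 0.5,    # 4-bit quantization
}

def estimate_vram_gb(params_billions: float, quant: str = "fp16",
                     overhead: float = 1.2) -> float:
    """Estimate VRAM in GB needed to load and run a model."""
    return params_billions * BYTES_PER_PARAM[quant] * overhead

# A 70B-parameter model at fp16: 70 * 2.0 * 1.2 = 168 GB,
# well past any single 80 GB card, hence the multi-GPU requirement.
print(round(estimate_vram_gb(70, "fp16"), 1))
# The same model 4-bit quantized: 70 * 0.5 * 1.2 = 42 GB.
print(round(estimate_vram_gb(70, "q4"), 1))
```

This is why quantization matters so much at this tier: dropping from fp16 to 4-bit can bring a frontier model from a multi-GPU server down to a single high-end card.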

Want to check your specific GPU? Use the homepage calculator to see which of these models fit your hardware with estimated tokens per second.