Synopsis
Description
The llmfit command launches an interactive Terminal User Interface (TUI) by default. The TUI provides a visual interface for exploring LLM models that fit your system's hardware specifications.
You can use global flags to customize the behavior or switch to classic CLI table output mode.
Global Flags
These flags work across all subcommands:
- Output results as JSON for tool integration.
- Override GPU VRAM size (e.g., "32G", "32000M", "1.5T"). Useful when GPU memory autodetection fails.
- Cap the context length used for memory estimation (in tokens). Must be >= 1. Falls back to the OLLAMA_CONTEXT_LENGTH environment variable if not set.
TUI Mode Flags
These flags apply when running the default TUI:
- Use classic CLI table output instead of the TUI (--cli). Shows the fit results in a table format and exits.
- Show only models that perfectly match recommended specs.
- Limit the number of results displayed.
- Sort column for CLI fit output. Options:
  - score - Composite ranking score (default)
  - tps - Estimated tokens/second (aliases: tokens, toks, throughput)
  - params - Model parameter count
  - mem - Memory utilization percentage (aliases: memory, mem_pct, utilization)
  - ctx - Context window length (alias: context)
  - date - Release date, newest first (aliases: release, released)
  - use - Use-case grouping (aliases: use_case, usecase)
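The sort keys above can be combined with CLI table mode. The flag spelling `--sort` below is an assumption (the flag name is not given in this document); check `llmfit --help` for the exact form:

```shell
# Hypothetical --sort flag; sorts the CLI fit table by estimated tokens/second.
# "tps" (or its aliases tokens/toks/throughput) is one of the sort keys listed above.
llmfit --cli --sort tps
```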
Usage Examples
Launch TUI (Default)
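Running the bare command starts the interactive TUI:

```shell
# Launch the default interactive TUI
llmfit
```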
CLI Table Output
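The `--cli` flag (shown in the Example Output section below) switches to classic table output and exits:

```shell
# Print fit results as a table instead of launching the TUI
llmfit --cli
```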
Override GPU Memory
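If VRAM autodetection fails or misreports, the override flag described under Global Flags can be used. The spelling `--vram` here is an assumption; the value formats come from the flag description above:

```shell
# Hypothetical --vram flag name; accepts sizes like "32G", "32000M", "1.5T"
llmfit --cli --vram 32G
```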
Context Length Control
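A sketch of capping the context length used for memory estimation. The flag spelling `--context-length` is an assumption; the `OLLAMA_CONTEXT_LENGTH` fallback is documented under Global Flags:

```shell
# Hypothetical --context-length flag name; value is in tokens and must be >= 1
llmfit --cli --context-length 8192

# Equivalent fallback via the environment variable
OLLAMA_CONTEXT_LENGTH=8192 llmfit --cli
```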
JSON Output
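JSON output is intended for tool integration, e.g., piping into `jq`. The flag spelling `--json` is an assumption based on the Global Flags description:

```shell
# Hypothetical --json flag name; emits machine-readable results for downstream tools
llmfit --cli --json | jq '.'
```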
Example Output
TUI Mode
Launching llmfit displays an interactive interface:
CLI Table Output
Running llmfit --cli displays:
