Integrations
AionUi unifies command-line AI tools in one desktop app. Pick your stack, follow its setup guide, then use multi-agent mode to run several agents side by side.
Official & community CLI tools
Gemini CLI
Built-in support: run Google Gemini from the desktop with session history and preview. Gemini CLI + AionUi →
Claude Code
Run Anthropic’s CLI agent in a GUI; for a comparison, see Claude Cowork vs AionUi. Claude Code + AionUi →
Ollama & LM Studio
Run local models with a custom OpenAI-compatible base URL. Local LLMs →
Codex, Qwen Code, OpenCode
Wire additional CLI agents into the same cowork surface. More agents →
How integrations fit together
Each CLI tool speaks its own protocol; AionUi’s job is to surface them uniformly, with shared history, preview, and optional automation. You do not have to pick just one: multi-agent mode is designed for teams that want Gemini for research, Claude Code for refactors, and Codex for quick scripts, without juggling three terminal windows.
Start with the guide for the CLI you rely on today, then add another agent when you are comfortable with LLM routing and disk permissions.
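To make the idea of routing tasks across agents concrete, here is a hypothetical sketch. The agent names and task categories below are illustrative; AionUi’s actual dispatch logic is not shown in this guide.

```python
# Hypothetical sketch: map task categories to the CLI agent best suited
# for them, the way a multi-agent workspace might. All names here are
# illustrative, not AionUi internals.
ROUTES = {
    "research": "gemini-cli",
    "refactor": "claude-code",
    "script": "codex",
}

def pick_agent(task_category: str, default: str = "gemini-cli") -> str:
    """Return the agent assigned to a task category, or the default."""
    return ROUTES.get(task_category, default)

print(pick_agent("refactor"))  # claude-code
print(pick_agent("unknown-category"))  # falls back to the default agent
```

The fallback default mirrors the advice above: start with the one CLI you rely on today, and let everything else route there until you add more agents.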
Choosing local vs cloud backends
Cloud-backed CLIs need working API credentials and outbound network access. Local stacks (Ollama & LM Studio) give up cloud-scale models in exchange for privacy and predictable per-seat cost, which makes them ideal for air-gapped labs or large batch jobs.