Desktop App — v1.0
Prefy for Desktop
Native app. No browser limitations. Direct access to your local AI models.
⬇️ Download for macOS (Apple Silicon)
6.4 MB · macOS 11+ · Apple Silicon (M1/M2/M3/M4)
Windows & Linux coming soon
Why Desktop?
🔓
No CORS
Direct access to Ollama, LM Studio, vLLM — no browser restrictions
⚡
6MB App
Tauri-powered. Roughly 30× smaller than a typical Electron app. Native performance.
🔒
Full Privacy
Your data never leaves your machine. No cloud. No tracking.
🤖
Auto-detect
Finds Ollama, LM Studio automatically. One click to connect.
🖥️
System Tray
Keeps running in the background. Quick access anytime.
🔄
Auto-update
New features delivered automatically. Always up to date.
Quick Start
1
Install Ollama
Download Ollama — free, open source AI runtime
2
Download a model
ollama pull qwen3:8b
3
Open Prefy Desktop
It auto-detects Ollama. Start chatting privately.
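The steps above assume Ollama is serving on its default local port. A quick way to confirm, as a sketch: 11434 is Ollama's default port and `/api/tags` is its model-listing endpoint.

```shell
# Check that Ollama's local API is reachable (default port 11434).
# /api/tags returns the models you have pulled; if Ollama isn't running,
# print a hint instead of failing.
curl -sf http://localhost:11434/api/tags || echo "Ollama is not running; start it with: ollama serve"
```

If the command prints a JSON list that includes `qwen3:8b`, Prefy Desktop will find it too.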
Desktop vs Web vs Mobile
| Feature | 🖥️ Desktop | 🌐 Web | 📱 Mobile |
|---|---|---|---|
| Local AI (Ollama) | ✅ Native | ⚠️ CORS setup | 🔗 Via network |
| Size | 6 MB | 0 (browser) | ~25 MB |
| Offline AI | ✅ Full | ⚡ WebLLM | 🔜 Coming |
| CORS issues | ❌ None | ⚠️ Possible | ❌ None |
| System tray | ✅ | ❌ | ❌ |
| Notifications | ✅ Native | ⚠️ Browser | ✅ Native |
| Auto-update | ✅ | ✅ | ✅ |
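About the "⚠️ CORS setup" in the Web column: browsers block cross-origin requests to a local Ollama server unless the web app's origin is whitelisted via Ollama's `OLLAMA_ORIGINS` environment variable. A minimal sketch (the origin below is a placeholder):

```shell
# Only needed for the Web version; the desktop app talks to Ollama
# directly and is exempt from browser CORS rules.
# OLLAMA_ORIGINS lists origins Ollama will accept cross-origin requests from.
export OLLAMA_ORIGINS="https://example.com"   # placeholder: use your web app's origin
# then restart the server:
# ollama serve
```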
⬇️ Download Prefy Desktop
Free forever. No account required. Your data stays yours.