A Python-based offline GUI for running AI models with Ollama
PyLlamaUI is an open-source offline desktop application built with Python that lets you run and chat with large language models (LLMs) using Ollama.
No cloud. No tracking. Just pure local AI — fast and private.
- 🖥️ Simple, clean GUI for chatting with LLMs
- 🔌 Interacts with local Ollama server via REST API
- 🔄 Load and switch between models (e.g., LLaMA 3, Mistral, etc.)
- 💾 Save prompt history locally
- ⚙️ Customizable settings: max tokens, temperature, system prompt
- 🌓 Light/dark mode support (optional)
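As a sketch of how a client like this can talk to the local Ollama server over its REST API: the snippet below builds a request for Ollama's documented `/api/generate` endpoint, wiring in the settings mentioned above (temperature, max tokens via Ollama's `num_predict` option, and a system prompt). The helper names are illustrative, not PyLlamaUI's actual code, and it uses only the standard library so the example is self-contained (the app itself uses `requests`).

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port


def build_payload(model, prompt, system=None, temperature=0.7, max_tokens=256):
    """Build the JSON body for a non-streaming Ollama generate call."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON object back instead of a token stream
        "options": {"temperature": temperature, "num_predict": max_tokens},
    }
    if system:
        payload["system"] = system  # optional system prompt
    return payload


def ask(model, prompt, **kwargs):
    """Send a prompt to the local Ollama server and return the reply text."""
    body = json.dumps(build_payload(model, prompt, **kwargs)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything goes to `localhost:11434`, no prompt ever leaves the machine, which is what makes the "no cloud, no tracking" claim hold.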
- Python 3.10+
- `customtkinter` (or `tkinter`) for the GUI
- `requests` for Ollama API calls
- Ollama (installed locally and running in the background)
Coming soon...
- 🐍 Python 3.10+
- 🦙 Ollama installed and running locally
```shell
ollama run llama3
```
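Once Ollama is running, a GUI like this can discover which models are installed by querying Ollama's `/api/tags` endpoint and offer them in a model switcher. A minimal sketch (function names are illustrative; standard library only):

```python
import json
import urllib.request


def parse_model_names(tags_response):
    """Extract model names from an Ollama /api/tags JSON response."""
    return [m["name"] for m in tags_response.get("models", [])]


def list_local_models(base_url="http://localhost:11434"):
    """Ask the local Ollama server which models are installed."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.loads(resp.read()))
```

Calling `list_local_models()` with the server up returns names like `"llama3:latest"`, ready to populate a model-selection dropdown.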