
PyLlamaUI 🐍🦙

A Python-based offline GUI for running AI models with Ollama



🌟 About

PyLlamaUI is an open-source, offline desktop application built with Python that lets you run and chat with large language models (LLMs) through Ollama. It keeps prompt input, model switching, and response viewing in one clean desktop interface.

No cloud. No tracking. Just pure local AI — fast and private.


🚀 Features

  • 🖥️ Simple, clean GUI for chatting with LLMs
  • 🔌 Interacts with the local Ollama server via its REST API (see the sketch after this list)
  • 🔄 Load and switch between models (e.g., LLaMA 3, Mistral)
  • 💾 Save prompt history locally
  • ⚙️ Customizable settings: max tokens, temperature, system prompt
  • 🌓 Light/dark mode support (optional)
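
The REST interaction mentioned above boils down to a single HTTP call. The sketch below is not taken from PyLlamaUI's source; it only illustrates, with hypothetical names, how a requests-based client can call Ollama's /api/generate endpoint and map the settings listed above (system prompt, temperature, max tokens) onto Ollama's request options.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def ask_ollama(prompt: str, model: str = "llama3", system: str = "",
               temperature: float = 0.7, max_tokens: int = 512) -> str:
    """Send one prompt to the local Ollama server and return the response text.

    Illustrative helper only; PyLlamaUI's own function names may differ.
    """
    payload = {
        "model": model,
        "prompt": prompt,
        "system": system,               # optional system prompt
        "stream": False,                # return a single JSON object instead of a stream
        "options": {
            "temperature": temperature,
            "num_predict": max_tokens,  # Ollama's option for the max-token limit
        },
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

print(ask_ollama("Summarize what Ollama does in one sentence."))
```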

🛠️ Tech Stack

  • Python 3.10+
  • customtkinter (or tkinter) for the GUI (a minimal wiring sketch follows this list)
  • requests for Ollama API calls
  • Ollama (installed locally and running in background)
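
To show how these pieces might fit together (again, a sketch rather than PyLlamaUI's actual code), a customtkinter front end only needs an entry widget, a send button, and a textbox wired to a small request helper:

```python
import customtkinter as ctk
import requests

def query_ollama(prompt: str, model: str = "llama3") -> str:
    # Blocking call for brevity; a real GUI would run this off the UI thread.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

ctk.set_appearance_mode("dark")   # matches the optional light/dark mode feature
app = ctk.CTk()
app.title("PyLlamaUI sketch")
app.geometry("600x420")

output = ctk.CTkTextbox(app, wrap="word")
output.pack(fill="both", expand=True, padx=10, pady=10)

entry = ctk.CTkEntry(app, placeholder_text="Type a prompt...")
entry.pack(fill="x", padx=10)

def on_send():
    prompt = entry.get().strip()
    if not prompt:
        return
    output.insert("end", f"You: {prompt}\n")
    output.insert("end", f"Model: {query_ollama(prompt)}\n\n")
    entry.delete(0, "end")

ctk.CTkButton(app, text="Send", command=on_send).pack(padx=10, pady=10)
app.mainloop()
```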

📸 Screenshots

Coming soon...


📦 Installation

1. Prerequisites

  • 🐍 Python 3.10+
  • 🦙 Ollama installed and running locally, e.g. pull and start a model with:

        ollama run llama3
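
Once Ollama is running, a quick sanity check is to ask the server which models it has pulled. This snippet is not part of PyLlamaUI; it just queries Ollama's /api/tags endpoint on the default port:

```python
import requests

# Ollama's default local address; adjust if your server listens elsewhere.
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()

models = [m["name"] for m in resp.json().get("models", [])]
print("Ollama is up. Installed models:", ", ".join(models) or "none")
```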
