ollama-bot


Interact with Ollama LLMs using the LXMFy bot framework.


Setup

Download the example environment file:

curl -o .env https://raw.githubusercontent.com/lxmfy/ollama-bot/main/.env-example

Edit .env with your Ollama API URL, model, and LXMF address.
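As a rough sketch, the file might look like the following. The variable names and values here are placeholders, not taken from the actual .env-example; check that file for the real keys your installation expects:

```shell
# Hypothetical .env sketch -- consult .env-example for the actual variable names
OLLAMA_API_URL=http://localhost:11434   # base URL of your Ollama instance
OLLAMA_MODEL=llama3                     # model the bot should chat with
LXMF_ADDRESS=replace-with-your-hash     # your LXMF destination address
```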

Installation and Running

Install with pipx:

pipx install git+https://github.com/lxmfy/ollama-bot.git

Run

lxmfy-ollama-bot

Poetry

Alternatively, install and run from a clone of the repository with Poetry:

poetry install
poetry run lxmfy-ollama-bot

Docker

First, pull the latest image:

docker pull ghcr.io/lxmfy/ollama-bot:latest

Then run the bot, mounting your .env file into the container:

docker run -d \
  --name ollama-bot \
  --restart unless-stopped \
  --network host \
  -v $(pwd)/.env:/app/.env \
  ghcr.io/lxmfy/ollama-bot:latest

Commands

Command prefix: /

/help - show help message
/about - show bot information

Chat

Send any message without the / prefix to chat with the AI model.

The bot will automatically respond using the configured Ollama model.
