LLM Scheduler - Open Source LLM Task Management Platform

Project Overview

LLM Scheduler is an open-source large language model scheduling and task management tool designed for developers and enterprises. It provides unified multi-model task management, priority scheduling, and status tracking capabilities.

Key Features

  • Multi-Model Support - Works with the OpenAI API, locally hosted LLaMA models, and other large language models
  • Intelligent Scheduling - Priority-based task queue scheduling with rate limiting and backpressure mechanisms
  • Visual Management - Built-in Dashboard with real-time task and model status display
  • Plugin Extensibility - Support for custom scheduling strategies and task types
  • Ready to Deploy - Single-command Docker Compose deployment with a complete, ready-to-run stack

Technical Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│  Dashboard      │    │  Go HTTP Server │    │  MySQL Database │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                    ┌─────────────────────────┐
                    │  Task Scheduling Queue  │
                    └─────────────────────────┘

Quick Start

Using Docker Compose (Recommended)

# Clone the repository
git clone [email protected]:carlos-code-max/LLM-Scheduler.git
cd LLM-Scheduler

# Start all services
docker-compose up -d

# Access Dashboard
open http://localhost:3000

Manual Deployment

  1. Start Backend Service
cd backend
go mod tidy
go run main.go

  2. Start Frontend
cd frontend
npm install
npm start

  3. Configure Database
# Import database schema
mysql -u root -p < scripts/init.sql

API Documentation

Task Management

  • POST /api/tasks - Submit new task
  • GET /api/tasks - Get task list
  • GET /api/tasks/:id - Get task details
  • PUT /api/tasks/:id/priority - Adjust task priority
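
For example, a task can be submitted and its priority adjusted with curl against the backend (assumed here to listen on localhost:8080, per the default config). The payload field names below (model, type, prompt, priority) are illustrative assumptions; check the backend handlers for the exact request schema.

# Submit a new text-generation task (field names are illustrative)
curl -X POST http://localhost:8080/api/tasks \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "type": "text-generation", "prompt": "Hello", "priority": 5}'

# Raise the priority of task 42, then check its status
curl -X PUT http://localhost:8080/api/tasks/42/priority \
  -H "Content-Type: application/json" \
  -d '{"priority": 10}'
curl http://localhost:8080/api/tasks/42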

Model Management

  • GET /api/models - Get model list
  • POST /api/models - Register new model
  • PUT /api/models/:id - Update model configuration
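
A sketch of registering a locally served model over the same API; the payload fields shown (name, provider, endpoint, max_concurrency) are assumptions rather than the confirmed schema.

# Register a new model (illustrative payload; verify field names against the backend)
curl -X POST http://localhost:8080/api/models \
  -H "Content-Type: application/json" \
  -d '{"name": "llama-2-7b", "provider": "local", "endpoint": "http://localhost:11434", "max_concurrency": 2}'

# List registered models
curl http://localhost:8080/api/models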

Task Types

  • text-generation - Text generation tasks
  • embedding - Text vectorization
  • translation - Text translation
  • summarization - Text summarization
  • custom - Custom task types
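
The task type is chosen per request when a task is submitted. As a rough sketch, assuming a "type" field in the request body, an embedding task might look like this:

# Submit an embedding task (the "type" and "input" fields are assumptions)
curl -X POST http://localhost:8080/api/tasks \
  -H "Content-Type: application/json" \
  -d '{"model": "text-embedding-ada-002", "type": "embedding", "input": "LLM Scheduler"}'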

Configuration

Main configuration file: backend/config.yaml

server:
  port: 8080
  
database:
  host: localhost
  port: 3306
  username: root
  password: password
  database: llm_scheduler

redis:
  host: localhost
  port: 6379
  db: 0

Contributing

Issues and Pull Requests are welcome!

  1. Fork this repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
