MoE - Mixture of Experts for Fleet Intelligence

Repository for the Vinnova project "Mixture of Experts models Tailored for Fleet Intelligence"

Project Structure

This repository is organized for collaborative development, with several people contributing in parallel:

  • architecture/ - Shared, reusable model components (backbone, experts, router, MoE)
  • training/ - Shared training utilities and frameworks
  • research_pipelines/ - Experiment-specific implementations that use the shared modules
  • production_pipelines/ - Established, well-tested experiments
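
The shared components in architecture/ compose into a mixture-of-experts forward pass: a router scores the input, and its softmax gating weights mix the expert outputs. As a minimal, hypothetical sketch (the class and attribute names below are illustrative, not the repository's actual API):

```python
import math
import random

def softmax(zs):
    """Numerically stable softmax over a list of scores."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

class TinyMoE:
    """Hypothetical minimal MoE: a linear router gating linear experts."""

    def __init__(self, dim, n_experts, seed=0):
        rng = random.Random(seed)
        # Router: one weight vector per expert, scoring the input.
        self.router = [[rng.gauss(0, 1) for _ in range(dim)]
                       for _ in range(n_experts)]
        # Experts: each a dim x dim linear map.
        self.experts = [[[rng.gauss(0, 1) for _ in range(dim)]
                         for _ in range(dim)]
                        for _ in range(n_experts)]

    def forward(self, x):
        dot = lambda w, v: sum(wi * vi for wi, vi in zip(w, v))
        # Gating weights over experts; they sum to 1.
        gates = softmax([dot(w, x) for w in self.router])
        # Each expert's output for this input.
        outs = [[dot(row, x) for row in expert] for expert in self.experts]
        # Gate-weighted combination of the expert outputs.
        return [sum(g * o[i] for g, o in zip(gates, outs))
                for i in range(len(x))]
```

A backbone, as listed above, would typically produce the feature vector `x` before routing; here the raw input stands in for it.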

Quickstart

Setup Environment

python3 -m venv .venv
source .venv/bin/activate
pip3 install -e . -U

Development Workflow

  1. Shared Components: Add reusable model components to architecture/ or training utilities to training/
  2. New Experiments: Create new experiment directories in research_pipelines/ (e.g., research_pipelines/zod_fl/, research_pipelines/cifar10/)
  3. Experiment Structure: Each experiment should have:
    • config.py - Experiment-specific configuration
    • dataset.py - Dataset loading and preprocessing
    • model.py - Model building using shared components
    • train.py - Training script, preferably using the shared trainer
  4. Established Experiments: Once an experiment matures and has good test coverage, move it to production_pipelines/ and add a README.md with key findings and learnings.
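
The config.py for a new experiment can be as small as a dataclass. The sketch below is hypothetical; the field names and the `load_config` helper are illustrative placeholders, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ExperimentConfig:
    """Hypothetical experiment configuration (field names are illustrative)."""
    dataset: str = "cifar10"   # which dataset.py should load
    n_experts: int = 4         # number of expert networks in the MoE
    batch_size: int = 64
    lr: float = 1e-3
    epochs: int = 10

def load_config(**overrides) -> ExperimentConfig:
    """Build a config, letting callers override the defaults."""
    return ExperimentConfig(**overrides)
```

model.py and train.py would then take an `ExperimentConfig` and build the model from the shared architecture/ and training/ modules, keeping all experiment-specific knobs in one place.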
