
MoE-CL

Continual learning with mixture-of-experts adapters on Llama-2-7b, trained and evaluated over the MTL5 task sequence DBPedia → Amazon → Yahoo → AGNews.

Environment Setup

pip install -r requirements.txt
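
Installing into a fresh virtual environment first is recommended; this is standard Python tooling rather than anything specific to this repository, and the environment name .venv is arbitrary:

python -m venv .venv
source .venv/bin/activate   # Linux/macOS; use .venv\Scripts\activate on Windows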

Model Preparation

Download the Llama-2-7b-hf model into the ../model/Llama-2-7b-hf/ directory.
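
One way to fetch the weights is the Hugging Face CLI; this assumes you have been granted access to the gated meta-llama/Llama-2-7b-hf repository and are logged in with an access token:

huggingface-cli login
huggingface-cli download meta-llama/Llama-2-7b-hf --local-dir ../model/Llama-2-7b-hf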

Training

Run the full continual-learning training, which covers both the random-initialization baseline and the continual learning sequence DBPedia → Amazon → Yahoo → AGNews:

bash scripts/mtl5/run_moe-cl.sh
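
Training is long-running, so it may help to detach the script and capture its output; the log file name below is only a suggestion (the script itself writes to logs/, see Output Results):

nohup bash scripts/mtl5/run_moe-cl.sh > logs/run_moe-cl.out 2>&1 &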

Evaluation

Evaluate the trained model:

python evaluate_mtl5.py \
    --base_model ../model/Llama-2-7b-hf \
    --config configs/mtl5/moe-cl.json \
    --order order1 \
    --load_adapter_file agnews_finetuned
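
Here order1 presumably selects the task order defined in configs/mtl5/moe-cl.json, and agnews_finetuned is the adapter saved after the final task in that order. To measure forgetting on earlier tasks, the same command can point at an earlier adapter; the name below assumes one adapter is saved per task and should be checked against your results/ directory:

python evaluate_mtl5.py \
    --base_model ../model/Llama-2-7b-hf \
    --config configs/mtl5/moe-cl.json \
    --order order1 \
    --load_adapter_file dbpedia_finetuned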

Output Results

  • Training results are saved in the results/ directory
  • Training logs are saved in the logs/ directory
