Zhenglin Cheng* · Peng Sun* · Jianguo Li · Tao Lin
Join the WeChat groups below; feel free to reach out anytime if you have any questions! 👇

| Technical Discussion Group | Model Users (AIGC) Discussion Group |
|---|---|
| (WeChat group QR code) | (WeChat group QR code) |
- Thanks to @mengqin for contributing more compatible TwinFlow model workflows for ComfyUI! 👏🏻
- Thanks to @smthemex for adapting TwinFlow model workflows for ComfyUI! 👏🏻
- We release an experimental, faster version of Z-Image-Turbo!
- We release training code and an improved TwinFlow implementation on SD3.5 and OpenUni under the `src` directory! 👏🏻
- We release MNIST tutorials that walk through the core implementation of TwinFlow!
- We release TwinFlow-Qwen-Image-v1.0! We are also working on making Z-Image-Turbo faster!
- Release inference and sampler code for TwinFlow-Qwen-Image-v1.0.
- Release training tutorials on MNIST for understanding.
- Release training code on SD3.5 and OpenUni.
- Release a faster experimental version of Z-Image-Turbo.
- Release large-scale training code.
Case 1: Autumn scenery of the Great Wall, winding across layered mountain ranges; the brick walls and beacon towers show a rustic earthen yellow under warm sunlight; maple leaves blaze like fire across the hills, with tourists dotted along the wall; distant mountains wreathed in thin mist under a deep blue sky with a few white clouds; high-angle panoramic composition, rich detail, soft light and shadow.
Case 2: Ultra-HD wallpaper, dreamy light and shadow; a young woman glances back with a smile at a Lantern Festival fair, carrying a rabbit lantern; bright lanterns hang all around, their warm glow falling on her face; ornate Tang-style dress, elaborate headdress; lively blurred background with beautiful bokeh highlights; medium shot.
Same prompt, different noise (left to right). From top to bottom: Qwen-Image (50×2 NFE), TwinFlow-Qwen-Image (1 NFE), and Qwen-Image-Lightning-v2.0 (1 NFE).
TwinFlow-Qwen-Image generates high-quality images at 1 NFE while preserving strong diversity.
We introduce TwinFlow, a framework that realizes high-quality 1-step and few-step generation without the usual pipeline bloat.
Instead of relying on external discriminators or frozen teachers, TwinFlow creates an internal "twin trajectory" by extending the time interval of the flow, so that a "real" trajectory and a "fake" trajectory coexist within a single model.
The model can then rectify itself by minimizing the difference between the velocity fields of the real and fake trajectories, i.e., a self-adversarial flow objective.
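To make the idea concrete, here is a toy sketch in the spirit of the MNIST tutorial. This is our hedged reading, not the repo's implementation: the exact form of the fake branch, the detach placement, and the equal loss weighting are all assumptions; see the MNIST tutorial in this repo for the actual code.

```python
# Toy sketch of the TwinFlow idea (assumptions noted inline):
# a single network supplies both the flow-matching velocity on real data
# and the velocity on its own one-step "fake" trajectory; a second term
# pulls the fake trajectory's velocity field toward the real one.
import torch
import torch.nn as nn

DIM = 784  # flattened 28x28 images (toy stand-in for MNIST)

net = nn.Sequential(nn.Linear(DIM + 1, 256), nn.SiLU(), nn.Linear(256, DIM))

def v(x, t):
    # Velocity field v(x_t, t); t has shape (batch, 1).
    return net(torch.cat([x, t], dim=1))

def twinflow_toy_loss(x_real):
    b = x_real.shape[0]
    t = torch.rand(b, 1)

    # "Real" trajectory: linear interpolation noise -> data, whose
    # ground-truth velocity is (x_real - noise), the flow-matching target.
    noise = torch.randn_like(x_real)
    x_t = (1 - t) * noise + t * x_real
    loss_fm = ((v(x_t, t) - (x_real - noise)) ** 2).mean()

    # "Fake" trajectory: re-noise a one-step sample from the model itself
    # (x_fake = z + v(z, 0), pushing pure noise forward in a single step).
    z = torch.randn_like(x_real)
    with torch.no_grad():
        x_fake = z + v(z, torch.zeros(b, 1))
    y_t = (1 - t) * noise + t * x_fake

    # Self-rectification: match the velocity on the fake trajectory to the
    # (detached) velocity on the real trajectory at the same time t.
    loss_twin = ((v(y_t, t) - v(x_t, t).detach()) ** 2).mean()

    return loss_fm + loss_twin  # equal weighting is an assumption
```

Note that both loss terms query the same network: it acts as the generator (via the one-step fake sample) and as the real/fake velocity estimator, which is what removes the need for auxiliary models.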
Key Advantages:
- One-model Simplicity. No auxiliary networks are needed: a single model learns to rectify its own flow field, acting simultaneously as the generator and as the fake/real score estimator. No extra GPU memory is spent on frozen teachers or discriminators during training.
- Scalability on Large Models. Thanks to its one-model simplicity, TwinFlow scales easily to 20B full-parameter training. In contrast, methods such as VSD, SiD, and DMD/DMD2 must maintain three separate models for distillation, which not only significantly increases memory consumption (often causing out-of-memory errors) but also introduces substantial complexity when scaling to large training regimes.
For ComfyUI users, please see https://github.com/smthemex/ComfyUI_TwinFlow.
Install the latest diffusers:

```bash
pip install git+https://github.com/huggingface/diffusers
```

Then run the inference demo:

```bash
python inference.py
```
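If you prefer wiring things up yourself, below is a minimal sketch of loading the checkpoint through the standard diffusers API. The model id is an assumption (check the model card for the real one), and `inference.py` remains the authoritative entry point: the custom TwinFlow sampler options shown after this sketch are configured there rather than passed as standard pipeline arguments.

```python
# Minimal, hypothetical loading sketch using the generic diffusers API.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "inclusionAI/TwinFlow-Qwen-Image-v1.0",  # hypothetical HF model id
    torch_dtype=torch.bfloat16,
).to("cuda")

image = pipe(
    "a red panda reading a book under a maple tree",
    num_inference_steps=4,  # 2~4 NFEs recommended (see configs below)
).images[0]
image.save("twinflow_sample.png")
```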
We recommend sampling with 2~4 NFEs:

```python
# 4 NFE config
sampler_config = {
    "sampling_steps": 4,                # number of function evaluations (NFE)
    "stochast_ratio": 1.0,              # degree of sampler stochasticity
    "extrapol_ratio": 0.0,              # extrapolation ratio (disabled here)
    "sampling_order": 1,                # first-order solver
    "time_dist_ctrl": [1.0, 1.0, 1.0],  # timestep-distribution warping
    "rfba_gap_steps": [0.001, 0.5],     # time gaps at the start/end of sampling
}
```
```python
# 2 NFE config
sampler_config = {
    "sampling_steps": 2,
    "stochast_ratio": 1.0,
    "extrapol_ratio": 0.0,
    "sampling_order": 1,
    "time_dist_ctrl": [1.0, 1.0, 1.0],
    "rfba_gap_steps": [0.001, 0.6],     # larger end gap than the 4 NFE config
}
```

If you find TwinFlow useful, please cite:

```bibtex
@article{cheng2025twinflow,
  title   = {TwinFlow: Realizing One-step Generation on Large Models with Self-adversarial Flows},
  author  = {Cheng, Zhenglin and Sun, Peng and Li, Jianguo and Lin, Tao},
  journal = {arXiv preprint arXiv:2512.05150},
  year    = {2025}
}
@misc{sun2025anystep,
  title  = {Any-step Generation via N-th Order Recursive Consistent Velocity Field Estimation},
  author = {Sun, Peng and Lin, Tao},
  note   = {GitHub repository},
  url    = {https://github.com/LINs-lab/RCGM},
  year   = {2025}
}
@article{sun2025unified,
  title   = {Unified Continuous Generative Models},
  author  = {Sun, Peng and Jiang, Yi and Lin, Tao},
  journal = {arXiv preprint arXiv:2505.07447},
  year    = {2025}
}
```

TwinFlow is built upon RCGM and UCGM, with much support from InclusionAI.
Note: The LINs Lab has openings for PhD students for the Fall 2026/2027 intake. Interested candidates are encouraged to reach out.