
Conversation

@quic-jouachen (Contributor) commented Sep 23, 2025

This addresses the issue reported in #572.

The finite lorax feature failed to execute when tested with a Llama adapter (jumip/llama-lora-adapter) that includes o_proj among its target modules.

Work in progress: a test will be added to prevent future regression.
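To illustrate the failure mode, here is a minimal, hypothetical sketch (not the repository's actual code): with finite lorax, every target module declared by the adapter, including o_proj, must receive the active lora_ids so the right adapter weights are selected. If o_proj is skipped when the ids are forwarded, its delta is silently dropped. The names `TARGET_MODULES`, `apply_lora`, and `lora_tables` below are illustrative assumptions.

```python
# Hypothetical sketch of per-module finite-LoRA dispatch.
# Each target module keeps a table of adapter deltas keyed by lora_id;
# the same lora_id must be forwarded to every target module.

TARGET_MODULES = ["q_proj", "k_proj", "v_proj", "o_proj"]


def apply_lora(module_name, base_out, lora_tables, lora_id):
    """Add the adapter delta selected by lora_id for this module.

    If a module is not forwarded the lora_id (the bug this PR fixes
    for o_proj), its delta defaults to 0.0 and is silently dropped.
    """
    delta = lora_tables.get(module_name, {}).get(lora_id, 0.0)
    return base_out + delta


# Every target module, o_proj included, has a delta for lora_id 1.
lora_tables = {m: {0: 0.0, 1: 0.5} for m in TARGET_MODULES}

# Correct behavior: forwarding lora_id 1 to all modules applies the
# delta uniformly, o_proj included.
outs = {m: apply_lora(m, 1.0, lora_tables, 1) for m in TARGET_MODULES}
```

The fix amounts to making sure o_proj is on the same lora_ids path as the other projections, rather than special-cased out of it.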

@quic-jouachen force-pushed the finitelorax_oproj_fix_rel1.20 branch from 3db3af7 to 7f20f47 on September 23, 2025, 21:15
@quic-jouachen changed the title from "Fix llama model o_proj lora_ids passing for finite lorax" to "Fix llama model o_proj lora_ids passing for finite lorax in release/v1.20.0" on Sep 25, 2025