
Commit 70fd41c

Solve merge conflicts.
1 parent: d4840a4

File tree

1 file changed: +1 −1


lmdeploy/pytorch/kernels/ascend/paged_attention_fwd.py

Lines changed: 1 addition & 1 deletion
@@ -33,7 +33,7 @@ def flash_context_attention(
         key=key_states,
         value=value_states,
         q_start_loc=q_start_loc[i:i + 1],
-        seq_len_list=q_seq_len[i:i + 1],
+        seq_len_list=q_seq_len_list[i:i + 1],
         num_q_heads=num_q_heads,
         num_kv_heads=num_kv_heads,
         attn_mask=context.attention_mask[i:i + 1],
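
For context, a minimal sketch of the call site this one-line fix touches, assuming a stubbed kernel and a hypothetical context object (only the argument names shown in the diff come from the source; everything else is a placeholder). flash_context_attention launches the underlying attention op once per request, passing [i:i + 1] slices of the per-request arguments; the merge had left the stale name q_seq_len where the variable actually in scope is q_seq_len_list.

def _ascend_attention_stub(key, value, q_start_loc, seq_len_list,
                           num_q_heads, num_kv_heads, attn_mask):
    # Placeholder for the real Ascend attention kernel invoked here;
    # the actual op computes attention output in place.
    pass


def flash_context_attention_sketch(key_states, value_states, q_start_loc,
                                   q_seq_len_list, num_q_heads,
                                   num_kv_heads, context):
    # One kernel launch per request: each per-request argument is sliced
    # with [i:i + 1] so the kernel sees a batch of size 1.
    for i in range(len(q_seq_len_list)):
        _ascend_attention_stub(
            key=key_states,
            value=value_states,
            q_start_loc=q_start_loc[i:i + 1],
            # The fix: slice q_seq_len_list, the name in scope, rather
            # than the stale q_seq_len reintroduced by the merge.
            seq_len_list=q_seq_len_list[i:i + 1],
            num_q_heads=num_q_heads,
            num_kv_heads=num_kv_heads,
            attn_mask=context.attention_mask[i:i + 1],
        )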
