
Commit a43e52c

Commit message: conflicts
1 parent b62b606 commit a43e52c

File tree: 1 file changed (+1, -2 lines)


keras_hub/src/models/stablelm/stablelm_attention.py

Lines changed: 1 addition & 2 deletions
@@ -21,8 +21,7 @@ class StableLMAttention(keras.layers.Layer):
         num_query_heads: int. Number of attention heads for queries.
         num_key_value_heads: int. Number of attention heads for keys and
             values.
-        hidden_dim: int. Hidden dimension of the input (e.g., 2560 for
-            StableLM-3B4E1T).
+        hidden_dim: int. Hidden dimension of the input.
         rope_max_wavelength: float. Maximum wavelength for rotary embeddings
             (default: 10000).
         rope_scaling_factor: float. Scaling factor for rotary embeddings
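The docstring above mentions two rotary-embedding knobs, `rope_max_wavelength` and `rope_scaling_factor`. As a rough illustration of what such parameters typically control, here is a minimal NumPy sketch of standard rotary-position-embedding angles; the helper name `rope_angles` and the exact scaling behavior are assumptions for illustration, not keras_hub's actual implementation.

```python
import numpy as np

def rope_angles(seq_len, head_dim, max_wavelength=10000.0, scaling_factor=1.0):
    """Sketch of rotary-embedding angles (hypothetical helper, not keras_hub's).

    One inverse frequency per channel pair; a larger max_wavelength gives
    slower-rotating low-frequency channels.
    """
    inv_freq = 1.0 / (max_wavelength ** (np.arange(0, head_dim, 2) / head_dim))
    # One common use of a scaling factor: stretch positions so a model
    # trained at one context length can cover a longer one.
    positions = np.arange(seq_len) / scaling_factor
    # Angle matrix of shape (seq_len, head_dim // 2); each query/key pair
    # of channels is rotated by the corresponding angle.
    return np.outer(positions, inv_freq)
```

In this sketch, `scaling_factor=2.0` simply halves every angle, which is the position-interpolation style of long-context scaling.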
