
Conversation

@yisi-wang-slalom yisi-wang-slalom commented May 23, 2024

Description of changes:

We are excited to help add support for the Gemma model architecture from Google.

A few highlights, mainly referencing huggingface/transformers#29402:

  • We use the gelu_new activation function and have validated it against the PyTorch gelu_pytorch_tanh implementation to confirm matching results; we recommend using gelu_pytorch_tanh where possible.
  • Add the (w + 1) weight offset to the RMS Layernorm for Gemma (see the sketch after this list).
  • Include the additional embedding normalization for Gemma (the embeddings are multiplied by sqrt(hidden_dim)).
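For reference, here is a minimal PyTorch sketch of the three Gemma-specific details above, loosely following the Hugging Face implementation linked in this PR. The names `GemmaRMSNorm`, `gemma_embed`, and `gelu_tanh` are illustrative only and are not the code added by this change.

```python
# Illustrative sketch of Gemma-specific details; names and structure are
# assumptions based on the Hugging Face reference, not this PR's code.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class GemmaRMSNorm(nn.Module):
    """RMSNorm variant where the learned weight is applied as (1 + w)."""

    def __init__(self, hidden_dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        # Weight starts at zero, so the effective scale is (1 + weight).
        self.weight = nn.Parameter(torch.zeros(hidden_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        variance = x.float().pow(2).mean(-1, keepdim=True)
        x_normed = x.float() * torch.rsqrt(variance + self.eps)
        return (x_normed * (1.0 + self.weight.float())).type_as(x)


def gemma_embed(embedding: nn.Embedding, input_ids: torch.Tensor) -> torch.Tensor:
    """Embedding lookup followed by Gemma's sqrt(hidden_dim) scaling."""
    return embedding(input_ids) * math.sqrt(embedding.embedding_dim)


def gelu_tanh(x: torch.Tensor) -> torch.Tensor:
    """Tanh-approximate GELU (gelu_pytorch_tanh / gelu_new)."""
    return F.gelu(x, approximate="tanh")
```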

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@aws-maens aws-maens requested a review from mmcclean-aws July 9, 2024 20:29