Unit tests #83
Conversation
This PR looks good to me. Before we add edge cases, @tharapalanivel, can we also add a unit test for fine tuning? There will not be any PEFT type associated with it.
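A hedged sketch of the fine-tuning case described here; `select_tuning_method` is a hypothetical stand-in, not this repo's API, used only to illustrate that fine tuning is the branch with no PEFT config:

```python
from typing import Optional

# Hypothetical dispatcher: fine tuning is simply the absence of a PEFT config.
def select_tuning_method(peft_config: Optional[dict]) -> str:
    if peft_config is None:
        return "fine-tuning"  # full fine tuning, no adapter attached
    return peft_config.get("peft_type", "unknown")

def test_fine_tuning_has_no_peft_type():
    assert select_tuning_method(None) == "fine-tuning"
```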
requirements.txt (Outdated)
```diff
  accelerate>=0.20.3
  packaging
- transformers>=4.34.1
+ transformers>=4.34.1,<4.38.0
```
#53 is merged, so you shouldn't need this cap. Thanks!
@Ssukriti Should we keep the cap (or pin a static version) for the transformers package to avoid unintended errors like the xla_fsdp_v2 one? We could create a GitHub workflow to run tests and then update the cap regularly.
We don't need to keep a static version, but yes, in the optional-dependencies PR @gkumbhat is looking into how to cap, and we may cap to the next major release. Now that CI/CD will automatically pull new release versions, if we see failing builds we will update accordingly.
The errors we were seeing with xla_fsdp_v2 were actually due to code we wrote, which was good to catch and fix. It was not an API change from transformers; we were setting environment variables incorrectly.
In general, if there is a specific version that doesn't work or has a bug, we can also ask pip to ignore that particular version.
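For example, a requirements.txt specifier can skip one known-bad release while still allowing newer ones (the excluded version below is purely illustrative):

```
# requirements.txt: allow recent transformers, but skip a hypothetical bad release
transformers>=4.34.1,!=4.38.0
```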
I think some more tests could be added (see the sketch below this list):
- if num_epochs and gradient accumulation steps are 0, a ValueError is raised
- the prompt tuning test could cover both "prompt_tuning_init": "RANDOM" and "TEXT" (only RANDOM is tested)
tests/test_sft_trainer.py (Outdated)
| assert "Simply put, the theory of relativity states that" in output_inference | ||
|
|
||
|
|
||
| def test_run_train_lora_target_modules(): |
What's the difference between this and the above test? Can we combine them into one?
My understanding is that first we check that the default target modules are used, then the next one is for custom target modules specified by the user, and the last is for all-linear. I've parameterized it, but it's worth confirming with Anh.
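For reference, a hedged sketch of how the three cases could share one parameterized test; the config fields come from PEFT's LoraConfig, while the test name and values are illustrative:

```python
import pytest
from peft import LoraConfig

# Three target-module cases: library defaults (None), user-specified
# modules, and PEFT's "all-linear" shorthand.
@pytest.mark.parametrize(
    "target_modules",
    [None, ["q_proj", "v_proj"], "all-linear"],
)
def test_lora_config_target_modules(target_modules):
    cfg = LoraConfig(r=8, lora_alpha=32, target_modules=target_modules)
    # None defers to model-specific defaults when the model is wrapped;
    # a full test would run training and inspect the adapted modules.
    assert cfg.peft_type.value == "LORA"
```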
```diff
@@ -0,0 +1,22 @@
+# Copyright The IBM Tuning Team
```
Curious about the copyright notice. Where is this coming from?
"IBM Tuning Team" was suggested by Raghu; the rest is from caikit.
```python
    invalid_params
)

with pytest.raises(ValueError, match=exc_msg):
```
I generally avoid matching the exact error message and instead just check for ValueError with a comment explaining why, but I'll let this go; I don't think we will update the message much.
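For contrast, a minimal self-contained example of both styles; `_validate` is a hypothetical stand-in, not code from this PR:

```python
import pytest

def _validate(value: int) -> None:
    # Hypothetical stand-in for the config validation under test.
    if value <= 0:
        raise ValueError("value must be positive")

def test_rejects_nonpositive_type_only():
    # Checks only the exception type, so it survives message rewording.
    with pytest.raises(ValueError):
        _validate(0)

def test_rejects_nonpositive_exact_message():
    # Stricter: also pins the wording via a regex match.
    with pytest.raises(ValueError, match="must be positive"):
        _validate(0)
```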
Ssukriti left a comment:
Thanks so much!!
* Set up fixtures and data for tests
* Add basic unit tests
* Setting upper bound for transformers
* Ignore aim log files
* Include int num_train_epochs
* Fix formatting
* Add copyright notice
* Address review comments
* Run inference on tuned model
* Trainer downloads model
* add more unit tests and refactor
* Fix formatting
* Add FT unit test and refactor
* Removing transformers upper bound cap
* Address review comments

Signed-off-by: Thara Palanivel <[email protected]>
Signed-off-by: Anh-Uong <[email protected]>
Co-authored-by: Anh-Uong <[email protected]>
Description of the change
Adding unit tests for `pt` and `lora` tuning methods using a dummy model, covering edge cases, invalid requests, etc. Continuation of PR #79.
Related issue number
Closes #74
How to verify the PR
Was the PR tested