Bump PyTorch to 2.7.0 #3455
base: develop
Conversation
Force-pushed from 7889725 to f917d51
It appears that F.linear() is the source of non-determinism. When the model is on the …, it leads to an error: … Then, with …, the same thing may happen with the qat-lora sample on CPU. I propose updating the references, since with … the results change.
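A minimal sketch (not from the PR) of the kind of non-determinism meant here: F.linear() results are only tolerance-equal, not bit-exact, to a mathematically equivalent formulation, and the exact bits can change between PyTorch versions. The shapes, seed, and tolerance below are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(8, 16)
w = torch.randn(32, 16)
b = torch.randn(32)

# F.linear dispatches to a GEMM kernel; its accumulation order can differ
# from a hand-written matmul (or change between PyTorch versions), so
# bit-exact reference outputs may become stale after an upgrade.
y1 = F.linear(x, w, b)
y2 = x @ w.t() + b  # mathematically equivalent formulation

print(torch.equal(y1, y2))                # may be False (not bit-exact)
print(torch.allclose(y1, y2, atol=1e-6))  # expected True (tolerance-equal)
```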
Changes
- Update PyTorch to 2.7.0
- Regenerate reference data for test_generate_text_data_functional, since the output of hf-internal-testing/tiny-random-gpt2 changed (see the sketch below)
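As a hypothetical illustration of why the reference data needs regenerating, a short generation run with hf-internal-testing/tiny-random-gpt2; the prompt, seed, and max_new_tokens are assumptions, not the test's actual parameters. Even greedy decoding can produce different token ids when the underlying PyTorch kernels change.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hf-internal-testing/tiny-random-gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

torch.manual_seed(0)
inputs = tokenizer("Hello world", return_tensors="pt")
with torch.no_grad():
    # Greedy decoding; the generated ids can still shift across PyTorch
    # versions, which is why the reference data is regenerated here.
    out = model.generate(**inputs, max_new_tokens=8, do_sample=False)

print(tokenizer.decode(out[0]))
```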
Tests
- manual/job/post_training_quantization/661/ - fail: FX on save compressed models
- nightly/job/TriggerBetta/1029/ -
- examples:
  - wc - pass
  - Test_Install - pass