fix torchao quantizer for new torchao versions #12901
sayakpaul merged 2 commits into huggingface:main
Conversation
Summary: `torchao==0.16.0` (not yet released) has some bc-breaking changes; this PR updates the diffusers repo for those changes. Specifics on the changes:

1. `UInt4Tensor` is removed: pytorch/ao#3536
2. The old v1 float8 tensors are removed: pytorch/ao#3510

In this PR:

1. Move the `logger` variable up (it was previously defined in the middle of the file) to get better error messages.
2. Gate the old torchao objects behind a torchao version check.

Test Plan: importing diffusers objects with a new version of torchao works:

```bash
> python -c "import torchao; print(torchao.__version__); from diffusers import StableDiffusionPipeline"
0.16.0.dev20251229+cu129
```
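The version gate in point 2 can be sketched as below. This is a hedged illustration, not the actual diffusers code: the helper names (`release_tuple`, `torchao_has_legacy_tensors`) are mine, and the real implementation may compare versions differently. One subtlety worth encoding: a `0.16.0.dev` build (like the one in the test plan) already lacks the removed objects, so the comparison should ignore pre-release suffixes rather than treat dev builds as older than `0.16.0`.

```python
# Hedged sketch of a torchao version gate (helper names are illustrative,
# not the actual diffusers implementation).

def release_tuple(v: str) -> tuple:
    """Extract the numeric release segment of a version string,
    e.g. '0.16.0.dev20251229+cu129' -> (0, 16, 0)."""
    parts = []
    for p in v.split("+")[0].split("."):
        if not p.isdigit():
            break  # stop at 'dev...', 'rc...', etc.
        parts.append(int(p))
    return tuple(parts)


def torchao_has_legacy_tensors(torchao_version: str) -> bool:
    """True when UInt4Tensor and the v1 float8 tensors are still importable,
    i.e. the installed torchao predates the 0.16.0 removals.

    Dev builds of 0.16.0 already have the removals, so they count as 0.16.0.
    """
    return release_tuple(torchao_version) < (0, 16, 0)
```

With this shape, imports of the removed objects can be wrapped in `if torchao_has_legacy_tensors(torchao.__version__): ...` so older installs keep working.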
cc @sayakpaul, would you know who can help review this?
sayakpaul
left a comment
Thanks!
No other changes needed to ensure 0.16 release is smooth?
@bot /style
Style bot fixed some files and pushed the changes.
@vkuzo Just a question. With the changes introduced here, would the following break?

```python
import torch
from diffusers import FluxTransformer2DModel, TorchAoConfig

# assuming this was saved using torchao<=0.15.0
transformer = FluxTransformer2DModel.from_pretrained(
    "hf-internal-testing/tiny-flux-transformer",
    quantization_config=TorchAoConfig("uint4wo"),
    torch_dtype=torch.bfloat16,
)
save_path = "<path to checkpoint>"
transformer.save_pretrained(save_path, safe_serialization=False)
del transformer

# this will break with 0.16.0 since UInt4Tensor is no longer in the allowed safe globals?
FluxTransformer2DModel.from_pretrained(save_path)
```
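The failure mode being asked about can be emulated without torch: an allow-list-based unpickler (roughly how `torch.load(weights_only=True)` plus `torch.serialization.add_safe_globals` behaves) refuses to resolve any class that was not registered, so a checkpoint pickled against a since-removed class cannot be loaded. A minimal stdlib sketch with illustrative names:

```python
import io
import pickle


class UInt4TensorStub:
    """Stand-in for a tensor subclass a checkpoint may reference (illustrative)."""
    def __init__(self, data):
        self.data = data


class AllowlistUnpickler(pickle.Unpickler):
    """Only resolves classes that were explicitly registered, mimicking how a
    safe-globals allow-list restricts weights-only loading."""
    def __init__(self, file, allowed):
        super().__init__(file)
        self._allowed = {(c.__module__, c.__qualname__): c for c in allowed}

    def find_class(self, module, name):
        cls = self._allowed.get((module, name))
        if cls is None:
            raise pickle.UnpicklingError(f"{module}.{name} is not allow-listed")
        return cls


blob = pickle.dumps(UInt4TensorStub([1, 2, 3]))

# With the class registered, loading succeeds.
obj = AllowlistUnpickler(io.BytesIO(blob), allowed=[UInt4TensorStub]).load()

# Without it (as when a newer torchao no longer exports the class), it fails.
rejected = False
try:
    AllowlistUnpickler(io.BytesIO(blob), allowed=[]).load()
except pickle.UnpicklingError:
    rejected = True
```

The same logic explains why a checkpoint saved under torchao<=0.15.0 with `uint4wo` cannot be deserialized once `UInt4Tensor` no longer exists to be registered.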
Yes, that's correct! For test models, can we just regenerate them using the new v2 configs? I'm actually not sure which specific config is the new version of
The branch cut did not happen yet; I just made this fix for the things we know about so far, to unblock some internal work. We can follow up in a separate PR if there are any more bc-breaking changes before the branch cut. By the way, we are adding perf+accuracy benchmarks with diffusers to torchao here: pytorch/ao#3502
That looks great, just reviewed and commented! Thanks so much for working on it. |
* fix torchao quantizer for new torchao versions
* Apply style fixes

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
…ixes huggingface#13104) Add a test to prevent regression of the NameError that occurred when `logger` was used inside `_update_torch_safe_globals()` before being defined at module level. The fix (moving `logger` before the function) was included in PR huggingface#12901, but this test ensures it cannot regress.
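The NameError class that this regression test guards against can be reproduced with the stdlib alone. The sketch below is illustrative (the function name stands in for `_update_torch_safe_globals`): module code that calls a function at import time, before the module-level `logger` the function uses has been assigned.

```python
import types

# Module source that mirrors the bug: the function body references `logger`,
# but the call happens at import time, before `logger` is defined below it.
module_src = """
def _update_globals():
    logger.warning("updating")  # NameError: logger is defined further down

_update_globals()  # executed at import time, before logger exists

import logging
logger = logging.getLogger(__name__)
"""

mod = types.ModuleType("demo_module")
caught = None
try:
    exec(module_src, mod.__dict__)
except NameError as e:
    caught = str(e)
```

Moving the `logger = logging.getLogger(...)` line above the first call (as the fix in #12901 did) makes the import succeed; a regression test only needs to import the module and assert no exception is raised.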