Generalize FakeQuantizer beyond intx #2714

Merged: 1 commit merged into main on Aug 8, 2025
Conversation

@andrewor14 andrewor14 (Contributor) commented Aug 7, 2025

Stack from ghstack (oldest at bottom):

**Summary:** Similar to #2628,
but for `FakeQuantizer`. It is cleaner to isolate the logic of
each quantizer in separate classes, e.g. intx vs nvfp4 vs fp8.
Naming change:

```
FakeQuantizer -> IntxFakeQuantizer
```

**BC-breaking notes:** This is technically not BC-breaking yet, since we are only deprecating the old APIs while keeping them around. It will become BC-breaking once we remove the old APIs in the future, per #2630.

Before:
```
config = IntxFakeQuantizeConfig(torch.int8, "per_channel")
FakeQuantizer(config)
```

After:
```
config = IntxFakeQuantizeConfig(torch.int8, "per_channel")
IntxFakeQuantizer(config) # or
FakeQuantizerBase.from_config(config)
```
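
For reference, here is a minimal, self-contained sketch of the dispatch pattern implied by `FakeQuantizerBase.from_config`. The stand-in config class and the nvfp4/fp8 branch hinted at in the comment are illustrative assumptions, not the PR's actual implementation:

```python
# Sketch only: the real classes live in torchao's QAT module; the stand-in
# config and the commented-out nvfp4 branch are illustrative assumptions.
from dataclasses import dataclass

import torch


@dataclass
class IntxFakeQuantizeConfig:  # stand-in for the real config class
    dtype: torch.dtype
    granularity: str


class FakeQuantizerBase(torch.nn.Module):
    """Common base; each dtype family (intx, nvfp4, fp8, ...) subclasses it."""

    @staticmethod
    def from_config(config) -> "FakeQuantizerBase":
        # Dispatch on the config type so callers never need to pick the
        # concrete fake quantizer class themselves.
        if isinstance(config, IntxFakeQuantizeConfig):
            return IntxFakeQuantizer(config)
        # elif isinstance(config, NVFP4FakeQuantizeConfig): ...
        raise ValueError(f"Unsupported fake quantize config: {type(config)}")


class IntxFakeQuantizer(FakeQuantizerBase):
    def __init__(self, config: IntxFakeQuantizeConfig):
        super().__init__()
        self.config = config

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # A real implementation would fake-quantize x according to self.config.
        return x


config = IntxFakeQuantizeConfig(torch.int8, "per_channel")
fq = FakeQuantizerBase.from_config(config)
assert isinstance(fq, IntxFakeQuantizer)
```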

**Test Plan:**
```
python test/quantization/test_qat.py
```
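
On the BC-breaking note above: a minimal sketch of one common deprecation pattern, assuming the old name is kept as a thin alias of the new class. Whether the PR uses exactly this mechanism is an assumption, not something stated in the description:

```python
# Sketch of a deprecation shim; the real torchao classes are not reproduced here.
import warnings


class IntxFakeQuantizer:
    """Stand-in for the renamed class."""

    def __init__(self, config):
        self.config = config


class FakeQuantizer(IntxFakeQuantizer):
    """Deprecated alias kept around so existing call sites keep working."""

    def __init__(self, config):
        warnings.warn(
            "FakeQuantizer is deprecated and will be removed in a future "
            "release; please use IntxFakeQuantizer instead",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(config)
```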

pytorch-bot bot commented Aug 7, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2714

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 1 Cancelled Job

As of commit 33a8305 with merge base 246b142:

NEW FAILURE - one job failed.

CANCELLED JOB - one job was cancelled. Please retry.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

andrewor14 added a commit that referenced this pull request Aug 7, 2025
@meta-cla meta-cla bot added the "CLA Signed" label Aug 7, 2025
@andrewor14 andrewor14 added the "topic: improvement" label Aug 7, 2025
@jerryzh168 jerryzh168 added and then removed the "topic: bc-breaking" label Aug 7, 2025
@jerryzh168 (Contributor)

if not BC-breaking, we probably don't need a bc-breaking note; it seems more like a deprecation note

@andrewor14 andrewor14 changed the base branch from gh/andrewor14/17/base to main August 8, 2025 15:55
@andrewor14 andrewor14 merged commit 6cfa477 into main Aug 8, 2025
39 of 43 checks passed