
Conversation

lisjin (Contributor) commented on Dec 3, 2025

Adding our pruning-aware training (PAT) library as a prototype. The original library is under fairinternal/qpat but we would like to surface it in torchao for broader adoption.

The interface is almost identical to torchao.prototype.parq, but we use (group) Lasso instead of piecewise-affine regularization. More details on code organization and usage can be found in the README.
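As a rough illustration only (this is not the library's actual interface; see the README for real usage), below is a minimal sketch of how a group Lasso penalty encourages structured sparsity during training. All names and values in the snippet are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def group_lasso_penalty(weight: torch.Tensor, group_dim: int = 0) -> torch.Tensor:
    """Sum of L2 norms over weight groups (here: output channels).

    Minimizing this term shrinks entire groups toward zero, which yields
    structured sparsity that can later be pruned.
    """
    reduce_dims = [d for d in range(weight.dim()) if d != group_dim]
    return weight.norm(p=2, dim=reduce_dims).sum()


# Toy training step: task loss plus a weighted group Lasso regularizer.
model = nn.Linear(16, 8)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
lam = 1e-3  # regularization strength (hypothetical value)

x, y = torch.randn(4, 16), torch.randn(4, 8)
loss = F.mse_loss(model(x), y) + lam * group_lasso_penalty(model.weight)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In practice the library may apply the regularizer through an optimizer-level interface rather than a hand-written penalty term (the README describes the actual usage), but the sketch shows why group Lasso, unlike an elementwise L1 penalty, drives whole groups to zero so they can be removed.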

lisjin requested a review from andrewor14 on December 3, 2025, 21:54
pytorch-bot (bot) commented on Dec 3, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3429

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 4f78b65 with merge base 51fd90e:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla bot added the CLA Signed label on Dec 3, 2025
lisjin added the topic: new feature label on Dec 3, 2025
lisjin (Contributor, Author) commented on Dec 8, 2025

@andrewor14 Let me know if anything needs to be cleared up in this diff. I'm hoping to update D88501706 so that it imports from torchao.prototype.pat instead of copying code.

meta-codesync (bot) commented on Dec 8, 2025

@lisjin has imported this pull request. If you are a Meta employee, you can view this in D88638093.
