Add support for higher-order derivatives #233
Conversation
lkdvos left a comment:
I have only one minor comment, otherwise happy to merge this.
```julia
@non_differentiable TensorOperations.tensorcontract_structure(args...)
@non_differentiable TensorOperations.tensorcontract_type(args...)
@non_differentiable TensorOperations.tensoralloc_contract(args...)
@non_differentiable Base.promote_op(args...)
```
Do you know if your example also works if this isn't included? We can't really do this here since that is type piracy, so we would have to find an alternative way around this...
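To make the piracy concern concrete: `@non_differentiable Base.promote_op(args...)` adds `frule`/`rrule` methods where `rrule` belongs to ChainRulesCore and `Base.promote_op` to Base, so TensorOperations owns no part of the signature. A rough sketch of what the macro expands to (schematic, not its literal output):

```julia
using ChainRulesCore

# Schematic expansion of `@non_differentiable Base.promote_op(args...)`.
# `rrule` is owned by ChainRulesCore and `promote_op` by Base, so defining
# this method from TensorOperations would be type piracy: it changes AD
# behavior of a Base function for every downstream package.
function ChainRulesCore.rrule(::typeof(Base.promote_op), args...)
    y = Base.promote_op(args...)
    # One NoTangent() for the function itself plus one per argument.
    promote_op_pullback(Δ) = ntuple(_ -> NoTangent(), length(args) + 1)
    return y, promote_op_pullback
end
```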
Unfortunately, the example fails if this is removed. To avoid type piracy, would this alternative be acceptable?
```julia
@non_differentiable TensorOperations.promote_contract(args...)
@non_differentiable TensorOperations.promote_add(args...)
```
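For reference, a minimal check of what `@non_differentiable` provides, using a hypothetical stand-in function rather than the package's own (an assumption for illustration only):

```julia
using ChainRulesCore

# Hypothetical stand-in for a package-owned promotion helper.
my_promote(T1, T2) = promote_type(T1, T2)

# Generates frule/rrule methods whose (co)tangents are all NoTangent(),
# so AD backends treat the call as a constant.
@non_differentiable my_promote(T1, T2)

y, pb = ChainRulesCore.rrule(my_promote, Float64, Float32)
y        # Float64
pb(1.0)  # (NoTangent(), NoTangent(), NoTangent())
```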
Yes, that is completely fine, thank you!
Thanks! Updated in the latest commit.
This closes #227 by the following changes:
- `@non_differentiable` for `Base.promote_op`
- `tensorscalar` to avoid in-place function
- remove unnecessary `return` from `rrule`
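A minimal sketch of the higher-order use case this enables, in the spirit of #227 (an assumed example with Zygote as the AD backend, not the issue's exact reproducer):

```julia
using TensorOperations, Zygote, LinearAlgebra

# Scalar-valued function built from a tensor contraction: f(A) = tr(A * A).
function f(A)
    @tensor B[i, j] := A[i, k] * A[k, j]
    return tr(B)
end

A = rand(3, 3)
g(A) = Zygote.gradient(f, A)[1]   # first-order: ∇f(A)
H = Zygote.jacobian(g, A)[1]      # second-order: Jacobian of the gradient
```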