Best practice when taking the grad for a PCG #127
Comments
Differentiating with respect to `options` is currently not supported; see #104 (comment). (And `options` could, in principle, be all kinds of things.) Do you need the derivative of the solution with respect to the initial values? Or do you need a derivative of something else with respect to …?
Hi @johannahaffner, …
I think you want to wrap your preconditioner in `jax.lax.stop_gradient`. As Johanna notes, differentiating with respect to `options` isn't supported. But I think in your case no gradients are probably what you want.
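(For concreteness, a minimal sketch of that wrapping, not taken from the thread itself: the SPD operator, the Jacobi-style preconditioner, and the tolerances are illustrative choices.)

```python
import jax
import jax.numpy as jnp
import lineax as lx


def solve_sum(params, b):
    # SPD operator built from the parameters we differentiate with respect to.
    A = jnp.diag(params) + 0.1 * jnp.eye(params.shape[0])
    operator = lx.MatrixLinearOperator(A, lx.positive_semidefinite_tag)
    # The preconditioner also depends on `params`, so cut its autodiff
    # connection before handing it to the solver through `options`.
    precond = lx.TaggedLinearOperator(
        lx.DiagonalLinearOperator(1.0 / jnp.diag(A)), lx.positive_semidefinite_tag
    )
    precond = jax.lax.stop_gradient(precond)  # maps over the operator's arrays
    sol = lx.linear_solve(
        operator, b, lx.CG(rtol=1e-6, atol=1e-6),
        options={"preconditioner": precond},
    )
    return jnp.sum(sol.value)


params = jnp.arange(1.0, 5.0)
b = jnp.ones(4)
grads = jax.grad(solve_sum)(params, b)  # no "Unexpected tangent" error
```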
I had tried that, but since the preconditioner is a … (FWIW …)
Hmm, probably. Maybe we should just always apply such a stop-gradient to preconditioners? Mathematically the output should not depend on the preconditioner anyway, and it would save users from having to do this.
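(A quick numerical sanity check of that statement, using the same illustrative operator/preconditioner style as the snippet above: the preconditioner changes how fast CG converges, not what it converges to.)

```python
import jax.numpy as jnp
import lineax as lx

A = jnp.array([[4.0, 1.0], [1.0, 3.0]])
b = jnp.array([1.0, 2.0])
operator = lx.MatrixLinearOperator(A, lx.positive_semidefinite_tag)
solver = lx.CG(rtol=1e-8, atol=1e-8)

plain = lx.linear_solve(operator, b, solver)
jacobi = lx.TaggedLinearOperator(
    lx.DiagonalLinearOperator(1.0 / jnp.diag(A)), lx.positive_semidefinite_tag
)
preconditioned = lx.linear_solve(operator, b, solver, options={"preconditioner": jacobi})

# Both solves agree up to the solver tolerance.
print(jnp.allclose(plain.value, preconditioned.value, atol=1e-6))  # True
```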
I think so, yes. Example:
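(The snippet itself was not captured here; a minimal plain-JAX sketch of the behaviour being described.)

```python
import jax
import jax.numpy as jnp


def f(x):
    # The stop_gradient'd copy of x is treated as a constant under autodiff,
    # so no gradient flows back through it.
    return jnp.sum(jax.lax.stop_gradient(x) * x)


def g(x):
    # If every path from x to the output goes through a stop_gradient,
    # the whole gradient is zero.
    return jnp.sum(jax.lax.stop_gradient(x) ** 2)


x = jnp.array([1.0, 2.0, 3.0])
print(jax.grad(f)(x))  # [1. 2. 3.] rather than [2. 4. 6.]
print(jax.grad(g)(x))  # [0. 0. 0.]
```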
so stopping the gradient on an array sets every gradient that flows through it to 0.
Hi, thanks a lot for your amazing work with this package!

I currently have an issue when taking the auto-differentiation of `linear_solve` when a preconditioner is involved in `options`, as it leads to `RuntimeError: Unexpected tangent. lineax.linear_solve(..., options=...) cannot be autodifferentiated.`

Here is a minimal working example (thanks @ASKabalan for this!) to reproduce the error:
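(The original snippet is not preserved in this thread. Below is a hypothetical sketch of the kind of setup that triggers the error, essentially the earlier stop-gradient example without the `jax.lax.stop_gradient` wrapping; the operator and Jacobi-style preconditioner are illustrative.)

```python
import jax
import jax.numpy as jnp
import lineax as lx


def solve_sum(params, b):
    # SPD operator depending on the parameters being differentiated.
    A = jnp.diag(params) + 0.1 * jnp.eye(params.shape[0])
    operator = lx.MatrixLinearOperator(A, lx.positive_semidefinite_tag)
    # The preconditioner also carries a dependence on `params` and is passed
    # through `options`, which lineax treats as non-differentiable.
    precond = lx.TaggedLinearOperator(
        lx.DiagonalLinearOperator(1.0 / jnp.diag(A)), lx.positive_semidefinite_tag
    )
    sol = lx.linear_solve(
        operator, b, lx.CG(rtol=1e-6, atol=1e-6),
        options={"preconditioner": precond},
    )
    return jnp.sum(sol.value)


params = jnp.arange(1.0, 5.0)
b = jnp.ones(4)
# Raises: RuntimeError: Unexpected tangent. `lineax.linear_solve(..., options=...)`
# cannot be autodifferentiated.
jax.grad(solve_sum)(params, b)
```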
I am taking the differentiation of a `linear_solve` with a preconditioner in the context of a minimization (using `optax` or `optimistix`), where a `jax.grad` is computed at each step of the minimization and the preconditioner depends on the updated parameters. What would be the best practice in this context?
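(Not an official recommendation, just a sketch of how the stop-gradient suggestion from the comments above could slot into such a loop; the loss, operator, preconditioner, and `optax` optimizer are all placeholders.)

```python
import jax
import jax.numpy as jnp
import lineax as lx
import optax


def loss(params, b):
    # Operator and preconditioner are rebuilt from the current parameters at
    # every optimisation step.
    A = jnp.diag(params) + 0.1 * jnp.eye(params.shape[0])
    operator = lx.MatrixLinearOperator(A, lx.positive_semidefinite_tag)
    precond = lx.TaggedLinearOperator(
        lx.DiagonalLinearOperator(1.0 / jnp.diag(A)), lx.positive_semidefinite_tag
    )
    # Hide the preconditioner from autodiff: gradients then flow only
    # through `operator` and `b`.
    precond = jax.lax.stop_gradient(precond)
    sol = lx.linear_solve(
        operator, b, lx.CG(rtol=1e-6, atol=1e-6),
        options={"preconditioner": precond},
    )
    return jnp.sum(sol.value**2)


params = jnp.arange(1.0, 5.0)
b = jnp.ones(4)
opt = optax.adam(1e-2)
opt_state = opt.init(params)


@jax.jit
def step(params, opt_state):
    grads = jax.grad(loss)(params, b)
    updates, opt_state = opt.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state


for _ in range(100):
    params, opt_state = step(params, opt_state)
```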