Right now, parameter validation is done in the constructor, as follows:

pysindy/pysindy/optimizers/sr3.py (lines 138 to 163 at 3e8a445)

According to the scikit-learn documentation, validation should be done in the fit method, because calling set_params bypasses the validation in the constructor.
Reproducing code example:
```python
from pysindy.optimizers import SR3

opt = SR3(threshold=-1)
# raises "ValueError: threshold cannot be negative"
```

```python
from pysindy.optimizers import SR3

opt = SR3()
opt.set_params(threshold=-1)
# no error
```
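For reference, here is a minimal sketch of the pattern the scikit-learn developer guide describes, using a hypothetical estimator (not pysindy code): the constructor only stores attributes, and the checks run in fit, so a value changed later through set_params is still caught.

```python
from sklearn.base import BaseEstimator


class ThresholdedOptimizer(BaseEstimator):
    """Hypothetical estimator illustrating fit-time validation."""

    def __init__(self, threshold=0.1):
        # per the scikit-learn API, __init__ only stores the parameter
        self.threshold = threshold

    def fit(self, x, y=None):
        # validation happens here, so set_params(threshold=-1) is caught
        # the next time fit is called instead of being silently accepted
        if self.threshold < 0:
            raise ValueError("threshold cannot be negative")
        # ... fitting logic would go here ...
        return self
```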
This is a good point, although fairly low impact, since I believe set_params() is only used in grid search, which is not done much with SINDy, and we can cheat by doing the same validation in set_params() that we do in __init__() (see the sketch after the list below). That said, this is something we want, and the way to do it is:
1. List out all classes that break the scikit-learn API in differentiation, feature library, and optimizer.
2. Extract all validation into a helper function, refactored to the end of __init__().
3. Modify tests for bad argument combinations to require __init__ to pass, and then move the validation call from __init__ into fit so the error is raised there instead.
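As a rough illustration of the set_params() cheat and the helper from step 2, here is a hypothetical sketch; the class name, the _check_params helper, and the single check shown are placeholders, not the actual pysindy code.

```python
from sklearn.base import BaseEstimator


class SR3Like(BaseEstimator):
    """Illustrative stand-in for an SR3-style optimizer."""

    def __init__(self, threshold=0.1):
        # step 2: the checks move out of __init__ into a shared helper
        self.threshold = threshold

    def _check_params(self):
        # hypothetical helper collecting the validation that currently
        # lives in SR3.__init__ (only one check shown here)
        if self.threshold < 0:
            raise ValueError("threshold cannot be negative")

    def set_params(self, **params):
        # the short-term "cheat": re-run the same checks whenever
        # parameters are changed after construction
        super().set_params(**params)
        self._check_params()
        return self
```

Longer term, the same helper would be called at the start of fit (as in the earlier sketch), and the modified tests would assert that something like SR3(threshold=-1) constructs without error while the ValueError surfaces at fit time.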