Constraint not being followed [Resolved] #717
Hello! I’ve been running into an issue with PySRRegressor where the returned functions occasionally produce complex values. Is it possible to restrict the search to functions that return only real values over the range of the x-vector, i.e., for every xi between the minimum and maximum of x? Although the imaginary component is usually small, I cannot work with complex numbers, and I would prefer to keep the potentially problematic operators such as "pow" and "log". So far, the only workaround I have considered is constraining subtraction and addition inside the "pow" and "log" operators (roughly sketched below), but I'm not confident this would guarantee real-valued functions, and it might lead to less optimal solutions. Thank you in advance!
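In PySR terms, I imagine that idea would look roughly like the sketch below; the operator choices and complexity limits are only placeholders, not settings I have actually validated.

```python
from pysr import PySRRegressor

# Rough sketch of the idea above (placeholder limits, not tuned values).
model = PySRRegressor(
    binary_operators=["+", "-", "*", "/", "^"],
    unary_operators=["log"],
    # Cap the complexity of the exponent of "^" (base unrestricted, -1):
    constraints={"^": (-1, 1)},
    # Forbid "+" and "-" from appearing inside "^" and "log"
    # (assuming nested_constraints accepts these operator strings as keys):
    nested_constraints={
        "^": {"+": 0, "-": 0},
        "log": {"+": 0, "-": 0},
    },
)
```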
Hi @Sorah-Darkhat, PySRRegressor will only return complex values if you train it on complex data, so be sure to verify that you are passing real-valued data; e.g., you can check the dtype of your inputs before fitting (quick sketch below). Cheers,
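For instance, a minimal check along these lines (where `X` and `y` stand in for your own training arrays):

```python
import numpy as np

# Placeholder data; substitute your actual training arrays.
X = np.random.rand(100, 2)
y = X[:, 0] ** 1.5 + np.log(X[:, 1] + 1.0)

print(X.dtype, y.dtype)        # should be real float dtypes, e.g. float64
assert not np.iscomplexobj(X)  # fails if the features are complex
assert not np.iscomplexobj(y)  # fails if the target is complex
```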
Thank you for your answer,
I'm having trouble reproducing the complex values within the domain of the x-vector, so I'm left believing that I was evaluating the function outside that domain.
I've also removed the "log" operator, but I've kept the "^" operator, and sometimes PySRRegressor returns functions of the form:
(C1 - x) ^ C2
where C1 and C2 are floats, and for values of x > C1 I get complex results.
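For instance (the constants here are made up, not from an actual run):

```python
import numpy as np

C1, C2 = 2.0, 0.5   # made-up constants for illustration
x = 3.0             # a point with x > C1

# Plain Python floats give a complex number for a negative base raised to a
# non-integer power:
print((C1 - x) ** C2)          # (6.12e-17+1j)

# A real-dtype NumPy array gives nan (with a RuntimeWarning) instead:
x_arr = np.array([1.0, 3.0])
print((C1 - x_arr) ** C2)      # [ 1. nan]
```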
I know it's possible to pass constraints, so I've added the following:
constraints = { "^" : [1, 3] }, but the constraint doesn't seem to be applied.
(I'm loading the model configuration from a JSON file, and I'm also running with the experimental turbo and bumper options.)
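For reference, here is a stripped-down version of how I understand the constraint should be passed when constructing the model directly in Python (everything here is illustrative rather than my full configuration):

```python
from pysr import PySRRegressor

# Illustrative settings only; my real run loads these from a JSON file.
model = PySRRegressor(
    binary_operators=["+", "-", "*", "/", "^"],
    constraints={"^": (1, 3)},  # limit complexity of the base and exponent of "^"
    turbo=True,                 # experimental speed-ups I'm also using
    bumper=True,
)
```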
This is the…