Constraining asymptotic behavior #324
Replies: 3 comments 3 replies
-
Great question. Actually, the simplest and perhaps best way to constrain asymptotic behavior is to add datapoints in the direction of the asymptotes. While this wouldn't work for other ML techniques, for SR it is actually very effective. For example, for your constraint, I would add:

```python
large_number = 100_000
x = np.append(x, large_number)
y = np.append(y, 0)

large_number_2 = 1_000_000
x = np.append(x, large_number_2)
y = np.append(y, 0)
```

assuming that your target asymptote is y → 0 as x → ∞. You can also add more points like these.
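A fuller sketch of the same trick, assuming `x` and `y` are 1-D numpy arrays and using a hypothetical decaying target for illustration. Upweighting the anchor points (via the `weights` argument of `PySRRegressor.fit`) is optional but can help the search take them seriously:

```python
import numpy as np

# Hypothetical data: a decaying function we want to fit with SR.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = np.exp(-x) + rng.normal(scale=0.01, size=100)

# Anchor the asymptote y -> 0 as x -> infinity by appending
# pseudo-datapoints at very large x:
anchors_x = np.array([1e5, 1e6])
anchors_y = np.zeros_like(anchors_x)
x_aug = np.append(x, anchors_x)
y_aug = np.append(y, anchors_y)

# Optionally give the anchor points extra weight:
weights = np.append(np.ones_like(x), 10.0 * np.ones_like(anchors_x))

# Then fit as usual, e.g.:
#   model = PySRRegressor(...)
#   model.fit(x_aug.reshape(-1, 1), y_aug, weights=weights)
```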
-
One other option is to try this package: https://github.com/JuliaMath/Richardson.jl which can numerically estimate limits. This is perhaps the "proper" way to do it, but I'm not sure how effective it would be. You can see how to install Julia packages into PySR in this example: https://astroautomata.com/PySR/examples/#7-julia-packages-and-types (or one could also use the Julia backend directly). Here's an example of using Richardson:

```julia
using Richardson

function my_custom_objective(tree, dataset::Dataset{T,L}, options)::L where {T,L}
    prediction, flag = eval_tree_array(tree, dataset.X, options)
    if !flag
        return L(Inf)
    end

    # Numerically estimate the limit of the expression as x -> Inf:
    limit_val, precision = extrapolate(1.0, x0=Inf) do x
        out, completed = eval_tree_array(tree, [x]', options)
        !completed && (out .= NaN)
        return out[1]
    end
    !isfinite(limit_val) && return L(Inf)

    prediction_loss = sum((prediction .- dataset.y) .^ 2) / dataset.n
    asymptote_loss = (limit_val - 0.0)^2  # penalize deviation from the target limit of 0
    loss = prediction_loss + 100 * asymptote_loss
    return loss
end
```
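To make the structure of that objective concrete outside of Julia, here is the same idea in plain numpy. This is only a sketch, not PySR's API: the limit is crudely estimated by evaluating the candidate function at a single very large x rather than by proper Richardson extrapolation, and `asymptote_penalized_loss` is a hypothetical name:

```python
import numpy as np

def asymptote_penalized_loss(f, x, y, target_limit=0.0, big_x=1e8, weight=100.0):
    """Data MSE plus a penalty on the candidate's value at very large x,
    mirroring the Julia objective above (crude stand-in for extrapolate)."""
    pred = f(x)
    if not np.all(np.isfinite(pred)):
        return np.inf  # analogue of `!flag && return L(Inf)`
    limit_est = f(np.array([big_x]))[0]  # crude limit estimate at x -> Inf
    if not np.isfinite(limit_est):
        return np.inf
    prediction_loss = np.mean((pred - y) ** 2)
    asymptote_loss = (limit_est - target_limit) ** 2
    return prediction_loss + weight * asymptote_loss
```

A candidate that decays to 0 incurs essentially no penalty, while one with a nonzero limit pays roughly `weight * limit**2` on top of its data loss. In PySR itself, a Julia objective like the one above can (in recent versions) be passed as a string via the `loss_function` argument of `PySRRegressor`.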
-
Dear @MilesCranmer, Thank you for the prompt and super-helpful replies! -Dominik
-
Moved from #276:
@dominik-rehse: