-
You can write a custom loss function that penalizes an expression if it doesn't use the variable:

```julia
function my_loss(tree, dataset::Dataset{T,L}, options) where {T,L}
    prediction, flag = eval_tree_array(tree, dataset.X, options)
    if !flag
        return L(Inf)
    end
    y = dataset.y
    loss = L(sum(i -> abs2(prediction[i] - y[i]), eachindex(y)))
    # Check whether any node in the tree is feature 5
    # (note that Julia indexes from 1 rather than 0)
    contains_x5 = any(
        node -> (
            node.degree == 0
            && !(node.constant)
            && node.feature == 5
        ),
        tree,
    )
    if !contains_x5
        # Penalty term for omitting the required feature
        loss += L(1000)
    end
    return loss
end
```

Then you would pass this to the `loss_function` option.
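A sketch of wiring this into a search, assuming the SymbolicRegression.jl `Options` / `equation_search` API (the operator choices and iteration count here are illustrative, not from the original answer):

```julia
using SymbolicRegression

# Illustrative search configuration; swap in your own operators.
options = Options(;
    binary_operators=[+, -, *, /],
    unary_operators=[exp, log],
    loss_function=my_loss,  # custom objective defined above
)

# X is n_features-by-n_samples here, y a vector of targets.
hall_of_fame = equation_search(X, y; options=options, niterations=40)
```

Because the penalty (`1000`) is added to the summed squared error, it should be large relative to your typical unpenalized loss so that expressions missing the feature are reliably outcompeted.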
-
I'm using SR to model a thermodynamic process.
The input X is an n-by-m matrix, and some of the m features are very important and should be included in the final equation.
In the basic SR process, with X of shape (n_samples, n_features), is it possible to assign different weights to the features in X?
Or is there a way to force a specific feature to be included?