I noticed the following behaviour for the distribution-based KL and JS distances. In contrast to `euclidean` as a reference, the `kl_divergence` and `js_divergence` distances do not propagate `NaN`s, leading to negative distances in the examples below. I don't know if this is desired behaviour; in my opinion, they should propagate `NaN`s instead.
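A minimal sketch of how this can happen, assuming the KL distance masks out non-positive probabilities before summing (the actual library implementation may differ, and `kl_divergence` below is a hypothetical re-implementation). Since `np.nan > 0` evaluates to `False`, `NaN` entries are silently dropped by the mask instead of propagating, and the remaining terms can sum to a negative value:

```python
import numpy as np

def kl_divergence(p, q):
    # Hypothetical naive implementation: mask non-positive entries
    # to avoid log(0) / division by zero.
    # Caveat: np.nan > 0 is False, so NaN entries are silently
    # dropped from the sum rather than propagated.
    mask = (p > 0) & (q > 0)
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.2, np.nan, 0.2])
q = np.array([0.5, 0.0, 0.5])

print(kl_divergence(p, q))    # finite and negative: NaN was ignored
print(np.linalg.norm(p - q))  # nan: euclidean propagates the NaN
```

A `NaN`-propagating variant would check `np.isnan(p).any() or np.isnan(q).any()` up front and return `np.nan`, matching the behaviour of `euclidean`.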