
Commit

add within_gradient
mcabbott committed Aug 30, 2022
1 parent 806b0ef commit 5fd4a91
Showing 4 changed files with 22 additions and 1 deletion.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
 name = "NNlib"
 uuid = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
-version = "0.8.9"
+version = "0.8.10"

 [deps]
 Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
1 change: 1 addition & 0 deletions docs/src/reference.md
@@ -130,4 +130,5 @@ ctc_loss
 ```@docs
 logsumexp
 NNlib.glu
+NNlib.within_gradient
 ```
15 changes: 15 additions & 0 deletions src/utils.jl
@@ -1,3 +1,18 @@
"""
within_gradient(x) --> Bool
Returns `false` except when used inside a `gradient` call, when it returns `true`.
Useful for Flux regularisation layers which behave differently during training and inference.
Works with any ChainRules-based differentiation package, in which case `x` is ignored.
But Tracker.jl overloads `with_gradient(x::TrackedArray)`, thus for widest use you should
pass it an array whose gradient is of interest.
"""
within_gradient(x) = false

ChainRulesCore.rrule(::typeof(within_gradient), x) = true, _ -> (NoTangent(), NoTangent())


"""
safe_div(x, y)
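The `rrule` above is what flips the answer: under any ChainRules-based AD package, differentiation uses the rule's primal value `true` in place of the ordinary `false`, and the pullback contributes no tangents. As a rough sketch of the intended pattern in a Flux-style regularisation layer (the `naive_dropout` function and its keyword `p` below are illustrative, not part of this commit or of NNlib):

```julia
using NNlib

# Hypothetical dropout-like function: applies a random mask only while a
# gradient is being taken, and is the identity during ordinary calls.
function naive_dropout(x::AbstractArray; p = 0.5)
    if NNlib.within_gradient(x)
        keep = rand(size(x)...) .> p      # keeps about (1 - p) of the entries
        return x .* keep ./ (1 - p)       # rescale so the expectation is unchanged
    else
        return x                          # inference: no dropout
    end
end
```

Called directly, `naive_dropout(x)` returns `x` unchanged; inside a Zygote `gradient` call the same code takes the masking branch, because the `rrule` reports `within_gradient` as `true`.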
5 changes: 5 additions & 0 deletions test/utils.jl
@@ -1,3 +1,8 @@
@testset "within_gradient" begin
@test NNlib.within_gradient([1.0]) === false
@test gradient(x -> NNlib.within_gradient(x) * x, 2.0) == (1.0,)
end

@testset "maximum_dims" begin
ind1 = [1,2,3,4,5,6]
@test NNlib.maximum_dims(ind1) == (6,)
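These tests cover only the ChainRules path, where `gradient` (presumably Zygote's, in NNlib's test suite) triggers the `rrule`. For Tracker.jl, which does not use ChainRules, the docstring's note suggests an overload roughly like the sketch below; this is an assumption about how such a package could opt in, not code from this commit:

```julia
import NNlib, Tracker

# Tracker represents values being differentiated as TrackedArrays, so
# dispatching on the argument type is enough to detect differentiation;
# this is why the docstring advises passing an array whose gradient matters.
NNlib.within_gradient(x::Tracker.TrackedArray) = true
```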
