Merge pull request #22 from GiggleLiu/adapt-dataview
adapt dataview
GiggleLiu authored Jun 28, 2020
2 parents a6e3eea + ca18e13 commit 0477e0f
Showing 35 changed files with 298 additions and 275 deletions.
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "NiLang"
uuid = "ab4ef3a6-0b42-11ea-31f6-e34652774712"
authors = ["JinGuo Liu", "thautwarm"]
version = "0.5.1"
version = "0.6.0"

[deps]
FixedPointNumbers = "53c48c17-4a7d-5ca2-90c5-79b7896eea93"
@@ -13,7 +13,7 @@ TupleTools = "9d95972d-f1c8-5527-a6e0-b4b365fa01f6"
[compat]
FixedPointNumbers = "0.6"
MatchCore = "0.1"
NiLangCore = "0.5,0.6"
NiLangCore = "0.7"
Reexport = "0.2"
TupleTools = "1.2"
julia = "1.3,1.4"
12 changes: 7 additions & 5 deletions README.md
@@ -3,9 +3,11 @@

NiLang.jl (逆lang) is a reversible domain-specific language (DSL) that allows a program to go back to the past.

* Requires Julia version >= 1.3.
* If test breaks, try using the master branch of `NiLangCore`.
* **The `'` notation has been removed recently!**
* Requires Julia version >= 1.3,
* If tests break, try using the master branch of `NiLangCore`,
* The `'` notation has been removed recently to avoid potential conflicts with other packages,
* Now a dataview is specified by `x |> bijection`, e.g. the previous `grad(x)` should now be written as `x |> grad` in the reversible context (see the sketch below).
* Our paper uses version v0.6, which might be different from the master branch.
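
A minimal sketch of the new notation (illustrative only; it assumes NiLang ≥ 0.6 with `NiLang.AD`, and `accum_grad` is a hypothetical function, not part of this changeset):

```julia
using NiLang, NiLang.AD

# Hypothetical example: inside a reversible function, the gradient field of a
# GVar is read through the `|>` dataview instead of the old `grad(x)` call.
@i function accum_grad(y!, x::GVar)
    y! += x |> grad   # previously written as: y! += identity(grad(x))
end
```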


NiLang features:
@@ -127,7 +129,7 @@ julia> y!, x = 0.0, 1.6
(0.0, 1.6)

# first order gradients
julia> @instr Grad(iexp)(Val(1), y!, x)
julia> @instr Grad(PlusEq(exp))(Val(1), y!, x)

julia> grad(x)
4.9530324244260555
@@ -138,7 +140,7 @@ julia> y!, x = 0.0, 1.6
# second order gradient by differentiate first order gradients
julia> using ForwardDiff: Dual

julia> _, hxy, hxx = Grad(iexp)(Val(1),
julia> _, hxy, hxx = Grad(PlusEq(exp))(Val(1),
Dual(y!, zero(y!)), Dual(x, one(x)));

julia> grad(hxx).partials[1]
8 changes: 4 additions & 4 deletions docs/src/extend.md
@@ -129,23 +129,23 @@ The general approach is *Binding the backward rule on its inverse*!
```julia
@i @inline function IROT(a!::GVar, b!::GVar, θ::GVar)
IROT(value(a!), value(b!), value(θ))
NEG(value(θ))
-(value(θ))
value(θ) -= π/2
ROT(grad(a!), grad(b!), value(θ))
grad(θ) += value(a!) * grad(a!)
grad(θ) += value(b!) * grad(b!)
value(θ) += π/2
NEG(value(θ))
-(value(θ))
ROT(grad(a!), grad(b!), π/2)
end

@i @inline function IROT(a!::GVar, b!::GVar, θ::Real)
IROT(value(a!), value(b!), θ)
NEG(θ)
-(θ)
θ -= π/2
ROT(grad(a!), grad(b!), θ)
θ += π/2
NEG(θ)
-(θ)
ROT(grad(a!), grad(b!), π/2)
end

2 changes: 1 addition & 1 deletion docs/src/grammar.md
@@ -90,7 +90,7 @@ DataViews : 0
DataView : DataView '[' <JuliaExpr> ']'
| DataView '.' <ident>
| <JuliaExpr> '(' DataView ')'
| DataView '|>' <JuliaExpr>
| DataView '\''
| '-' DataView
| Constant
2 changes: 1 addition & 1 deletion docs/src/instructions.md
@@ -23,7 +23,7 @@ The list of reversible instructions implemented in NiLang
| $y \mathrel{+}= \sin(x)$ | $y+\sin x, x$ |
| $y \mathrel{+}= \cos(x)$ | $y+\cos x, x$ |
| $y \mathrel{+}= {\rm abs}(x)$ | $y+ |x|, x$ |
| ${\rm NEG}(y)$ | $-y$ |
| $-(y)$ | $-y$ |
| ${\rm CONJ}(y)$ | $y'$ |

"." is the broadcasting operations in Julia.
4 changes: 2 additions & 2 deletions docs/src/tutorial.md
@@ -7,7 +7,7 @@
| x ← val | allocate a new variable `x`, with an initial value `val` (a constant). |
| x → val | deallocate variable `x` with content `val`. |
| x += f(y) | a reversible instruction. |
| x .+= f(y) | instruction call with broadcasting. |
| x .+= f.(y) | instruction call with broadcasting. |
| f(y) | a reversible function. |
| f.(y) | function call with broadcasting. |
| if (pre, post) ... end | if statement. |
@@ -19,7 +19,7 @@
| @routine ... | record a routine in the **routine stack**. |
| ~@routine | place the inverse of the routine on **routine stack** top. |

The condition in if and while statements are a bit hard to digest, please refer our paper [arXiv:2003.04617](https://arxiv.org/abs/2003.04617).
The condition expressions in **if** and **while** statements are a bit hard to digest; please refer to our paper [arXiv:2003.04617](https://arxiv.org/abs/2003.04617).
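
As a rough illustration (hypothetical functions, assuming NiLang ≥ 0.6; see the paper for the precise semantics), the first element of the condition tuple steers the forward run, the second steers the reversed run, and `~` means "reuse the forward condition":

```julia
using NiLang

# Hypothetical sketch: the forward run branches on `x > 0`; `~` makes the
# reversed run branch on the same (unchanged) condition.
@i function clipped_add(y!, x)
    if (x > 0, ~)
        y! += x
    end
end

# Hypothetical sketch: the forward run loops while `n! < cap`; the reversed run
# loops while `n! != 0`, which must be false before the loop and true after
# every iteration (here `n!` is assumed to start from 0).
@i function count_up(n!, cap)
    while (n! < cap, n! != 0)
        n! += 1
    end
end
```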

## A reversible program

46 changes: 32 additions & 14 deletions examples/Symbolics/print_jacobians.jl
@@ -2,23 +2,23 @@ using NiLang, NiLang.AD

include("symlib.jl")
NiLang.AD.isvar(sym::Basic) = true
NiLang.AD.GVar(sym::Basic) = GVar(sym, zero(sym))

# a patch for symbolic IROT
@i @inline function NiLang.IROT(a!::GVar{<:Basic}, b!::GVar{<:Basic}, θ::GVar{<:Basic})
IROT(value(a!), value(b!), value(θ))
NEG(value(θ))
value(θ) -= Basic(π)/2
ROT(grad(a!), grad(b!), value(θ))
grad(θ) += value(a!) * grad(a!)
grad(θ) += value(b!) * grad(b!)
value(θ) += Basic(π)/2
NEG(value(θ))
ROT(grad(a!), grad(b!), Basic(π)/2)
IROT(a!.x, b!.x, θ.x)
-(θ.x)
θ.x -= Basic(π)/2
ROT(a!.g, b!.g, θ.x)
θ.g += a!.x * a!.g
θ.g += b!.x * b!.g
θ.x += Basic(π)/2
-(θ.x)
ROT(a!.g, b!.g, Basic(π)/2)
end

NiLang.INC(x::Basic) = x + one(x)
NiLang.DEC(x::Basic) = x - one(x)
NiLang.NEG(x::Basic) = -x
@inline function NiLang.ROT(i::Basic, j::Basic, θ::Basic)
a, b = rot(i, j, θ)
a, b, θ
@@ -30,25 +30,43 @@ end
Base.sincos(x::Basic) = (sin(x), cos(x))

function printall()
syms = (Basic(:a), Basic(:b), Basic(:c))
syms = [Basic(:a), Basic(:b), Basic(:c)]

for subop in [identity, *, /, ^, exp, log, sin, cos]
for opm in [⊕, ⊖]
@show opm
op = opm(subop)
@show op
printone(op, syms)
end
end
for op in [NEG, ROT, IROT]
for op in [-, ROT, IROT]
printone(op, syms)
end
# abs, conj
end

@i function jf1(op, x)
op(x[1])
end

@i function jf2(op, x)
op(x[1], x[2])
end

@i function jf3(op, x)
op(x[1], x[2], x[3])
end

"""print the jacobian of one operator"""
function printone(op, syms)
n = nargs(op)
jac = jacobian_repeat(Basic, op, syms[1:nargs(op)])
if n==1
jac = jacobian_repeat(jf1, op, syms[1:1]; iin=2, iout=2)
elseif n==2
jac = jacobian_repeat(jf2, op, syms[1:2]; iin=2, iout=2)
elseif n==3
jac = jacobian_repeat(jf3, op, syms[1:3]; iin=2, iout=2)
end
println("------ $op ------")
pretty_print_matrix(jac)
end
5 changes: 1 addition & 4 deletions examples/Symbolics/symbolic_utils.jl
@@ -7,10 +7,7 @@ const SymReal = Sym{Real}
const TermReal = Term{Real}
const SReals = Union{Term{Real}, Sym{Real}}

import NiLang: NEG, INC, DEC, ROT, IROT, FLIP
@inline function NEG(a!::SReals)
-a!
end
import NiLang: INC, DEC, ROT, IROT, FLIP
@inline FLIP(b::Sym{Bool}) = !b

@inline function INC(a!::SReals)
12 changes: 6 additions & 6 deletions examples/batched_tr.jl
@@ -1,5 +1,5 @@
using NiLang, NiLang.AD
using KernelAbstractions, CuArrays
using KernelAbstractions, CUDA

@i @kernel function kernel_f(A, B::AbstractVector{TB}) where TB
# turn off the reversibility check, since the GPU cannot handle errors
Expand All @@ -9,24 +9,24 @@ using KernelAbstractions, CuArrays
s ← zero(TB)
# computing
for i in axes(A, 1)
s += identity(A[i, i, batch])
s += A[i, i, batch]
end
B[batch] += identity(s)
B[batch] += s
# deallocate safely
s → zero(TB)
batch → @index(Global)
end
end

@i function batched_tr!(A::CuArray{T, 3}, B::CuVector{T}) where T
@launchkernel CUDA() 256 length(B) kernel_f(A, B)
@launchkernel CUDADevice() 256 length(B) kernel_f(A, B)
end

A = CuArray(randn(ComplexF32, 10, 10, 100))
B = CuArrays.zeros(ComplexF32, 100)
B = CUDA.zeros(ComplexF32, 100)
A_out, B_out = batched_tr!(A, B)
# put random values in the gradient field of B
grad_B = CuArrays.randn(Float32, 100) + im*CuArrays.randn(Float32, 100)
grad_B = CuArray(randn(ComplexF32, 100))
A_with_g, B_with_g = (~batched_tr!)(GVar(A_out), GVar(B_out, grad_B))
# will see nonzero gradients in complex diagonal parts of A
grad_A = grad(A_with_g |> Array)
31 changes: 14 additions & 17 deletions examples/besselj.jl
@@ -18,46 +18,43 @@ using NiLang, NiLang.AD
SWAP(out!, anc!)
end

@i @inline function imul(out!::Int, x::Int, anc!::Int)
anc! += out! * x
out! -= anc! ÷ x
SWAP(out!, anc!)
end

# Here, the definition of SWAP can be found in \App{app:instr}, ``anc! \approx 0`` is a *dirty ancilla*.
# Line 2 computes the result and accumulates it to the dirty ancilla, we get an approximately correct output in **anc!**.
# Line 3 "uncomputes" **out!** approximately by using the information stored in **anc!**, leaving a dirty zero state in register **out!**.
# Line 4 swaps the contents in **out!** and **anc!**.
# Finally, we have an approximately correct output and a dirtier ancilla.
# With this multiplier, we implement ``J_\nu`` as follows.

@i function ibesselj(out!, ν, z; atol=1e-8)
@i function ibesselj(out!::T, ν, z::T; atol=1e-8) where T
@routine @invcheckoff begin
k ← 0
fact_nu ← zero(ν)
halfz ← zero(z)
halfz_power_nu ← zero(z)
halfz_power_2 ← zero(z)
out_anc ← zero(z)
anc1 ← zero(z)
anc2 ← zero(z)
anc3 ← zero(z)
anc4 ← zero(z)
anc5 ← zero(z)
@zeros T halfz halfz_power_nu halfz_power_2 out_anc anc1 anc2 anc3 anc4 anc5
halfz += z / 2
halfz_power_nu += halfz ^ ν
halfz_power_2 += halfz ^ 2
ifactorial(fact_nu, ν)
anc1 += halfz_power_nu/fact_nu
out_anc += identity(anc1)
out_anc += anc1
while (abs(unwrap(anc1)) > atol && abs(unwrap(anc4)) < atol, k!=0)
INC(k)
@routine begin
anc5 += identity(k)
anc5 += identity(ν)
anc5 += k + ν
anc2 -= k * anc5
anc3 += halfz_power_2 / anc2
end
imul(anc1, anc3, anc4)
out_anc += identity(anc1)
out_anc += anc1
~@routine
end
end
out! += identity(out_anc)
out! += out_anc
~@routine
end

@@ -66,7 +63,7 @@
@i function ifactorial(out!, n)
INC(out!)
for i=1:n
mulint(out!, i)
imul(out!, i, 0)
end
end

@@ -99,7 +96,7 @@ println("The hessian dy^2/dx^2 is $(grad(hxx).partials[1])")

# ```julia
# using CuArrays, GPUArrays, KernelAbstractions
#
#
# @i @kernel function bessel_kernel(out!, v, z)
# @invcheckoff i ← @index(Global)
# ibesselj(out![i], v, z[i])
2 changes: 1 addition & 1 deletion examples/boxmuller.jl
@@ -32,7 +32,7 @@ using NiLang
_halfsq ← zero(T)
at += atan(y, x)
if (y < 0, ~)
at += identity(T(2π))
at += T(2π)
end
sq += x ^ 2
sq += y ^ 2
10 changes: 4 additions & 6 deletions examples/fib.jl
@@ -6,13 +6,11 @@ using NiLang
n1 ← zero(T)
n2 ← zero(T)
@routine begin
n1 += identity(n)
n1 -= identity(1)
n2 += identity(n)
n2 -= identity(2)
n1 += n - 1
n2 += n - 2
end
if (value(n) <= 2, ~)
out! += identity(1)
out! += 1
else
rfib(out!, n1)
rfib(out!, n2)
@@ -29,7 +27,7 @@ end
rfib(out, n!)
while (out < z, n! != 0)
~rfib(out, n!)
n! += identity(1)
n! += 1
rfib(out, n!)
end
~rfib(out, n!)
4 changes: 2 additions & 2 deletions examples/nice.jl
@@ -31,7 +31,7 @@ NiLang.AD.GVar(x::NiceLayer) = NiceLayer(GVar(x.W1), GVar(x.b1), GVar(x.W2), GVa
affine!(layer.y1, layer.W1, layer.b1, x)
@inbounds for i=1:length(layer.y1)
if (layer.y1[i] > 0, ~)
layer.y1a[i] += identity(layer.y1[i])
layer.y1a[i] += layer.y1[i]
end
end
end
@@ -56,7 +56,7 @@ end
end
end
@invcheckoff for i=1:size(W, 1)
@inbounds y![i] += identity(b[i])
@inbounds y![i] += b[i]
end
end

