This repository has been archived by the owner on Dec 18, 2021. It is now read-only.

Batched block #38

Open
GiggleLiu opened this issue May 4, 2019 · 4 comments
Labels
enhancement New feature or request

Comments

@GiggleLiu
Member

GiggleLiu commented May 4, 2019

We already have batched registers for solving the sampling problem.

Also, I find it pretty useful to have batched blocks, especially for simulating kernel quantum algorithms as in

Schuld, Maria, and Nathan Killoran. "Quantum machine learning in feature Hilbert spaces." Physical review letters 122.4 (2019): 040504.

A possible prototype:

using Yao

struct Batched{N, T, BT<:AbstractBlock{N, T}} <: CompositeBlock{N, T}
    blocks::Vector{BT}  # concrete element type, so the field itself is concretely typed
end

Base.broadcastable(bb::Batched) = bb.blocks

function Yao.apply!(reg::AbstractRegister{B}, bb::Batched) where B
    B == length(bb.blocks) || error("batch size $B does not match the number of blocks $(length(bb.blocks))")
    Yao.apply!.(reg, bb)  # broadcasts over bb.blocks via Base.broadcastable above
    reg
end

zero_state(4, nbatch=10) |> bb |> state

The problem is how we should dispatch! parameters into a Batched block. Or should we just disallow dispatch!-ing a scalar into such a block?
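For concreteness, a hypothetical usage of the prototype above. This assumes a convenience constructor that infers the type parameters from the vector; the rotation angles are made up for illustration and none of this is an existing Yao API:

```julia
using Yao

# One block per batch entry, each with its own rotation angle.
angles = range(0, stop=pi, length=10)
bb = Batched([put(4, 1 => Rx(θ)) for θ in angles])

reg = zero_state(4, nbatch=10)
apply!(reg, bb)  # intended: block i acts on the i-th register in the batch
```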

@Roger-luo
Member

I think this is similar to the "Array" block I proposed before.

@GiggleLiu
Member Author

Let's add it after summer school and add some beautiful examples!

@Roger-luo Roger-luo added the enhancement New feature or request label May 5, 2019
@Roger-luo
Member

  1. I don't think we should make it broadcastable; having two similar things, apply! and apply!., is dangerous, especially when a simple for loop will do the job.
  2. I was thinking about sharing the type annotation, so that a batch of the same block with different parameters is contiguous in memory, which improves cache efficiency and simplifies the update process of variational circuits; for heterogeneous blocks it would fall back to the only valid static type, Any, which will not hoist the type annotation.
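A minimal sketch of the loop-based apply! suggested in point 1, written against the Batched prototype above. It assumes viewbatch(reg, i) returns a view of the i-th register in the batch; treat the whole function as an illustration, not a final implementation:

```julia
function Yao.apply!(reg::AbstractRegister{B}, bb::Batched) where B
    B == length(bb.blocks) || error("batch size $B != number of blocks $(length(bb.blocks))")
    for (i, blk) in enumerate(bb.blocks)
        # Apply the i-th block only to the i-th register in the batch,
        # with no broadcasting machinery involved.
        Yao.apply!(viewbatch(reg, i), blk)
    end
    reg
end
```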

@GiggleLiu
Member Author

Just do it in a simple way. But please note: apply!. gives a reasonable result if and only if Batched is broadcastable.

Anyway, let's not debate good or bad before implementing one.
