Support mixed states #39

Open
PhilipVinc opened this issue Mar 3, 2022 · 2 comments

@PhilipVinc
This package is great, and has amazing performance.
However, I'd like to use it to perform noisy simulations, where my state is a density matrix.

This does not work right now, as a matrix is interpreted as a collection of pure states.

Do you have plans to support this use-case in the future?

@Roger-luo
Member

Roger-luo commented Mar 3, 2022

OK, short version: yes, but it is unlikely that I will work on it before I graduate from my Ph.D., unless there is a project that actually needs this functionality, since I think this package probably won't lead to any publication (in physics) in the short term. Long version, for whoever wants to work in this direction:

First of all, for clarification: this package is still very much WIP. It aims to solve a few problems we currently have inside Yao that need a complete rework, so not everything works perfectly yet. I won't recommend using it for serious purposes right now; there could be bugs and performance issues that are not yet battle-tested. The registered version only contains the major implementation of single-threaded subspace matrix multiplication, which has a speedup compared to the current stable Yao simulation routines.

The routines in this package are actually not trying to handle a collection of pure states; they implement the more fundamental routine of subspace matrix multiplication.

The Kraus operator evaluation can be done in either of the following equivalent ways:

U * (rho * U')
U * (U * rho')'
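
For concreteness, here is a small dense-matrix sketch of the above (nothing subspace-specific; `apply_channel` is just a made-up helper name):

```julia
using LinearAlgebra

# Dense-matrix sketch of the two equivalent forms above (no subspace structure).
n   = 3
dim = 2^n
U   = Matrix(qr(randn(ComplexF64, dim, dim)).Q)   # a random unitary
ψ   = normalize(randn(ComplexF64, dim))
ρ   = ψ * ψ'                                      # density matrix of a pure state

ρ1 = U * (ρ * U')      # apply U on the left, U' on the right
ρ2 = U * (U * ρ')'     # same result, but only ever multiplies U into the first index
@assert ρ1 ≈ ρ2

# A discrete Kraus channel is then just a sum of such terms: ρ ↦ Σ_k K_k ρ K_k'
apply_channel(ρ, Ks) = sum(K * (K * ρ')' for K in Ks)
```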

For the integral version of the Kraus operator, I think you can also benefit from this fundamental numerical routine:

\int_s U(s) * rho * U(s)'

since the underlying integrator will also rely on this type of tensor contraction, as the integrator has to implement some kind of discrete steps on top of U(s) * rho * U(s)'.
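
A rough sketch of that point, assuming a naive fixed-grid quadrature (`integrate_channel`, `U_of_s`, `s_grid`, and `weights` are made-up names): the integrator reduces to repeated applications of the same `U(s) * rho * U(s)'` primitive.

```julia
using LinearAlgebra

# Hypothetical discretization of ∫ ds U(s) ρ U(s)': a quadrature turns the
# integral into a weighted sum over the same "apply U on both sides" kernel.
integrate_channel(ρ, U_of_s, s_grid, weights) =
    sum(w * (U_of_s(s) * (ρ * U_of_s(s)')) for (s, w) in zip(s_grid, weights))

# Usage sketch with U(s) = exp(-im * s * H) for some Hermitian generator H.
H = Matrix(Hermitian(randn(4, 4)))
U_of_s(s) = exp(-im * s * H)
s_grid  = range(0, 1; length = 64)
weights = fill(step(s_grid), length(s_grid))          # crude rectangle rule
ρ_out   = integrate_channel(Matrix{ComplexF64}(I, 4, 4) / 4, U_of_s, s_grid, weights)
```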

And we don't actually need to worry about the adjoint operation, since the memory layout should be similar because of symmetry.

So this means what matters is how you apply a matrix to a density matrix in a subspace, e.g. how to do the following tensor contraction efficiently:

U_{ai} * U_{bj} * U_{ck} * rho'_{123...a...b...c...n}

Now the problem becomes how to accelerate a general tensor contraction with the pattern of one large tensor A and a bunch of small 2D tensors (matrices) U_i. If this fundamental piece works, then the discrete Kraus operator representation should just work on top of it, no matter what abstraction you try to implement above it.
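
A naive reference implementation of that pattern (just the semantics, not this package's optimized kernel): contract a small matrix into one axis of an n-index tensor by viewing it as (left, d, right) blocks.

```julia
# Naive reference: contract a d×d matrix U into axis `loc` of an n-index
# tensor A. A density matrix of n qubits can be viewed this way by reshaping
# the 2^n × 2^n array into 2n binary indices and applying U to the "row" legs.
function apply_on_axis(A::AbstractArray, U::AbstractMatrix, loc::Int)
    dims  = size(A)
    left  = prod(dims[1:loc-1]; init = 1)
    d     = dims[loc]
    right = prod(dims[loc+1:end]; init = 1)
    B = reshape(A, left, d, right)
    C = similar(B, promote_type(eltype(A), eltype(U)))
    for r in 1:right, l in 1:left
        @views C[l, :, r] = U * B[l, :, r]   # one small matvec per fiber
    end
    reshape(C, dims)
end
```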

And how to do this kind of tensor contraction efficiently? I have a blog post explaining the simple version: https://blog.rogerluo.dev/2020/03/31/yany/ but to achieve the best performance, one actually needs to do more specialization for specific operators, such as the Paulis, etc.
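
As a toy example of what such operator-specific specialization buys (a sketch, not code from this package): a Pauli X on one qubit is just a pairwise swap of amplitudes, so the generic 2×2 matmul disappears entirely.

```julia
# Toy specialization: apply Pauli X on qubit `loc` (1-based, counting from the
# least-significant bit) of a state vector by swapping amplitudes in place.
# A generic 2x2 kernel would do 4 multiplies and 2 adds per pair instead.
function apply_x!(ψ::AbstractVector, loc::Int)
    mask = 1 << (loc - 1)
    for i in 0:length(ψ)-1
        if i & mask == 0                 # visit each pair exactly once
            j = i | mask
            ψ[i+1], ψ[j+1] = ψ[j+1], ψ[i+1]
        end
    end
    ψ
end
```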

This means that even with the stable Yao.instruct! routines, you can do quantum channel simulation (of small matrices) efficiently in the same way (in principle!).
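
For instance, a rough sketch of the `U * (U * rho')'` trick on top of `instruct!` might look like this (`apply_to_rho` is a made-up name, and the exact `instruct!` signature depends on the YaoArrayRegister version):

```julia
using YaoArrayRegister

# Hypothetical helper: apply a unitary U on qubits `locs` of a 2^n × 2^n density
# matrix ρ by calling the batched state-vector kernel twice, following U*(U*ρ')'.
# Assumes an instruct!(state, mat, locs) method that multiplies the embedded
# operator into the first (row) index of `state`.
function apply_to_rho(ρ::AbstractMatrix, U::AbstractMatrix, locs::NTuple{N,Int}) where {N}
    A = instruct!(Matrix(ρ'), U, locs)    # U * ρ'
    instruct!(Matrix(A'), U, locs)        # U * (U * ρ')' == U * ρ * U'
end
```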

But I know there are some potential issues with Yao.instruct!:

  1. the instruct! interface might not work well with Adjoint, since we currently transpose the state vector/matrix before feeding it into the implementation to get a contiguous memory layout on the batch dimension, so it might not be super clear to devs what is happening
  2. the usage of Tuple as locations in instruct! has over 100 potential ambiguities, so further extending this interface might run into this ambiguity issue (see "add aqua test", YaoArrayRegister.jl#85), which is probably too hard to fix with this design

And the current YaoBlocks is certainly not a good place to have the IR support for noisy channels, since it wasn't designed for channels and thus has problems. I hope to address this in the development of YaoCompiler, but it is not currently supported or implemented.

@Roger-luo
Member

One addition: by doing more specialization on specific operators, I mean a small compiler will be needed to do this automatically, which is the idea behind that scary metaprogramming in the codegen dir of this package. I hope to improve the quality of that code by reworking it using https://github.com/Roger-luo/Expronicon.jl and MLStyle.
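
As a toy illustration of what such a small compiler could generate (plain metaprogramming, not this package's actual codegen): emitting one specialized kernel per named gate instead of dispatching on a generic matrix.

```julia
# Toy codegen sketch: generate a specialized single-qubit kernel per named
# gate, so the hot loop contains gate-specific arithmetic instead of a generic
# 2x2 matrix multiply. The real codegen in this package is far more involved.
for (gate, body) in (
        :X => :(ψ[i+1], ψ[j+1] = ψ[j+1], ψ[i+1]),
        :Z => :(ψ[j+1] = -ψ[j+1]),
    )
    fname = Symbol(:apply_, lowercase(String(gate)), :!)
    @eval function $fname(ψ::AbstractVector, loc::Int)
        mask = 1 << (loc - 1)
        for i in 0:length(ψ)-1
            if i & mask == 0
                j = i | mask
                $body
            end
        end
        ψ
    end
end
```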

Metatheory could also be a good option, but it will depend on how complicated the simplification rules can get; currently they are quite simple.
