Structure Tensor Networks (strucTN): Advantages of density in tensor network geometries for gradient based training
Public release of code, v0.9
The scripts in "scripts" use the methods in "mps_compression" and "tree_tensors" to train different structures that store the information of random states. As an example, one would write:
python struct_complexity.py 0 50 "full" "cpu"
where:
- the first argument "0" is the batch (which sizes of structures to run; choose between 0 and 5, or edit the script),
- the second argument "50" is the number of repetitions (several repetitions are recommended, as there is statistical variability in the training),
- the third argument "full" sets the bond dimension used (choose between "small", "reduced" and "full", or edit the script),
- the last argument "cpu" selects whether the training part of the code runs on CPUs or GPUs (choose "gpu" only if one is available in your system and the proper drivers and libraries are installed).
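For instance, a quicker run of a different batch on a GPU might look like this (the argument values here are only illustrative, not a recommendation):

python struct_complexity.py 2 10 "reduced" "gpu"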
We plan to expand the "topologies" folder with classes for the tree tensor networks of interest, although this work is on hold while we decide whether to port the code to Julia and integrate it with Tenet (https://github.com/bsc-quantic/Tenet.jl).
The data_notebook.ipynb file is an example, in notebook format, of the code needed to reproduce figures like those in the main text of https://arxiv.org/abs/2412.17497. The scripts must be run first to generate the appropriate data, which then has to be moved to a folder accessible by the notebook.
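A minimal sketch of that workflow, assuming the script writes its result files to the working directory (the output file pattern and the "data" folder name are assumptions; check the scripts and the notebook for the actual paths):

python struct_complexity.py 0 50 "full" "cpu"
mkdir -p data
mv results_*.npy data/   # hypothetical output pattern; move the generated files where the notebook can find them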