diff --git a/previews/PR202/.documenter-siteinfo.json b/previews/PR202/.documenter-siteinfo.json index 412f93c1..4ba9b6c4 100644 --- a/previews/PR202/.documenter-siteinfo.json +++ b/previews/PR202/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-13T13:45:21","documenter_version":"1.7.0"}} \ No newline at end of file +{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-13T13:49:43","documenter_version":"1.7.0"}} \ No newline at end of file diff --git a/previews/PR202/alternatives.html b/previews/PR202/alternatives.html index ed07d122..bb07e58d 100644 --- a/previews/PR202/alternatives.html +++ b/previews/PR202/alternatives.html @@ -1,2 +1,2 @@ -Alternatives · Tenet.jl

Alternatives

Tenet is strongly opinionated. We acknowledge that it may not suit all cases (although we try 🙂). If your case doesn't fit Tenet's design, you can try the following libraries:

diff --git a/previews/PR202/ansatz/chain.html b/previews/PR202/ansatz/chain.html index ed523681..bcb63ff8 100644 --- a/previews/PR202/ansatz/chain.html +++ b/previews/PR202/ansatz/chain.html @@ -21,4 +21,4 @@ Label(fig[1,1, Bottom()], "Open") # hide Label(fig[1,2, Bottom()], "Periodic") # hide -fig # hide

fig # hide

In Tenet, the generic MatrixProduct ansatz implements this topology. Type parameters select its functionality (State or Operator) and its boundary conditions (Open or Periodic); a usage sketch follows the docstring entries below.

Missing docstring.

Missing docstring for MatrixProduct. Check Documenter's build log for details.

Missing docstring.

Missing docstring for MatrixProduct(::Any). Check Documenter's build log for details.
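
Since both docstrings failed to render, here is only a hedged sketch of the type-parameter API described above; the exact constructor signature and array layout are assumptions, so check the build log or the source for the real API.

using Tenet

# Hypothetical: an open-boundary Matrix Product State built from one array per
# site, with physical dimension 2 and bond dimension 4 (layout is an assumption).
ψ = MatrixProduct{State,Open}([rand(2, 4), rand(2, 4, 4), rand(2, 4)])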

diff --git a/previews/PR202/ansatz/product.html b/previews/PR202/ansatz/product.html index d1d14d33..b3d80662 100644 --- a/previews/PR202/ansatz/product.html +++ b/previews/PR202/ansatz/product.html @@ -1,2 +1,2 @@ -Product ansatz · Tenet.jl
Product ansatz · Tenet.jl
diff --git a/previews/PR202/contraction.html b/previews/PR202/contraction.html index 469e8666..28bfe767 100644 --- a/previews/PR202/contraction.html +++ b/previews/PR202/contraction.html @@ -1,2 +1,2 @@ -Contraction · Tenet.jl

Contraction · Tenet.jl

Contraction

Contraction path optimization and execution is delegated to the EinExprs library. An EinExpr is a lower-level form of a Tensor Network in which the contraction path has been laid out as a tree. It is similar to a symbolic expression (i.e. Expr), but every node represents an Einstein summation expression (aka einsum).

EinExprs.einexprMethod
einexpr(tn::AbstractTensorNetwork; optimizer = EinExprs.Greedy, output = inds(tn, :open), kwargs...)

Search a contraction path for the given AbstractTensorNetwork and return it as an EinExpr.

Keyword Arguments

  • optimizer Contraction path optimizer. Check EinExprs documentation for more info.
  • output Indices that won't be contracted. Defaults to open indices.
  • kwargs Options to be passed to the optimizer.

See also: contract.

source
Missing docstring.

Missing docstring for contract(::Tenet.TensorNetwork). Check Documenter's build log for details.
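
As a quick sketch of the workflow, using the random-network generator documented in the Tensor Networks section (the path keyword of contract is an assumption based on the einexpr docstring above):

using Tenet, EinExprs

tn = rand(TensorNetwork, 10, 3)                 # 10 tensors, regularity 3
path = einexpr(tn; optimizer=EinExprs.Greedy)   # contraction path as an EinExpr tree
result = contract(tn; path=path)                # assumption: contract accepts the found path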

diff --git a/previews/PR202/developer/type-hierarchy.html b/previews/PR202/developer/type-hierarchy.html index 67e8001f..92673add 100644 --- a/previews/PR202/developer/type-hierarchy.html +++ b/previews/PR202/developer/type-hierarchy.html @@ -9,4 +9,4 @@ id3(Ansatz) --> Chain style id1 stroke-dasharray: 5 5 style id2 stroke-dasharray: 5 5 - style id3 stroke-dasharray: 5 5 + style id3 stroke-dasharray: 5 5 diff --git a/previews/PR202/index.html b/previews/PR202/index.html index 1c6d6e75..9bf9305c 100644 --- a/previews/PR202/index.html +++ b/previews/PR202/index.html @@ -2,4 +2,4 @@ Home · Tenet.jl

Tenet.jl

BSC-Quantic's Registry

Tenet and some of its dependencies are located in our own Julia registry. In order to download Tenet, add our registry to your Julia installation by using the Pkg mode in a REPL session,

using Pkg
 pkg"registry add https://github.com/bsc-quantic/Registry"

A Julia library for Tensor Networks. Tenet can be executed both in local environments and on large supercomputers. Its goals are:

  • Expressiveness Simple to use 👶
  • Flexibility Extend it to your needs 🔧
  • Performance Goes brr... fast 🏎️

A video of its presentation at JuliaCon 2023 can be seen here:

[Video: Tenet presentation at JuliaCon 2023]

Features

  • Optimized Tensor Network contraction, powered by EinExprs
  • Tensor Network slicing/cutting
  • Automatic Differentiation of TN contraction, powered by EinExprs and ChainRules
  • 3D visualization of large networks, powered by Makie
diff --git a/previews/PR202/quantum.html b/previews/PR202/quantum.html index fc21987c..7b40942f 100644 --- a/previews/PR202/quantum.html +++ b/previews/PR202/quantum.html @@ -1,2 +1,2 @@ -Introduction · Tenet.jl

Introduction · Tenet.jl

Quantum Tensor Networks

Tenet.QuantumType
Quantum

Tensor Network with a notion of "causality". This leads to the notion of sites and directionality (input/output).

Notes

  • Indices are referenced by Sites.
source
Base.adjointMethod
adjoint(q::Quantum)

Returns the adjoint of a Quantum Tensor Network; i.e. the conjugate Tensor Network with the inputs and outputs swapped.

source

Queries

Missing docstring.

Missing docstring for Tenet.inds(::Quantum; kwargs...). Check Documenter's build log for details.

Missing docstring.

Missing docstring for Tenet.tensors(::Quantum; kwargs...). Check Documenter's build log for details.

Connecting Quantum Tensor Networks

Missing docstring.

Missing docstring for inputs. Check Documenter's build log for details.

Missing docstring.

Missing docstring for outputs. Check Documenter's build log for details.

Missing docstring.

Missing docstring for lanes. Check Documenter's build log for details.

Missing docstring.

Missing docstring for ninputs. Check Documenter's build log for details.

Missing docstring.

Missing docstring for noutputs. Check Documenter's build log for details.

Missing docstring.

Missing docstring for nlanes. Check Documenter's build log for details.

Missing docstring.

Missing docstring for Socket. Check Documenter's build log for details.

Tenet.ScalarType
Scalar <: Socket

Socket representing a scalar; i.e. a Tensor Network with no open sites.

source
Tenet.StateType
State <: Socket

Socket representing a state; i.e. a Tensor Network with only input sites (or only output sites if dual = true).

source
Tenet.OperatorType
Operator <: Socket

Socket representing an operator; i.e. a Tensor Network with both input and output sites.

source
Base.mergeMethod
merge(a::Quantum, b::Quantum...)

Merges multiple Quantum Tensor Networks into a single one by connecting input/output sites.

source
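
To make the pieces above concrete, here is a minimal sketch; it assumes the Quantum(::TensorNetwork, site-to-index mapping) constructor and the site"..." string macro, so treat the exact calls as illustrative rather than normative.

using Tenet

# Hypothetical: a one-qubit state as a Quantum Tensor Network, mapping
# physical site 1 to the tensor index :i.
ψ = Quantum(TensorNetwork([Tensor(rand(ComplexF64, 2), (:i,))]), Dict(site"1" => :i))

ϕ = ψ'               # adjoint: conjugated network, inputs and outputs swapped
braket = merge(ψ, ϕ) # connecting the matching sites closes the network into a scalar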
diff --git a/previews/PR202/references.html b/previews/PR202/references.html index e76f61ef..346a262c 100644 --- a/previews/PR202/references.html +++ b/previews/PR202/references.html @@ -1,2 +1,2 @@ -References · Tenet.jl

References · Tenet.jl

References

  • Fishman, M.; White, S. R. and Stoudenmire, E. M. (2022). The ITensor Software Library for Tensor Network Calculations. SciPost Phys. Codebases, 4.
  • Gray, J. (2018). quimb: A python package for quantum information and many-body calculations. Journal of Open Source Software 3, 819.
  • Gray, J. and Kourtis, S. (2021). Hyper-optimized tensor network contraction. Quantum 5, 410.
  • Hauschild, J.; Pollmann, F. and Zaletel, M. (2021). The Tensor Network Python (TeNPy) Library. In: APS March Meeting Abstracts, Vol. 2021; p. R21–006.
  • Ramón Pareja Monturiol, J.; Pérez-García, D. and Pozas-Kerstjens, A. (2023). TensorKrowch: Smooth integration of tensor networks in machine learning, arXiv e-prints, arXiv–2306.
diff --git a/previews/PR202/tensor-network.html b/previews/PR202/tensor-network.html index 636f73cf..54ce6f9d 100644 --- a/previews/PR202/tensor-network.html +++ b/previews/PR202/tensor-network.html @@ -2,8 +2,8 @@ Tensor Networks · Tenet.jl

Tensor Networks

Tensor Networks (TN) are a graphical notation for representing complex multi-linear functions. For example, the following equation

\[\sum_{ijklmnop} A_{im} B_{ijp} C_{njk} D_{pkl} E_{mno} F_{ol}\]

can be represented visually as

Sketch of a Tensor Network

The graph's nodes represent tensors and edges represent tensor indices.

In Tenet, these objects are represented by the TensorNetwork type.

Tenet.TensorNetworkType
TensorNetwork

Graph of interconnected tensors, representing a multilinear equation. Graph vertices represent tensors and graph edges, tensor indices.

source
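
For instance, the network from the equation at the top of this page can be built directly: index names shared between tensors become the contracted edges, and dimension 2 is an arbitrary choice.

using Tenet

A = Tensor(rand(2, 2), (:i, :m))
B = Tensor(rand(2, 2, 2), (:i, :j, :p))
C = Tensor(rand(2, 2, 2), (:n, :j, :k))
D = Tensor(rand(2, 2, 2), (:p, :k, :l))
E = Tensor(rand(2, 2, 2), (:m, :n, :o))
F = Tensor(rand(2, 2), (:o, :l))

tn = TensorNetwork([A, B, C, D, E, F])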

Information about a TensorNetwork can be queried with the following functions.

Query information

Missing docstring.

Missing docstring for inds(::Tenet.TensorNetwork). Check Documenter's build log for details.

Base.sizeMethod
size(tn::TensorNetwork)
size(tn::TensorNetwork, index)

Return a mapping from indices to their dimensionalities.

If index is set, return the dimensionality of index. This is equivalent to size(tn)[index].

source
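
Continuing with the tn built above:

size(tn)      # Dict(:i => 2, :j => 2, …), one entry per index
size(tn, :i)  # 2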
Missing docstring.

Missing docstring for tensors(::Tenet.TensorNetwork). Check Documenter's build log for details.

Modification

Add/Remove tensors

Base.push!Method
push!(tn::TensorNetwork, tensor::Tensor)

Add a new tensor to the Tensor Network.

See also: append!, pop!.

source
Base.append!Method
append!(tn::TensorNetwork, tensors::AbstractVecOrTuple{<:Tensor})

Add a list of tensors to a TensorNetwork.

See also: push!, merge!.

source
Base.merge!Method
merge!(self::TensorNetwork, others::TensorNetwork...)
merge(self::TensorNetwork, others::TensorNetwork...)

Fuse various TensorNetworks into one.

See also: append!.

source
Base.pop!Method
pop!(tn::TensorNetwork, tensor::Tensor)
pop!(tn::TensorNetwork, i::Union{Symbol,AbstractVecOrTuple{Symbol}})

Remove a tensor from the Tensor Network and return it. If a Tensor is passed, the first tensor that satisfies egality (i.e. ≡ or ===) will be removed. If a Symbol or a list of Symbols is passed, remove and return the tensors that contain all the given indices.

See also: push!, delete!.

source
Base.delete!Method
delete!(tn::TensorNetwork, x)

Like pop! but return the TensorNetwork instead.

source
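
A short sketch of the add/remove cycle, reusing tn from above:

G = Tensor(rand(2, 2), (:i, :x))   # :x is a new open index
push!(tn, G)                       # add the tensor to the network
pop!(tn, G)                        # remove it and get it back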

Replace existing elements

Base.replace!Function
replace!(tn::AbstractTensorNetwork, old => new...)
replace(tn::AbstractTensorNetwork, old => new...)

Replace the element in old with the one in new. Depending on the types of old and new, the following behaviour is expected:

  • If Symbols, it will correspond to an index renaming.
  • If Tensors, the first element that satisfies egality (≡ or ===) will be replaced.
source
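
For example, renaming an index:

replace!(tn, :i => :a)   # every tensor referencing :i now references :a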

Slicing

Base.selectdimFunction
selectdim(tn::AbstractTensorNetwork, index::Symbol, i)

Return a copy of the AbstractTensorNetwork where index has been projected to dimension i.

See also: view, slice!.

source
Tenet.slice!Function
slice!(tn::AbstractTensorNetwork, index::Symbol, i)

In-place projection of index on dimension i.

See also: selectdim, view.

source
Base.viewMethod
view(tn::AbstractTensorNetwork, index => i...)

Return a copy of the AbstractTensorNetwork where each index has been projected to dimension i. It is equivalent to a recursive call of selectdim.

See also: selectdim, slice!.

source
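
A sketch of slicing on the same tn; each call projects an index onto a single component:

tn1 = selectdim(tn, :j, 1)         # copy with :j fixed to its first component
tn2 = view(tn, :j => 1, :k => 2)   # several projections at once
slice!(tn, :j, 1)                  # in-place version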

Miscellaneous

Base.copyMethod
copy(tn::TensorNetwork)

Return a shallow copy of a TensorNetwork.

source
Base.randMethod
rand(TensorNetwork, n::Integer, regularity::Integer; out = 0, dim = 2:9, seed = nothing, globalind = false)

Generate a random tensor network.

Arguments

  • n Number of tensors.
  • regularity Average number of indices per tensor.
  • out Number of open indices.
  • dim Range of dimension sizes.
  • seed If not nothing, seed random generator with this value.
  • globalind Add a global 'broadcast' dimension to every tensor.
source
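
For example:

rand(TensorNetwork, 10, 3; out=2, seed=0)   # 10 tensors, regularity 3, 2 open indices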
diff --git a/previews/PR202/tensors.html b/previews/PR202/tensors.html index ff7f3faa..ac710862 100644 --- a/previews/PR202/tensors.html +++ b/previews/PR202/tensors.html @@ -1,11 +1,11 @@ Tensors · Tenet.jl

Tensors

There are many jokes[1] about how to define a tensor. The definition we are giving here might not be the most correct one, but it is good enough for our use case (don't kill me please, mathematicians). A tensor $T$ of order[2] $n$ is a multilinear[3] map from $n$ vector spaces over a field $\mathcal{F}$ to the field itself.

\[T : \mathcal{F}^{\dim(1)} \times \dots \times \mathcal{F}^{\dim(n)} \to \mathcal{F}\]

In layman's terms, it is a function whose inputs are vectors and whose output is a scalar, and which is linear in each input.

\[T(\mathbf{v}^{(1)}, \dots, \mathbf{v}^{(n)}) = c \in \mathcal{F} \qquad\qquad \forall i, \mathbf{v}^{(i)} \in \mathcal{F}^{\dim(i)}\]

Tensor algebra is a higher-order generalization of linear algebra, where scalar numbers can be viewed as order-0 tensors, vectors as order-1 tensors, matrices as order-2 tensors, ...

Letters are used to identify each of the vector spaces the tensor relates to. In computer science, you would intuitively think of tensors as "n-dimensional arrays with named dimensions".

\[T_{ijk} \iff \mathtt{T[i,j,k]}\]

The Tensor type

In Tenet, a tensor is represented by the Tensor type, which wraps an array and a list of symbols. As it subtypes AbstractArray, many array operations can be dispatched to it.

You can create a Tensor by passing an array and a list of Symbols that name indices.

julia> Tᵢⱼₖ = Tensor(rand(3,5,2), (:i,:j,:k))3×5×2 Tensor{Float64, 3, Array{Float64, 3}}:
 [:, :, 1] =
 0.172231  0.421248  0.398768  0.390864  0.353619
 0.53548   0.826024  0.238508  0.321862  0.824088
 0.956615  0.890798  0.189214  0.744979  0.404053

 [:, :, 2] =
 0.110167  0.298797  0.29506   0.521934  0.631884
 0.242917  0.687036  0.667437  0.89848   0.508089
 0.562336  0.464819  0.956867  0.266979  0.205706

The dimensionality or size of each index can be consulted using the size function.

Base.sizeMethod
Base.size(::Tensor[, i])

Return the size of the underlying array or the dimension i (specified by Symbol or Integer).

source
julia> size(Tᵢⱼₖ)(3, 5, 2)
julia> size(Tᵢⱼₖ, :j)5
julia> length(Tᵢⱼₖ)30

Operations

Contraction

Tenet.contractMethod
contract(a::Tensor[, b::Tensor]; dims=nonunique([inds(a)..., inds(b)...]))

Perform tensor contraction operation.

source
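
By default, indices appearing in both tensors are contracted (summed over):

A = Tensor(rand(3, 4), (:i, :j))
B = Tensor(rand(4, 5), (:j, :k))

C = contract(A, B)   # :j is summed over; C has indices (:i, :k)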

Factorizations

LinearAlgebra.svdMethod
LinearAlgebra.svd(tensor::Tensor; left_inds, right_inds, virtualind, kwargs...)

Perform SVD factorization on a tensor.

Keyword arguments

  • left_inds: left indices to be used in the SVD factorization. Defaults to all indices of t except right_inds.
  • right_inds: right indices to be used in the SVD factorization. Defaults to all indices of t except left_inds.
  • virtualind: name of the virtual bond. Defaults to a random Symbol.
source
LinearAlgebra.qrMethod
LinearAlgebra.qr(tensor::Tensor; left_inds, right_inds, virtualind, kwargs...)

Perform QR factorization on a tensor.

Keyword arguments

  • left_inds: left indices to be used in the QR factorization. Defaults to all indices of t except right_inds.
  • right_inds: right indices to be used in the QR factorization. Defaults to all indices of t except left_inds.
  • virtualind: name of the virtual bond. Defaults to a random Symbol.
source
LinearAlgebra.luMethod
LinearAlgebra.lu(tensor::Tensor; left_inds, right_inds, virtualind, kwargs...)

Perform LU factorization on a tensor.

Keyword arguments

  • left_inds: left indices to be used in the LU factorization. Defaults to all indices of t except right_inds.
  • right_inds: right indices to be used in the LU factorization. Defaults to all indices of t except left_inds.
  • virtualind: name of the virtual bond. Defaults to a random Symbol.
source
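
A sketch of the factorization calls above; that svd returns the (U, s, V) factors and qr returns (Q, R) as Tensors follows the usual LinearAlgebra convention, but is an assumption not confirmed by these docstrings.

using LinearAlgebra, Tenet

T = Tensor(rand(2, 2, 2), (:i, :j, :k))

# Split T across a virtual bond :b: U keeps (:i, :b), V keeps (:b, :j, :k).
U, s, V = svd(T; left_inds=(:i,), virtualind=:b)
Q, R = qr(T; left_inds=(:i,), virtualind=:b)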
  • 1For example, recursive definitions like "a tensor is whatever transforms as a tensor".
  • 2The order of a tensor may also be called rank or dimensionality in other fields. However, these terms can be misleading, since order has nothing to do with the rank of linear algebra nor with the dimensionality of a vector space. We prefer the word order.
  • 3Meaning that the function is linear in each of its inputs separately.
diff --git a/previews/PR202/transformations-7a040005.png b/previews/PR202/transformations-7a040005.png deleted file mode 100644 index d19ba947..00000000 Binary files a/previews/PR202/transformations-7a040005.png and /dev/null differ diff --git a/previews/PR202/transformations-ee594fa5.png b/previews/PR202/transformations-ee594fa5.png new file mode 100644 index 00000000..f20a6f39 Binary files /dev/null and b/previews/PR202/transformations-ee594fa5.png differ diff --git a/previews/PR202/transformations.html b/previews/PR202/transformations.html index b8963748..56964829 100644 --- a/previews/PR202/transformations.html +++ b/previews/PR202/transformations.html @@ -1,4 +1,4 @@ Transformations · Tenet.jl

Transformations

In tensor network computations, it is good practice to apply various transformations to simplify the network structure, reduce computational cost, or prepare the network for further operations. These transformations modify the network's structure locally by permuting, contracting, factoring or truncating tensors.

These methods are indispensable because they can drastically reduce the problem size of both the contraction path search and the contraction itself. This doesn't necessarily involve reducing the maximum rank of the Tensor Network, but, more importantly, it reduces the size (or rank) of the involved tensors.

Our approach is based on (Gray and Kourtis, 2021), and can also be found in quimb.

In Tenet, we provide a set of predefined transformations which you can apply to your TensorNetwork using the transform/transform! functions.

Tenet.transformFunction
transform(tn::TensorNetwork, config::Transformation)
transform(tn::TensorNetwork, configs)

Return a new TensorNetwork with the given Transformation applied to it.

See also: transform!.

source
Tenet.transform!Function
transform!(tn::TensorNetwork, config::Transformation)
transform!(tn::TensorNetwork, configs)

In-place version of transform.

source
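
A sketch of applying them; that the Transformation subtypes below have zero-argument default constructors is an assumption.

using Tenet

tn = rand(TensorNetwork, 10, 3)
tn′ = transform(tn, ContractSimplification())       # out-of-place, single transformation
transform!(tn, [DiagonalReduction(), Truncate()])   # in-place, chained list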

Available transformations

Hyperindex converter

Tenet.HyperFlattenType
HyperFlatten <: Transformation

Convert hyperindices to COPY-tensors, represented by DeltaArrays. This transformation is always used by default when visualizing a TensorNetwork with plot.

See also: HyperGroup.

source
Tenet.HyperGroupType
HyperGroup <: Transformation

Convert COPY-tensors, represented by DeltaArrays, to hyperindices.

See also: HyperFlatten.

source

Contraction simplification

Tenet.ContractSimplificationType
ContractSimplification <: Transformation

Preemptively contract tensors whose result doesn't increase in size.

source
Example block output

Diagonal reduction

Tenet.DiagonalReductionType
DiagonalReduction <: Transformation

Reduce the dimension of a Tensor in a TensorNetwork when it has a pair of indices that fulfill a diagonal structure.

Keyword Arguments

  • atol Absolute tolerance. Defaults to 1e-12.
source
Example block output

Anti-diagonal reduction

Tenet.AntiDiagonalGaugingType
AntiDiagonalGauging <: Transformation

Reverse the order of tensor indices that fulfill the anti-diagonal condition. While this transformation doesn't directly enhance computational efficiency, it sets up the TensorNetwork for other operations that do.

Keyword Arguments

  • atol Absolute tolerance. Defaults to 1e-12.
  • skip List of indices to skip. Defaults to [].
source

Dimension truncation

Tenet.TruncateType
Truncate <: Transformation

Truncate the dimension of a Tensor in a TensorNetwork when it contains columns with all elements smaller than atol.

Keyword Arguments

  • atol Absolute tolerance. Defaults to 1e-12.
  • skip List of indices to skip. Defaults to [].
source
Example block output

Split simplification

Tenet.SplitSimplificationType
SplitSimplification <: Transformation

Reduce the rank of tensors in the TensorNetwork by decomposing them using the Singular Value Decomposition (SVD). Tensors whose factorization does not increase the maximum rank of the network are left decomposed.

Keyword Arguments

  • atol Absolute tolerance. Defaults to 1e-10.
source
Example block output diff --git a/previews/PR202/visualization.html b/previews/PR202/visualization.html index 969d1c4f..cf7b857e 100644 --- a/previews/PR202/visualization.html +++ b/previews/PR202/visualization.html @@ -1,4 +1,4 @@ Visualization · Tenet.jl

Visualization

Tenet provides a Package Extension for Makie support. Simply import a Makie backend and call GraphMakie.graphplot on a TensorNetwork.

GraphMakie.graphplotMethod
graphplot(tn::TensorNetwork; kwargs...)
 graphplot!(f::Union{Figure,GridPosition}, tn::TensorNetwork; kwargs...)
graphplot!(ax::Union{Axis,Axis3}, tn::TensorNetwork; kwargs...)

Plot a TensorNetwork as a graph.

Keyword Arguments

  • labels If true, show the labels of the tensor indices. Defaults to false.
  • The rest of kwargs are passed to GraphMakie.graphplot.
source
using CairoMakie   # or any other Makie backend
using GraphMakie, NetworkLayout

graphplot(tn, layout=Stress(), labels=true)
Example block output