
Improved Index Map Features, Multiply function #42

Merged: 11 commits into main, Aug 28, 2024
Conversation

JoeyT1994 (Owner) commented:

This PR makes some significant changes to the IndexMap and IndsNetworkMap interfaces, as well as to the multiply() function; a short usage sketch follows the list below.

  • rem_index(imap::ComplexIndexMap, ind::Index) and rem_index(imap::RealIndexMap, ind::Index) are now defined to remove indices from the dictionaries inside IndexMaps.
  • dimensions(imap::AbstractIndexMap) is now defined and returns a vector of all the dimensions of the indices in the map. An imap is not required to cover a strictly sequential set of dimensions: for instance, it could contain only indices ind with dimension(imap, ind) = 2 and nothing else.
  • reduced_indexmap(imap::AbstractIndexMap, dims::Vector{<:Int}) is defined and returns a new index map containing just the indices in the given dimensions.
  • merge(imap1::RealIndexMap, imap2::RealIndexMap) and merge(imap1::ComplexIndexMap, imap2::ComplexIndexMap) are defined and create a new index map by merging all the dictionaries inside the arguments.
  • reduced_indsnetworkmap(inm::IndsNetworkMap, dims::Vector{<:Int}) is defined and returns a new IndsNetworkMap containing just the vertices/indices in the given dimensions. The graph returned by indsnetwork(inm) therefore changes: it is the graph obtained by removing all vertices that have no external indices in dims.
  • multiply(f::ITensorNetworkFunction, g::ITensorNetworkFunction) has been rewritten so that there is no assumption that the underlying graphs are the same. Only external indices common to f and g are treated as identical, and a direct product is taken over everything else. This allows very general "multiplications" such as h(x, y) = f(x) * g(y), where f and g live in completely different dimensions and on different graphs, producing a new tensor network that is the merging of the two graphs.
  • partial_integrate(fitn::ITensorNetworkFunction, dims::Vector{<:Int}; merge_vertices = true) is now defined and integrates only over the specified dimensions. The flag merge_vertices specifies whether to contract away the tensors that were integrated over, creating a new ITensorNetworkFunction that lives on a reduced graph.
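
A minimal usage sketch of the new interface is below. It is illustrative only: f and g are assumed to be pre-constructed ITensorNetworkFunctions (f depending on dimension 1, g on dimension 2, possibly on different graphs), and the indexmap/indsnetworkmap accessors used to pull out their maps are assumptions; only the functions listed above come from this PR.

```julia
# Hypothetical sketch; assumes the package defining these functions is loaded
# and that `f` and `g` are existing ITensorNetworkFunctions, with f = f(x) in
# dimension 1 and g = g(y) in dimension 2, possibly on different graphs.

imap = indexmap(f)                    # assumed accessor for f's AbstractIndexMap
ds   = dimensions(imap)               # all dimensions present in the map, e.g. [1]

imap_1 = reduced_indexmap(imap, [1])  # keep only the indices in dimension 1
inm_2  = reduced_indsnetworkmap(indsnetworkmap(g), [2])  # assumed accessor for g's IndsNetworkMap

# h(x, y) = f(x) * g(y): shared external indices are identified, everything
# else is combined as a direct product, merging the two graphs.
h = multiply(f, g)

# Integrate h over the y-dimension only; with merge_vertices = true the
# integrated tensors are contracted away, leaving a function on a reduced graph.
hx = partial_integrate(h, [2]; merge_vertices = true)
```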

@ryanlevy The Fredholm example has been updated to reflect the utility of these changes. Now ψ is always reduced to a one-dimensional function following the partial integration, and you can just call evaluate(ψ, x::Number) to evaluate it; it doesn't matter whether you technically ended up in the y or the x dimension at the end, because ψ only has indices living in a single dimension. Because the kernel is separable, the bond dimension of ψ in this example is always bounded by chi <= 3 (the bond dimension of the kernel g plus the bond dimension of the additive function c) without any calls to truncate, which I think is really neat and shows that the code is working well.
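
For example, a minimal sketch of the evaluation step, assuming ψ is the reduced one-dimensional ITensorNetworkFunction left after the partial integration (maxlinkdim as a bond-dimension accessor is an assumption here):

```julia
# Sketch only: ψ is assumed to be the one-dimensional ITensorNetworkFunction
# produced by partial_integrate in the Fredholm example.
x   = 0.25
val = evaluate(ψ, x)    # evaluate(ψ, x::Number), as described above
@show maxlinkdim(ψ)     # assumed accessor; bond dimension stays <= 3 here without truncation
```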

Note that I had to overload the add(itn1::ITensorNetwork, itn2::ITensorNetwork) function from ITensorNetworks here, because the merge_vertices operation reverses some edges in the network and add currently asserts, too strictly, that the edges of the two networks must be ordered identically. This is a bug that needs to be fixed in ITensorNetworks, but I have put a fix here for now.

JoeyT1994 requested a review from ryanlevy on August 6, 2024 at 21:50
ryanlevy (Collaborator) left a comment:


Thanks Joey, this largely looks good! One minor request: could you please add a test for partial_integrate?

(Review threads on src/integration.jl: outdated, resolved)
JoeyT1994 (Owner, Author) commented:

Test added.

ryanlevy (Collaborator) left a comment:


LGTM now, thanks!

JoeyT1994 merged commit b123d83 into main on Aug 28, 2024. 3 checks passed.