
Add parallel-safe LRU cache and global cache manager objects #703

Status: Closed (2 commits)

Conversation

connorjward (Collaborator):

There are occasions in PyOP2/Firedrake where we repeatedly perform expensive operations but are unable to cache the results because:

  • no suitable object exists on which the result can be cached, and
  • the result holds references to "large" data structures, so keeping it in a global cache would leak memory.

A classic example is computing the adjoint of a form. Forms are often non-persistent (e.g. solve(u*v*dx == f*v*dx, ...)) and they hold references to their coefficients, so a global cache is not suitable. Computing the adjoint is a significant cost when computing derivatives of adjoint problems, so caching the results is desirable.

This PR introduces parallel-safe LRU caches, which I believe address both problems described above.
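For readers unfamiliar with the idea, the core of an LRU cache can be sketched as follows. This is a hypothetical illustration, not the class added by the PR: the key property in the MPI setting is that every rank performs the same sequence of lookups and insertions, so evictions stay in lockstep across the communicator without extra synchronisation.

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache sketch (illustrative; not the PR's actual class).

    OrderedDict remembers insertion order; moving an entry to the end on
    each access keeps the least recently used entry at the front, ready
    to be evicted when the cache grows past ``maxsize``.
    """

    def __init__(self, maxsize=128):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key, default=None):
        try:
            value = self._data[key]
        except KeyError:
            return default
        # Mark the entry as most recently used.
        self._data.move_to_end(key)
        return value

    def __setitem__(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            # Evict the least recently used entry.
            self._data.popitem(last=False)

    def __contains__(self, key):
        return key in self._data
```

In parallel the deterministic access pattern is what makes this safe: as long as all ranks hit and miss in the same order, no rank evicts an entry that another rank still holds.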

I have also introduced the concept of a "cache manager" (#693) to facilitate cache cleaning and inspection.

Closes #696 #693

TODOs:

  • Code cleanup and documentation
  • Write tests
  • Enable configuration of cache sizes via environment variables, since users with memory constraints will want smaller caches
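The environment-variable TODO above could look something like the sketch below. The variable name `PYOP2_CACHE_MAXSIZE` and the default of 128 are assumptions for illustration; the PR leaves the actual configuration mechanism unspecified.

```python
import os

# Hypothetical knob: let users with tight memory budgets shrink the cache.
# Both the variable name and the default are placeholders, not from the PR.
DEFAULT_CACHE_MAXSIZE = int(os.environ.get("PYOP2_CACHE_MAXSIZE", "128"))
```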




class CacheManager(dict):
connorjward (Author) commented:

@JDBetteridge: we should have pretty printing of cache statistics.


def __init__(self):
super().__init__()
self._caches = defaultdict(CountedNoEvictCache)
connorjward (Author) commented:

I need to tie these caches to the communicators so they are cleaned up at the right time (in case we spew lots of communicators and leak that way).

connorjward (Author):

Closing as @JDBetteridge has already merged a much nicer version of what I was trying to accomplish.

@connorjward connorjward deleted the connorjward/cache-extras branch October 22, 2024 15:46
Linked issue that merging would close: Add LRU cache