[CURA-11761] Cache results of class-member methods #966
Conversation
The usual `functools.lru_cache` mechanism for caching function results runs into a number of problems when applied to member methods of class instances (class methods are unaffected by all of those). Where it's relatively easy, however, use of built-ins is still preferred over alternatives.

In this commit, a wrapper around the basic `lru_cache` is introduced that avoids the main problem we had with its usage: clearing out the cache would be 'per function', not 'per instance of the class that function belongs to', with the result that if you try to clear out the cache for (all functions of) one instance, you instead clear out the cache belonging to _all_ instances of that class (for all functions). This is because classes in Python are (conceptually at least) 'just' syntactic sugar, and thus the `self` argument is 'just another argument' to a function that is, under the hood, static of sorts. This wrapper prevents that by making a partial-application function out of the function and its associated instance, which in turn allows the lru-cache to 'just work'. A singleton is then provided to access functionality for clearing out the cache for a specific instance.

Problems that still need to be solved are a.) garbage collection of deleted instances, since we now keep a reference to the instance (which should be easy) -- and b.) what to do with keyword arguments ('kwargs'), since they get passed to the function cache as a dictionary, and lists, sets and dicts can't be used as arguments (anymore) because they aren't normally hashable (which might prove more of a challenge -- we might have to content ourselves with just not caching functions that have lists or dicts as arguments, or split those functions up to facilitate caching of intermediate results instead).

forms the basis for CURA-11761
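The per-instance wrapping described above can be sketched roughly like this. This is an illustrative reconstruction, not the actual Uranium implementation: the names `InstanceMethodCache`, `cached`, and `clear_instance` are hypothetical. The core idea matches the description, though: bind the instance with `functools.partial`, wrap that bound callable in `lru_cache`, and keep the resulting caches in a per-instance registry so they can be cleared individually.

```python
import functools


class InstanceMethodCache:
    """Hypothetical sketch: cache member-method results per instance,
    so clearing one instance's cache leaves other instances alone."""

    # Registry: id(instance) -> {method name -> lru_cache-wrapped bound callable}.
    # Note: keeping entries keyed on a live instance is exactly the
    # garbage-collection caveat mentioned above (open problem a).
    _caches = {}

    @classmethod
    def cached(cls, func):
        @functools.wraps(func)
        def wrapper(self, *args):
            per_instance = cls._caches.setdefault(id(self), {})
            if func.__name__ not in per_instance:
                # Bind the instance via partial, so lru_cache keys only on
                # the remaining (hashable) positional arguments.
                bound = functools.partial(func, self)
                per_instance[func.__name__] = functools.lru_cache(maxsize=None)(bound)
            return per_instance[func.__name__](*args)
        return wrapper

    @classmethod
    def clear_instance(cls, instance):
        # Drop all cached results for this one instance only.
        cls._caches.pop(id(instance), None)


class Expensive:
    calls = 0  # counts actual (non-cached) computations

    @InstanceMethodCache.cached
    def square(self, x):
        Expensive.calls += 1
        return x * x
```

With this sketch, repeated calls on the same instance hit the cache, a second instance gets its own cache, and `clear_instance` invalidates exactly one instance's results.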
Not using `lru_cache` here (except for class methods), so we can clear the cache per class instance instead of 'semi-globally' (per function). Splitting up the function(s) was done because we still use `lru_cache` under the hood, and that (and the wrapper, at the moment) can't handle dict, list, or set arguments, because those aren't hashable. contributes to CURA-11761
Otherwise, a reference to the object still exists, which may prevent it from being garbage-collected. part of CURA-11761
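The garbage-collection concern can be seen with a tiny `weakref` sketch (illustrative, not this project's code): a weak reference does not keep the instance alive, whereas storing the instance itself (or a `functools.partial` bound to it) in a cache dictionary would.

```python
import weakref


class Node:
    """Stand-in for an instance whose method results get cached."""


n = Node()
ref = weakref.ref(n)   # a weak reference does not keep n alive
assert ref() is n

del n                  # drop the last strong reference...
assert ref() is None   # ...and CPython collects the instance right away
```

(The immediate collection after `del` relies on CPython's reference counting; other Python implementations may collect later.)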
The database connection wasn't closed, which caused the file to be marked as 'in use' on Windows. sort-of done as part of CURA-11761
Take the cache into account when testing the has-error function. part of CURA-11761
Work-in-progress code has some grody issues sometimes. I'm lucky this was caught by the unit tests. Deserialize obviously changes the object, and moreover shouldn't be cached itself! Also fix the case where the instance-deletion functionality is somehow called twice. (This happened at one point when fixing the tests -- not sure if that'll ever be an issue in the current tests or the actual code, but defensive programming is OK.) part of CURA-11761
core part of CURA-11761
core part of CURA-11761
done as part of CURA-11761
Sets can normally not be hashed, which means we can't use `lru_cache`, or our own class-cache mechanism that derives from it, for methods that take (normal) sets, lists or dicts as arguments. Solve that by turning some of these methods' arguments, the results of which we want cached, into frozen sets instead. For this reason, it was convenient to have a method in the SettingDefinition file return a frozen set as well. The result of that could then be cached too, and then why not do the entire file, since we already need to import the caching mechanisms and clear the cache at the appropriate moments anyway. part of CURA-11761
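The hashability issue is easy to demonstrate with plain `lru_cache` (a minimal illustration, not code from this PR): a `set` argument raises `TypeError` because cache keys must be hashable, while a `frozenset` caches fine.

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def total(values):
    # The cache key is built from the arguments, so each must be hashable.
    return sum(values)


assert total(frozenset({1, 2, 3})) == 6   # frozenset is hashable: cacheable

try:
    total({1, 2, 3})                      # a plain set is not hashable
except TypeError:
    pass                                  # lru_cache can't build a key for it
```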
Not any other methods, since they're either pyqtProperties (or setters, etc.), and those don't seem to play too well with the caching mechanism. The properties use this internally, so it's enough anyway. CURA-11761
- accept, but do not attempt to cache, keyword arguments (kwargs)
- make it so that the class doesn't depend on an instance (put the cache directly as a class variable)

part of CURA-11761
part of CURA-11761
part of CURA-11761
done as part of CURA-11761
Some minor remarks/questions.
I am also a bit surprised that there is no "selective" clearing of the cache. I would have expected the container to clear only the relevant settings values when a source setting changes. This may not be necessary though.
I'm afraid this doesn't fully work as expected: when changing a setting in the UI (e.g. "line width"), the settings that depend on it (e.g. "wall line width") are no longer updated.
part of CURA-11761
Changed relations were coming up empty, and no relations were handled properly, due to a change to frozen sets. part of CURA-11761
Copy (shallow) to prevent lists from being altered, because those are references in the cache. part of CURA-11761
part of CURA-11761
Previously, if we had later added a `__del__` method to any parent, it might not have been called anymore. done as part of CURA-11761
part of CURA-11761
Be able to cache results from member functions of classes (as well as just plainly using `lru_cache` for class and static methods), and use that to cache a lot of the results from the Container/Settings stacks. Depending on the situation (such as a printer with multiple extruders in Cura, for example), this can speed up the interface quite noticeably.
(There's also a small PR in the front-end to make extra sure that objects inheriting from the cached classes play fair. -- )