Lazy Archive [FEATURE REQUEST] #469
Comments
Hi @gresavage, thanks for making this suggestion! Your use case seems valid. I think there may be a lot of small obstacles to overcome, but it may be manageable to have lazy classes for each category. One thing is that we would want the lazy archive to behave identically to the original archives after the first `add()`; a user should not have to change their entire code just to make the initialization a bit easier. As such, I wonder if it is possible to essentially set …
@btjanaka after thinking about this more, instead of:

```python
if not self._has_init:
    ...
    # stuff to infer dimensions
    ...
    self.add = self._base_archive.add
    return self.add(...)  # this should now point to the correct `add` routine
# this codepath should never be reached
raise RuntimeError("some informative message")
```

Sorry, I wrote this based on PyRibs 0.6.4; the 0.7 release snuck in without me noticing! I will have to think some more about what to do for the lazy classes for each category/init signature.
The one issue I see is that there are a bunch of methods to replace for each archive, and not all are shared across all the archives. There must be some hacky thing we can do in Python, like "once we receive the call to `add`, transform this class's API to match that of the …"

What do you mean by replacing …
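One hacky option along those lines is attribute forwarding: rather than replacing every method of every archive by hand, forward any unknown attribute lookup to the underlying archive with `__getattr__`. This is only a sketch with hypothetical names (`LazyArchiveProxy`, `DummyArchive`), not actual pyribs code:

```python
class LazyArchiveProxy:
    """Builds the real archive on the first add(), then forwards everything."""

    def __init__(self, archive_factory):
        # archive_factory(solution_dim, measure_dim) builds the real archive.
        self._archive_factory = archive_factory
        self._archive = None

    def add(self, solution, objective, measures):
        if self._archive is None:
            # Infer dimensions from the first inputs.
            self._archive = self._archive_factory(len(solution), len(measures))
        return self._archive.add(solution, objective, measures)

    def __getattr__(self, name):
        # Called only when normal lookup fails, i.e. for every attribute the
        # proxy does not define itself; forward it to the real archive.
        if self._archive is None:
            raise AttributeError(
                f"{name!r} is unavailable before the first add() call"
            )
        return getattr(self._archive, name)


class DummyArchive:
    """Stand-in for a concrete archive, used here only for demonstration."""

    def __init__(self, solution_dim, measure_dim):
        self.solution_dim = solution_dim
        self.measure_dim = measure_dim
        self.entries = []

    def add(self, solution, objective, measures):
        self.entries.append((solution, objective, measures))


proxy = LazyArchiveProxy(DummyArchive)
proxy.add([0.0] * 5, 1.0, [0.0, 0.0])
print(proxy.solution_dim)  # 5, forwarded via __getattr__
```

Since `__getattr__` is only invoked when normal lookup fails, the proxy needs no per-archive method list; attributes that differ across archive types are forwarded automatically.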
Description
Allow the ability to "lazy init" an archive, where `solution_dim` and `measure_dim` are inferred from the first call to `add`.
Use Case
In the case of a complicated or highly dynamic program where the structure of the archive is not definitively known ahead of time, this can be useful. For example, say I have a program which builds the ANN topology for a reinforcement learning algorithm. Since it is common in RL to use the same algorithm on a wide variety of environments, the shape of the network input is highly dynamic. I (the user) currently have to delay initialization of the archive until the size of the flattened network parameters is known.
Furthermore, it is common to use convolutional nets for measure encoding. The process of determining the output size of a CNN can be rather tedious. Allowing for "lazy" archive initialization (analogous to PyTorch's lazy modules, e.g. `LazyLinear`) eases these issues.
Snippet
As aptly noted by @btjanaka in #468, this requires initialization in a separate method. Below is an example of a class which naively implements this:
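A minimal sketch of such a naive class, following the `self.add` rebinding idea from the discussion above. All names here (`_BaseArchive`, `LazyArchive`) are hypothetical stand-ins, not the actual pyribs API; `_BaseArchive` plays the role of a concrete archive such as `GridArchive`:

```python
class _BaseArchive:
    """Stand-in for a concrete archive (hypothetical, not pyribs code)."""

    def __init__(self, solution_dim, measure_dim):
        self.solution_dim = solution_dim
        self.measure_dim = measure_dim
        self._entries = []

    def add(self, solution, objective, measures):
        self._entries.append((solution, objective, measures))
        return len(self._entries)


class LazyArchive:
    """Defers real construction until the first add() reveals the dimensions."""

    def __init__(self):
        self._base_archive = None

    def add(self, solution, objective, measures):
        if self._base_archive is None:
            # Infer dimensions from the first inputs.
            self._base_archive = _BaseArchive(len(solution), len(measures))
            # Rebind `add` as an instance attribute so every later call goes
            # straight to the real archive, shadowing this method.
            self.add = self._base_archive.add
            return self.add(solution, objective, measures)
        # Unreachable: the rebinding above takes over after initialization.
        raise RuntimeError("LazyArchive.add called after initialization")


archive = LazyArchive()
archive.add([0.0] * 10, 1.0, [0.0, 0.0])
print(archive._base_archive.solution_dim)  # 10
```

Because the rebound `add` is an instance attribute, subsequent calls bypass the lazy check entirely, so the archive behaves identically to an eagerly constructed one after the first `add()`.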
Additionally, a `LazyEmitter` and `LazyScheduler` would have to be implemented in a similar fashion in order to handle the fact that the `__init__` methods of those classes rely on attributes like `solution_dim`.