Update to handle introduction of Python extension support #40
Comments
@speth ... thanks for creating this issue! Before going to the step of eliminating […]
This is somewhat disappointing, because it means that we have to completely recompile the library for every version of Python we support, right? That's going to be a significant increase in CI resources. Is there any way to decouple them? On the other hand, conda-forge just has one job for each build in the matrix, and they all run in parallel, so at least over there, we aren't getting the benefit of the decoupling. I suspect the easiest, least time-consuming way to support this is to drop the […]

Is that feasible, or am I misunderstanding what the problem is, especially around […]

If we eliminate […]

I think if these packages require Python to work, then there's no reason not to package them all up together, especially if it makes maintenance easier...
What you're asking for here is significantly more complicated than what I have currently implemented. Right now, we don't rely on dynamically searching for libraries at runtime -- the linkage to the Python library has to be worked out at compile time, even if we're linking to a dynamic/shared library. While this is technically possible (it is, after all, what Python itself has to do to load compiled modules), it's not something I have experience with, and I suspect that there is a lot of platform-specific mumbo jumbo. I had sort of been hoping to save dealing with this complexity for later, when implementing extension support for a different language.
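The runtime lookup described above -- locating a shared library and resolving its symbols on the fly, rather than at link time -- can be sketched with Python's `ctypes`. This is only an illustration of the general mechanism (using the C math library as a stand-in), not what Cantera actually does:

```python
import ctypes
import ctypes.util

# Locate the C math library at runtime instead of linking against it
# at compile time ("m" resolves to e.g. libm.so.6 on Linux).
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path or None)  # fall back to the main process's symbols

# Each symbol must be looked up and given a signature by hand -- the
# kind of platform-specific bookkeeping the comment above alludes to.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(4.0))  # 2.0
```

The same pattern (load, look up symbol, cast to a typed signature) is what `dlopen`/`dlsym` or Boost.DLL provide on the C++ side.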
No, you only need to recompile two files and relink against the correct Python library. The added CI time should be negligible (assuming it's done correctly).

If the […]
@speth and @bryanwweber ... given the circumstances, combining the compiled code into a single package appears to be a reasonable approach. Caveat: there's no good solution for […]

Regarding […]
@speth ... thanks for looking into fixing the broken runners. Now that the latest action passes (:tada:), what are your thoughts here? Since the last post, Cantera became dynamically linkable (thanks again for the heavy lifting on that one), so did any of what was mentioned above change?
That is what I am working on figuring out. My expectation is that the changes in this PR will not be necessary due to the work on shared library support. Before closing this, though, I want to test the resulting packages on a couple of different machines to make sure they actually work.
The improved way of handling Python extensibility using Boost.DLL renders these changes unnecessary. |
The conda recipes need some updates to handle the addition of support for calling user-defined Python extensions from C++ (see Cantera/cantera#1382).

Currently, in the conda recipe, the shared library is built with the Python module disabled, but that is now also what determines whether support for Python-based extensions is compiled, since the requirements for Python extension support are the same as those for the Python module itself, and using the Python extension support requires the Python module to be present in any case. Further, `libcantera` now depends on a specific version of `libpython`, so we can't provide just one version of `libcantera` for all Python versions.

As discussed in Cantera/cantera#1382, it might make sense to just combine the `libcantera` package with the Python module, since the requirements for the two are now identical, and the first can only be fully used if the second is installed.