[question] Approach for system requirements #641

Open
danimtb opened this issue Jan 10, 2020 · 18 comments
Labels: Docs, help wanted, priority: medium, question

Comments

danimtb commented Jan 10, 2020

There are some recipes that may need to install tools that can be considered system requirements.

There are already some recipes in the index that check for installed system tools but do not install them. Others, like libusb, do install system requirements such as libudev.

We feel it is intrusive and risky to install system requirements when using a Conan dependency, and we want to avoid this in Conan (especially asking for sudo permissions 😩).

The question here is what approach should be taken when packaging recipes that require any kind of system requirement:

  • Should we create recipes for system requirements and package the binaries?
  • Should we just check whether those tools are installed and raise if they are not?
  • Should we create "proxy recipes" that perform those installs or checks, to model the dependency in the Conan graph?

Feedback please!

SSE4 commented Jan 10, 2020

For reference, I am copying here the problems/disadvantages/limitations caused by system requirements that we previously faced at Bincrafters:


System requirements in Conan are known to cause lots of different issues; they are hard to maintain and they don't scale well. Some problems with system requirements:

  • They require running the system package manager with sudo, which is not desired in many cases. They pollute the build machine, they require user interaction, and they require permissions which the developer might not have at all.
  • System requirements don't support cross-building at all; cases like cross-compiling from Windows/Mac to Linux are impossible, and even Linux -> Linux is problematic.
  • Due to fragmentation, there are many major package managers to support, like yum, apt, pacman, etc. Support for each new package manager requires additional baby-sitting (and possibly modifications to Conan itself, to add new distributions like Gentoo, Slackware, etc.).
  • Within a single package manager, there are various architectures to support, which adds another level of scalability problems.
  • Even within a single package manager, various distributions may have different names for packages. For instance, Fedora, RHEL and CentOS are known to have different package names for certain packages.
  • Even within a single distro, packages are deprecated and removed from time to time, so continuous maintenance is required with every new release.
  • The code to use the system package manager in recipes looks ugly; it has lots of nested conditions.
  • Due to various bugs in distributions, packages often conflict with each other, making it impossible to build a Conan package on the system. For instance, on Ubuntu, the 32-bit Python development package conflicts with the 64-bit one.
  • Installing additional software from the system package manager may confuse the Conan build. For instance, libprotobuf-dev might be installed as a dependency of some other package and may then override the protoc executable from the Conan package, which is not the desired effect.
  • Builds are no longer truly reproducible, because various distros ship different versions of system packages, so behaviour may change after recompiling on another distro.
  • They add a point of failure and a dependency on the availability of the system package manager's registry; e.g. apt repositories are known to have downtimes. Given that Conan builds on CI are already unstable for other reasons, it would be nice to remove that factor.
  • Conan builds are no longer self-contained: installed executables/libraries will not run or link because of dependencies on system libraries, which might not be available on another machine.
  • The use case simply doesn't work on systems with no system package manager (e.g. source-only distros, like LFS systems), or on systems with poor package manager support or a lack of packages (e.g. Windows).
  • In order to use the Conan package, it is now necessary to use pkg-config to locate the system requirements (or locate them by other means; some system requirements don't even have .pc files!), which adds extra complexity to the integration of the Conan package into the build system.

Right now, the following Conan packages are heavily based on system requirements:

OpenGL-related: GLU, GLUT, GLEW, GLFW, GLM, etc.
ffmpeg: X11, vaapi, vdpau
SDL2: X11, wayland, mir, audio APIs (pulseaudio, alsa, esound, nas, jack, etc.)
wxWidgets: GTK

SSE4 commented Jan 10, 2020

my two cents:

Should we create recipes for system requirements and package binaries?

If it's easy and straightforward, then yes; that should be the first choice (as we did with libalsa, for instance).

Should we just check the existence of that tools installed and just raise?

I'd say that's much better than system requirements, and should be the second choice

Should we create "proxy recipes" that perform those install or checks to model that dependency in the conan graph?

Such things are sometimes needed, e.g. OpenGL support might be abstracted away in Conan (on Windows you have to link with opengl32.lib, on Mac you have to link with OpenGL.framework, and on Linux you need system libraries such as libGL.so). This also applies to a few other things which are system-specific (e.g. Python, OpenCL, Vulkan, CUDA, etc.). It would be much simpler for other recipes to just include an opengl-support Conan package and not worry about system specifics and system requirements. In that case, we delegate all the baby-sitting and complexity to a single recipe, rather than copying ugly, fragile conditions into every recipe that requires OpenGL.
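For illustration, a minimal sketch (Conan 2 syntax; the recipe name and exact libraries are placeholders, not the actual conan-center recipe) of such a proxy recipe: it packages no files and only exposes the platform-specific link information in package_info.

from conan import ConanFile

class OpenGLSupportConan(ConanFile):
    # Hypothetical proxy recipe: it ships no headers or libraries,
    # it only models the system OpenGL dependency in the Conan graph.
    name = "opengl-support"
    version = "system"
    settings = "os"

    def package_info(self):
        # Only link information is exposed, so consumers simply require this
        # package and never deal with per-platform specifics themselves.
        if self.settings.os == "Windows":
            self.cpp_info.system_libs = ["opengl32"]
        elif self.settings.os == "Macos":
            self.cpp_info.frameworks = ["OpenGL"]
        elif self.settings.os == "Linux":
            self.cpp_info.system_libs = ["GL"]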

Hopobcn commented Jan 10, 2020

I agree with the "proxy recipes" concept. Take the case of CUDA: it ships with an EULA which makes it nearly impossible to package with Conan (if I remember correctly, the only redistribution allowed is via a Docker image). Also, you don't want to make an installer package, because not only does it require sudo and download several GBs, it also installs some kernel modules, which could be a big NO if the user is not informed beforehand.
Shouldn't "proxy recipes" require changes in the CCI hooks?

SSE4 commented Jan 10, 2020

They do require changes in the hooks. In practice, such a proxy recipe may contain no files at all (no headers or libraries copied); it may only have a package_info method specifying the platform-specific includes and libraries. However, our hooks currently complain if no files are copied. That should be an easy change, though (or we can simply white-list such packages).

Croydon commented Jan 11, 2020

@SSE4 pretty much summarized it and I agree: system_requirements should be avoided whenever possible.

In the specific case of libudev, @uilianries and I were of the opinion that, as part of systemd, it is too low-level to be packaged with Conan (maybe not?). And libusb needed it, as it is non-optional, but it is also not pre-installed on the CCI build workers.

At Bincrafters we slowly create packages for the system_requirements in our recipes. This sometimes takes an awful lot of time, but I don't see any reason to lose patience now just because we have the Conan Center Index.

At first glance it might save a lot of time because you don't need to create all of these packages, but precisely because of all the downsides @SSE4 summarized, I believe that the time spent supporting users and adding workarounds for specific issues will cost much more in the long term. And quality-wise it is much better to have Conan packages anyway (again, see @SSE4's summary).

That said, I support conan-io/hooks#112 to make sure that nothing slips through review. We can add specific packages to a whitelist if we -absolutely- have to at some point.

czoido commented Jan 16, 2020

In the end, it seems that for the case of X11 (and probably for more typical libs that are close to the system) the best option would be to have such a "proxy" recipe in ConanCenter, namely xorg/system (encompassing several X libraries), which simply checks that it is installed and injects the flags, without building or packaging from source.
When someone on a Linux desktop requires a package from ConanCenter, like GLFW or the like, that needs the X11 window system installed, they will certainly want to use the X11 of their system. Having X11 as real Conan packages could be conflicting/incompatible with the existing X11 runtime.
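A minimal Conan 2-style sketch of that idea (the recipe name and the single libx11-dev/x11 component are illustrative; the real xorg/system recipe covers many more components and package managers): the recipe checks/installs the system package and then injects the flags reported by pkg-config, without building or packaging anything.

from conan import ConanFile
from conan.tools.gnu import PkgConfig
from conan.tools.system import package_manager

class X11SystemConan(ConanFile):
    # Hypothetical "proxy" recipe for the system X11 libraries.
    name = "x11-system"
    version = "system"
    settings = "os"

    def system_requirements(self):
        # Check/install the distro package instead of building X11 from source.
        apt = package_manager.Apt(self)
        apt.install(["libx11-dev"], update=True, check=True)

    def package_info(self):
        # Nothing is packaged; the include dirs and libs of the system X11
        # are read from pkg-config and injected into cpp_info.
        pkg_config = PkgConfig(self, "x11")
        pkg_config.fill_cpp_info(self.cpp_info, is_system=True)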

ericLemanissier commented Apr 5, 2020

I started working on wayland: https://github.com/bincrafters/conan-wayland. The package is quite self-contained, and the Ubuntu distro seems to build it "vanilla". Also, wayland is just the implementation of a protocol (also named wayland...), and I don't know of another implementation.
This makes me think we could simply create a conan-center package for it, without creating a proxy for the system version. Do you agree?

ltjax commented Apr 6, 2020

I think you definitely want the option of using proxies for system level dependencies like that, regardless of whether it's technically a good idea. The end decision should be up to the application developer/stakeholders.
There are really a few concerns with this, and deployment is only one of them. Sometimes you also do not want too many deep per-project dependency hierarchies in Conan, and would rather rely on a baseline system for compilation. E.g. I might want to link to wayland, but I really don't want to see it drag in expat or libffi, much less have those libs generate conflicts further down the line.
Maybe it can be done in a more "conan" way by deploying the proxy recipe for a specific user/channel like system/stable?

@uilianries

(quoting @SSE4's list of system-requirement problems and affected packages from the comment above)

I will print this and put it on a board

jgsogo commented Jul 17, 2020

The opengl recipe with a system approach has been out for a while: https://github.com/conan-io/conan-center-index/blob/master/recipes/opengl/all/conanfile.py; no big issues so far, and it is being consumed by some other packages.

Should this approach be added to the docs?

@uilianries

Yes! I think it's a good reference. I usually send its link when users ask for system requirements.

@jpfeuffer

Hi! I have a question regarding this: how can a package specify that it needs the "-dev" system package for building, while the non-dev version is enough when installing it as a dependency (i.e., because it is a "private" shared-library dependency)?

@uilianries

@jpfeuffer All dependencies are always listed; there is a private attribute, but we don't allow it in CCI. In case you only need a system package when building a package, it should be considered a tool_requires.
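For illustration, a minimal sketch of that distinction in a consumer recipe (the recipe and package names are just examples):

from conan import ConanFile

class ConsumerConan(ConanFile):
    # Hypothetical consumer recipe, for illustration only.
    name = "consumer"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"

    def requirements(self):
        # Regular requirement: needed to build and also propagated to consumers.
        self.requires("xorg/system")

    def build_requirements(self):
        # Tool requirement: only needed in the build context of this package,
        # never propagated to consumers of the resulting package.
        self.tool_requires("cmake/[>=3.15]")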

jpfeuffer commented Oct 18, 2022

OK, thank you. I am asking because I have the feeling that having Qt as a dependency (even if it exists pre-built on an index) will always try to install all those libx*-dev packages, although they are not necessary.
This is a bit annoying, because the non-dev packages are most often already installed by sysadmins, while the dev packages require extra tickets to be filed, etc.

@uilianries

@jpfeuffer You can customize the Qt package by disabling its options. Of course, you will need to build from source, but you can use a free Artifactory instance and push your Qt package there.

jpfeuffer commented Oct 19, 2022

Not sure if we are talking about the same thing. Qt always has xorg as a dependency on Linux; I don't think it can be deactivated. What I would want is for it to only install xorg instead of xorg-dev on my machine when I set Qt as a dependency and it comes prebuilt.
My library does not need the header files that come with the dev packages.
What I am probably suggesting is, in your xorg or other system-library wrappers, to have the dev packages as "tool_requires" and the non-dev packages as "system_requirements".

@uilianries

@jpfeuffer In that case you can't customize it without changing the recipe directly and re-building the package. It's not possible to remove a dependency via a Conan command, let's say.

Crucio32000 commented Mar 17, 2024

Hi everyone,
Sorry to revive this old topic, but I'm finding myself struggling when dealing with packages such as opengl/system.

My journey started with the following goals:

  • Generate a toolchain for cross-compilation purposes (the target device is aarch64)
  • The toolchain must be based on GCC == 8.X.Z
  • No actual requirements for additional shipped libraries; actually, the fewer, the better.

So I started using Yocto Project toolchains, which ship the compiler and some basic libraries (libdrm, OpenGL and so on), as I was familiar with them.
A simple recipe downloads the toolchain from the Yocto Project (I've used the 2.4 version, shipping with GCC 8.2.0) and sets the build environment accordingly (see the sketch after the profile).
Below is the profile I've used:

[settings]
arch=armv8
build_type=Release
compiler=gcc
compiler.cppstd=gnu17
compiler.libcxx=libstdc++11
compiler.version=8
os=Linux
[tool_requires]
cmake/[>=3.15]
yocto-toolchain-gcc/8.3.0
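For reference, a rough Conan 2-style sketch of what such a toolchain recipe can look like (the URL, folder layout and compiler triplets are placeholders, not the actual Yocto SDK contents):

import os
from conan import ConanFile
from conan.tools.files import copy, get

class YoctoToolchainConan(ConanFile):
    name = "yocto-toolchain-gcc"
    version = "8.3.0"
    settings = "os", "arch"  # settings of the machine that runs the toolchain

    def build(self):
        # Placeholder URL: the real SDK tarball depends on the Yocto release and target.
        get(self, "https://example.com/yocto-sdk-aarch64.tar.xz", strip_root=True)

    def package(self):
        copy(self, "*", src=self.build_folder, dst=self.package_folder)

    def package_info(self):
        # Expose the cross-compilers to the build environment of consumers.
        bindir = os.path.join(self.package_folder, "bin")
        self.buildenv_info.prepend_path("PATH", bindir)
        self.buildenv_info.define("CC", "aarch64-poky-linux-gcc")   # placeholder triplet
        self.buildenv_info.define("CXX", "aarch64-poky-linux-g++")  # placeholder triplet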

I've managed to build libraries like boost effortlessly, but when I dealt with Qt my struggle began.
I've found out that all the packages with a "system" version try to install dependencies using the local package manager,
and I found that quite disturbing: depending on the operating system, I may end up with different versions of the same package. For instance, using Ubuntu 22.04, I ended up with a library compiled with GCC 12.

I believe that those */system packages should be replaced with an actual compilation, as Yocto does, for instance building OpenGL from Mesa git.

In my trials and attempts, I had to perform the following actions:

  • Insert a bunch of [platform_requires] indicating the versions of each library
    • Compilation went successfully, even with the gui module enabled
    • Errors appeared when using the package because opengl::opengl was not found. For this reason I removed the requirement altogether from the recipe to make it work... and it did.
    • I believe Conan does not manage platform requirements well, or, simply, the recipe has to be aware of them. Not really sure.

Here is an example of the above point.

[platform_requires]
bzip2/1.0.6
dbus/1.12.10
egl/system
expat/2.2.6
fontconfig/2.12.6
freetype/2.9.1
glib/2.58.0
gst-plugins-base/1.14.4
libudev/system
opengl/system
openssl/1.1.1a
...
xorg/system

I've also tried the Linaro toolchain, which ships with just the compiler, both to get familiar with it, as it represents a good candidate, and to challenge myself a little bit.
I made a test with this toolchain too, and I was able to build the Qt recipe with the following modifications:

  • Without the GUI module
  • Using opengl = no
  • Removing xkbcommon, which pulls in a great share of packages through the package manager, which I wanted to avoid.

Sorry for the lengthy post, but I needed to share my experience and I'm looking forward to hearing what you think about it.

Any advice would be great.
