
CMSSW_14_0_12_MULTIARCHS is not based on CMSSW_14_0_12 #45596

Closed
missirol opened this issue Jul 30, 2024 · 33 comments

@missirol
Contributor

x86-64-v3 aside, I think CMSSW_14_0_12_MULTIARCHS corresponds to CMSSW_14_0_11; in one command:

git diff CMSSW_14_0_11 CMSSW_14_0_12_MULTIARCHS
[empty diff]

This is maybe related to the fact that #45504 (comment) mentions CMSSW_14_0_11 (as opposed to CMSSW_14_0_12).

Attn: @cms-sw/orp-l2 @cms-sw/hlt-l2

@cmsbuild
Contributor

cmsbuild commented Jul 30, 2024

cms-bot internal usage

@cmsbuild
Contributor

A new Issue was created by @missirol.

@Dr15Jones, @antoniovilela, @makortel, @mandrenguyen, @rappoccio, @sextonkennedy, @smuzaffar can you please review it and eventually sign/assign? Thanks.

cms-bot commands are listed here

@makortel
Contributor

assign orp, core

@cmsbuild
Contributor

New categories assigned: orp,core

@Dr15Jones,@makortel,@mandrenguyen,@rappoccio,@antoniovilela,@smuzaffar,@sextonkennedy you have been requested to review this Pull request/Issue and eventually sign? Thanks

@mmusich
Contributor

mmusich commented Jul 30, 2024

indeed release notes are empty: https://github.com/cms-sw/cmssw/releases/tag/CMSSW_14_0_12_MULTIARCHS

@mmusich
Contributor

mmusich commented Jul 30, 2024

what's the best course of action? I think this calls for an urgent CMSSW_14_0_13(_MULTIARCHS) build.

@mmusich
Contributor

mmusich commented Jul 30, 2024

tagging @anpicci as current ORM.

@antoniovilela
Contributor

The build command should have pointed to CMSSW_14_0_12.

@smuzaffar
Shahzad, can we rebuild with a different name pointing to the proper commit?

I will look into 14_0_13 in a while (on the phone right now).

@smuzaffar
Contributor

Since #45504 explicitly requested the CMSSW_14_0_11 tag, CMSSW_14_0_12_MULTIARCHS is identical to CMSSW_14_0_11. @antoniovilela, as the release has already been built, uploaded, and deployed, it is not easy to rebuild it. I would suggest building CMSSW_14_0_13 and CMSSW_14_0_13_MULTIARCHS

@antoniovilela
Contributor

antoniovilela commented Jul 30, 2024

Since #45504 explicitly requested the CMSSW_14_0_11 tag, CMSSW_14_0_12_MULTIARCHS is identical to CMSSW_14_0_11. @antoniovilela, as the release has already been built, uploaded, and deployed, it is not easy to rebuild it. I would suggest building CMSSW_14_0_13 and CMSSW_14_0_13_MULTIARCHS

Ok, will start building CMSSW_14_0_13(_MULTIARCHS) now with what is ready, including the PR we have discussed in the ORP meeting.

Would ideally like to check a 14_0_X IB before uploading. Could you maybe start one, a patch build, once you see this message?

@antoniovilela
Contributor

Build issues:
#45599
#45600

@anpicci

anpicci commented Jul 31, 2024

@antoniovilela the proposed solution is fine with me; I have sent an email to all the interested people to clarify the plan. Thank you

@antoniovilela
Contributor

Since #45504 explicitly requested the CMSSW_14_0_11 tag, CMSSW_14_0_12_MULTIARCHS is identical to CMSSW_14_0_11. @antoniovilela, as the release has already been built, uploaded, and deployed, it is not easy to rebuild it. I would suggest building CMSSW_14_0_13 and CMSSW_14_0_13_MULTIARCHS

Ok, will start building CMSSW_14_0_13(_MULTIARCHS) now with what is ready, including the PR we have discussed in the ORP meeting.

Would ideally like to check a 14_0_X IB before uploading. Could you maybe start one, a patch build, once you see this message?

Let's proceed with the upload and we check the IB a posteriori.

@mmusich
Contributor

mmusich commented Jul 31, 2024

explicitly requested the CMSSW_14_0_11 tag, which is why CMSSW_14_0_12_MULTIARCHS is identical to CMSSW_14_0_11.

@smuzaffar @antoniovilela since this in general can't be what was intended, could a check be implemented on the core side to avoid triggering the *_MULTIARCHS build if the target commit doesn't match?
(We'll likely have to add protection on the TSG ops side as well, but this is really a failure mode I hadn't thought of.)
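
For illustration, a minimal sketch of what such a pre-build check could look like (the function name, arguments, and commit hashes below are purely hypothetical, not the actual cms-bot interface):

```python
# Hypothetical pre-build check: before triggering CMSSW_X_Y_Z_MULTIARCHS,
# verify that the commit requested for the build is the commit actually
# tagged as CMSSW_X_Y_Z. All names and hashes below are illustrative.

def multiarchs_build_allowed(base_release, requested_commit, tag_commits):
    """Allow the *_MULTIARCHS build only if the requested commit matches
    the commit recorded for the base release tag."""
    expected = tag_commits.get(base_release)
    return expected is not None and expected == requested_commit

# In this incident, CMSSW_14_0_12_MULTIARCHS was built from the
# CMSSW_14_0_11 commit, which a check like this would have rejected:
tags = {"CMSSW_14_0_11": "aaa111", "CMSSW_14_0_12": "bbb222"}
print(multiarchs_build_allowed("CMSSW_14_0_12", "bbb222", tags))  # True
print(multiarchs_build_allowed("CMSSW_14_0_12", "aaa111", tags))  # False
```

In a real implementation the tag-to-commit mapping would come from the git repository rather than a dictionary; the dictionary here just makes the decision logic explicit.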

@antoniovilela
Contributor

explicitly requested the CMSSW_14_0_11 tag, which is why CMSSW_14_0_12_MULTIARCHS is identical to CMSSW_14_0_11.

@smuzaffar @antoniovilela since this in general can't be what was intended, could a check be implemented on the core side to avoid triggering the *_MULTIARCHS build if the target commit doesn't match? (We'll likely have to add protection on the TSG ops side as well, but this is really a failure mode I hadn't thought of.)

Hi Marco,
I would not expect the explicit MULTIARCHS build to be needed beyond 14_0_X, since I believe the instruction set will become the default.
Shahzad can comment on whether the build script could, in the case of a MULTIARCHS build, force the correct commit or check for an empty diff, but given the above statement I would not think it is worth the effort.

@mmusich
Contributor

mmusich commented Jul 31, 2024

but given the above statement I would not think it is worth the effort.

with several months of pp production ahead, that seems a bold statement. Let's hope for the best.

@antoniovilela
Contributor

but given the above statement I would not think it is worth the effort.

with several months of pp production ahead, that seems a bold statement. Let's hope for the best.

Shahzad can comment if it is feasible from their side.

When building such a release, I have always checked that the correct commit has been picked up, and that the contents of the tag are the same to the sister release. It is a manual check however.

@makortel
Contributor

We had discussed in past core software meetings to change the behavior of default 14_0_X to be the same as in 14_1_X in this regard, i.e. the default 14_0_X build would contain binaries for both sse3 and x86-64-v3, and by default sse3 would be used (whereas 14_0_X_MULTIARCHS picks the x86-64-v3 binaries if the system supports them). The DAQ could then explicitly choose the x86-64-v3 binaries with USER_SCRAM_TARGET=x86-64-v3.

With this approach (which DAQ needs to move to for 14_1_X anyhow), the 14_0_X_MULTIARCHS build would no longer be needed.

@mmusich
Contributor

mmusich commented Jul 31, 2024

We had discussed in past core software meetings to change the behavior of default 14_0_X to be the same as in 14_1_X

If core approves of this, it would certainly make life easier. What's the timeline for that?

@makortel
Contributor

makortel commented Jul 31, 2024

We had discussed in past core software meetings to change the behavior of default 14_0_X to be the same as in 14_1_X

If core approves of this, it would certainly make life easier. What's the timeline for that?

I believe from the core side the change could be done rather quickly (@smuzaffar please confirm), but we'd need to coordinate with DAQ (@cms-sw/daq-l2) on when (i.e. in which CMSSW release) to do it, because it would require a change on their side.

@fwyzard
Contributor

fwyzard commented Jul 31, 2024

Currently (as of CMSSW_14_1_0_pre6) one needs to do

scram b enable-multi-targets
export USER_SCRAM_TARGET=auto  # or x86-64-v3
cmsenv

to enable x86-64-v3 where supported.

Can we make auto the default when enable-multi-targets is used?
Does anybody see any possible problems with it?

@smuzaffar
Contributor

Can we make auto the default when enable-multi-targets is used?

sure, I will update the configuration so that enable-multi-targets also sets SCRAM_TARGET=auto

@smuzaffar
Contributor

@fwyzard, cms-sw/cmsdist#9337 has been merged for CMSSW_14_1_X. scram b enable-multi-targets should now also set SCRAM_TARGET=auto in config/Self.xml. Can you please try it in the CMSSW_14_1_X IBs and see if it works as expected? I can then merge cms-sw/cmsdist#9338 for 14.0.X

@fwyzard
Contributor

fwyzard commented Aug 4, 2024

@smuzaffar I can confirm that with CMSSW_14_1_X_2024-08-04-0000, doing scram b enable-multi-targets also sets the target to auto, and picks up x86-64-v3 if it's available:

$ cmsrel CMSSW_14_1_X_2024-08-04-0000

$ cd CMSSW_14_1_X_2024-08-04-0000

$ scram b enable-multi-targets
Building with multi-targets is enabled.
SCRAM TARGET set to auto

$ cmsenv
IMPORTANT: Setting CMSSW environment to use 'x86-64-v3' target.

$ echo $LD_LIBRARY_PATH | tr : \\n
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/biglib/el8_amd64_gcc12/scram_x86-64-v3
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/biglib/el8_amd64_gcc12
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/lib/el8_amd64_gcc12/scram_x86-64-v3
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/lib/el8_amd64_gcc12
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/external/el8_amd64_gcc12/lib/scram_x86-64-v3
/tmp/fwyzard/CMSSW_14_1_X_2024-08-04-0000/external/el8_amd64_gcc12/lib
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/biglib/el8_amd64_gcc12/scram_x86-64-v3
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/biglib/el8_amd64_gcc12
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/lib/el8_amd64_gcc12/scram_x86-64-v3
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/lib/el8_amd64_gcc12
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/external/el8_amd64_gcc12/lib/scram_x86-64-v3
/cvmfs/cms-ib.cern.ch/sw/x86_64/week1/el8_amd64_gcc12/cms/cmssw/CMSSW_14_1_X_2024-08-04-0000/external/el8_amd64_gcc12/lib
/cvmfs/cms-ib.cern.ch/sw/x86_64/nweek-02849/el8_amd64_gcc12/external/llvm/17.0.3-ba4f8d1359ce0633961586a6141e92fe/lib64
/cvmfs/cms-ib.cern.ch/sw/x86_64/nweek-02849/el8_amd64_gcc12/external/gcc/12.3.1-40d504be6370b5a30e3947a6e575ca28/lib64
/cvmfs/cms-ib.cern.ch/sw/x86_64/nweek-02849/el8_amd64_gcc12/external/gcc/12.3.1-40d504be6370b5a30e3947a6e575ca28/lib
...

@makortel
Contributor

makortel commented Aug 6, 2024

Moving the discussion on the deployment of the 14_1_X-style "multi-architecture build" in 14_0_X into a separate issue #45654 (as requested at ORP)

@makortel
Contributor

#45654 has been addressed, so I wonder if we could close this issue?

Or should we think of adding a check that a build request for CMSSW_X_Y_Z_SOMETHING corresponds to CMSSW_X_Y_Z? @smuzaffar
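
For illustration, a naive version of such a name-based check could derive the expected base release from the tag name (the helper below is purely hypothetical and deliberately only handles a plain CMSSW_X_Y_Z_SUFFIX name):

```python
import re

def expected_base_release(name):
    """Derive CMSSW_X_Y_Z from CMSSW_X_Y_Z_SOMETHING.

    Illustrative only: it ignores the many real release-name variants
    and returns None for names without a suffix."""
    m = re.match(r"(CMSSW_\d+_\d+_\d+)_[A-Za-z0-9_]+$", name)
    return m.group(1) if m else None

print(expected_base_release("CMSSW_14_0_12_MULTIARCHS"))  # CMSSW_14_0_12
print(expected_base_release("CMSSW_14_0_12"))             # None (no suffix)
```

The mapping itself is trivial; the hard part, as discussed in this thread, is that several legitimate builds do not follow it.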

@smuzaffar
Contributor

smuzaffar commented Sep 26, 2024

@makortel, in many cases CMSSW_X_Y_Z_SOMETHING is not based on CMSSW_X_Y_Z: there are many special cases where a different cmssw branch is used to build CMSSW_X_Y_Z_SOMETHING, e.g. the 13.0 HLT and HeavyIon branches, the 12.4 HLT branch, and sometimes special ROOT builds. So I think adding a check to make sure CMSSW_X_Y_Z_SOMETHING corresponds to CMSSW_X_Y_Z will not work.

If needed, we can require core or externals to sign the release request (just to have a few extra pairs of eyes), but this could sometimes delay an urgent release.

@makortel
Contributor

Thanks @smuzaffar, then I think we are done (at least from the core side).

@makortel
Contributor

+core

@cmsbuild
Contributor

This issue is fully signed and ready to be closed.

@makortel
Contributor

(going off topic)

This issue is fully signed and ready to be closed.

Apparently the orp-pending label doesn't prevent the fully-signed status on issues. Should it? I mean, I understand we move to the fully-signed status on PRs while the PR is still orp-pending (so that orp can then do the final sign-off), but do we need (or want) the same behavior on issues?

@mandrenguyen
Contributor

+1

@mandrenguyen
Contributor

@makortel I think it's a pretty academic issue. orp-pending is such a rare label that I don't routinely check for it.
