Low pT electrons (up to GsfTracks) #25455
Conversation
The code-checks are being triggered in jenkins.
+code-checks Logs: https://cmssdt.cern.ch/SDT/code-checks/cms-sw-PR-25455/7535
A new Pull Request was created by @bainbrid for master. It involves the following packages: Configuration/StandardSequences
@perrotta, @cmsbuild, @franzoni, @slava77, @fabiocos, @davidlange6 can you please review it and eventually sign? Thanks. cms-bot commands are listed here.
@cmsbuild please test
This will fail at run time due to missing external.
The tests are being triggered in jenkins.
LowPtGsfElectronSeedProducer::LowPtGsfElectronSeedProducer( const edm::ParameterSet& conf,
                                                            const lowptgsfeleseed::HeavyObjectCache* ) :
  kfTracks_{consumes<reco::TrackCollection>(conf.getParameter<edm::InputTag>("tracks"))},
  pfTracks_{consumes<reco::PFRecTrackCollection>(conf.getParameter<edm::InputTag>("pfTracks"))},
It would be better to consume "tracks" only if usePfTracks_ == false (and "pfTracks" if it is true).
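A sketch of what this suggestion could look like — hypothetical and not verbatim from the PR, assuming a usePfTracks_ flag read from the configuration and default-constructed edm::EDGetTokenT members:

```cpp
// Illustrative only: register exactly one of the two input tokens,
// chosen by the usePfTracks_ configuration flag. The unused token is
// left default-constructed and is never read in produce().
LowPtGsfElectronSeedProducer::LowPtGsfElectronSeedProducer( const edm::ParameterSet& conf,
                                                            const lowptgsfeleseed::HeavyObjectCache* ) :
  usePfTracks_{conf.getParameter<bool>("usePfTracks")} {
  if ( usePfTracks_ ) {
    pfTracks_ = consumes<reco::PFRecTrackCollection>(conf.getParameter<edm::InputTag>("pfTracks"));
  } else {
    kfTracks_ = consumes<reco::TrackCollection>(conf.getParameter<edm::InputTag>("tracks"));
  }
}
```

Registering only the needed token also keeps the framework's dependency graph honest: the scheduler will not require the unused collection to exist upstream.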
Done, here
// PF tracks
edm::Handle<reco::PFRecTrackCollection> pfTracks;
event.getByToken(pfTracks_, pfTracks);
Same here: it would be better to get only the collection that is needed.
Done, here
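The runtime side of the same suggestion might look like the following — an illustrative fragment, not the PR's actual code (the usePfTracks_ member and the seed-building steps are assumptions), fetching only the collection that the configuration selects:

```cpp
void LowPtGsfElectronSeedProducer::produce( edm::Event& event, const edm::EventSetup& setup ) {
  if ( usePfTracks_ ) {
    // PF-track path: only this token was consumed
    edm::Handle<reco::PFRecTrackCollection> pfTracks;
    event.getByToken(pfTracks_, pfTracks);
    // ... build seeds from PF tracks ...
  } else {
    // KF-track path
    edm::Handle<reco::TrackCollection> kfTracks;
    event.getByToken(kfTracks_, kfTracks);
    // ... build seeds from KF tracks ...
  }
}
```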
void produce( edm::Event&, const edm::EventSetup& ) override;

static void fillDescription( edm::ParameterSetDescription& );
The proper signature is:
- static void fillDescription( edm::ParameterSetDescription& );
+ static void fillDescriptions( edm::ConfigurationDescriptions& );
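For reference, a minimal sketch of the suggested fillDescriptions form. This is illustrative only: the module label, the default InputTags, and the parameter defaults are assumptions, not taken from the PR.

```cpp
void LowPtGsfElectronSeedProducer::fillDescriptions( edm::ConfigurationDescriptions& descriptions ) {
  edm::ParameterSetDescription desc;
  // Defaults below are hypothetical placeholders
  desc.add<edm::InputTag>("tracks", edm::InputTag("generalTracks"));
  desc.add<edm::InputTag>("pfTracks", edm::InputTag("lowPtGsfElePfTracks"));
  desc.add<bool>("usePfTracks", false);
  descriptions.add("lowPtGsfElectronSeeds", desc);  // module label is an assumption
}
```

With this signature the framework can both validate the configuration and generate a fillDescriptions-based cfi for the module.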
Done, here
-1
Tested at: a556a68
You can see the results of the tests here.
I found the following errors while testing this PR.
Failed tests: RelVals AddOn
When I ran the RelVals I found an error in the following workflows:

runTheMatrix-results/5.1_TTbar+TTbarFS+HARVESTFS/step1_TTbar+TTbarFS+HARVESTFS.log
4.53 step3 runTheMatrix-results/4.53_RunPhoton2012B+RunPhoton2012B+HLTD+RECODR1reHLT+HARVESTDR1reHLT/step3_RunPhoton2012B+RunPhoton2012B+HLTD+RECODR1reHLT+HARVESTDR1reHLT.log
135.4 step1 runTheMatrix-results/135.4_ZEE_13+ZEEFS_13+HARVESTUP15FS+MINIAODMCUP15FS/step1_ZEE_13+ZEEFS_13+HARVESTUP15FS+MINIAODMCUP15FS.log
9.0 step3 runTheMatrix-results/9.0_Higgs200ChargedTaus+Higgs200ChargedTaus+DIGI+RECO+HARVEST/step3_Higgs200ChargedTaus+Higgs200ChargedTaus+DIGI+RECO+HARVEST.log
136.731 step3 runTheMatrix-results/136.731_RunSinglePh2016B+RunSinglePh2016B+HLTDR2_2016+RECODR2_2016reHLT_skimSinglePh_HIPM+HARVESTDR2/step3_RunSinglePh2016B+RunSinglePh2016B+HLTDR2_2016+RECODR2_2016reHLT_skimSinglePh_HIPM+HARVESTDR2.log
136.788 step3 runTheMatrix-results/136.788_RunSinglePh2017B+RunSinglePh2017B+HLTDR2_2017+RECODR2_2017reHLT_skimSinglePh_Prompt+HARVEST2017/step3_RunSinglePh2017B+RunSinglePh2017B+HLTDR2_2017+RECODR2_2017reHLT_skimSinglePh_Prompt+HARVEST2017.log
25.0 step3 runTheMatrix-results/25.0_TTbar+TTbar+DIGI+RECOAlCaCalo+HARVEST+ALCATT/step3_TTbar+TTbar+DIGI+RECOAlCaCalo+HARVEST+ALCATT.log
140.56 step2 runTheMatrix-results/140.56_RunHI2018+RunHI2018+RECOHID18+HARVESTDHI18/step2_RunHI2018+RunHI2018+RECOHID18+HARVESTDHI18.log
1306.0 step3 runTheMatrix-results/1306.0_SingleMuPt1_UP15+SingleMuPt1_UP15+DIGIUP15+RECOUP15+HARVESTUP15/step3_SingleMuPt1_UP15+SingleMuPt1_UP15+DIGIUP15+RECOUP15+HARVESTUP15.log
136.85 step3 runTheMatrix-results/136.85_RunEGamma2018A+RunEGamma2018A+HLTDR2_2018+RECODR2_2018reHLT_skimEGamma_Prompt_L1TEgDQM+HARVEST2018_L1TEgDQM/step3_RunEGamma2018A+RunEGamma2018A+HLTDR2_2018+RECODR2_2018reHLT_skimEGamma_Prompt_L1TEgDQM+HARVEST2018_L1TEgDQM.log
158.0 step3 runTheMatrix-results/158.0_HydjetQ_B12_5020GeV_2018_ppReco+HydjetQ_B12_5020GeV_2018_ppReco+DIGIHI2018PPRECO+RECOHI2018PPRECO+ALCARECOHI2018PPRECO+HARVESTHI2018PPRECO/step3_HydjetQ_B12_5020GeV_2018_ppReco+HydjetQ_B12_5020GeV_2018_ppReco+DIGIHI2018PPRECO+RECOHI2018PPRECO+ALCARECOHI2018PPRECO+HARVESTHI2018PPRECO.log
1000.0 step2 runTheMatrix-results/1000.0_RunMinBias2011A+RunMinBias2011A+TIER0+SKIMD+HARVESTDfst2+ALCASPLIT/step2_RunMinBias2011A+RunMinBias2011A+TIER0+SKIMD+HARVESTDfst2+ALCASPLIT.log
1001.0 step2 runTheMatrix-results/1001.0_RunMinBias2011A+RunMinBias2011A+TIER0EXP+ALCAEXP+ALCAHARVDSIPIXELCALRUN1+ALCAHARVD1+ALCAHARVD2+ALCAHARVD3+ALCAHARVD4+ALCAHARVD5/step2_RunMinBias2011A+RunMinBias2011A+TIER0EXP+ALCAEXP+ALCAHARVDSIPIXELCALRUN1+ALCAHARVD1+ALCAHARVD2+ALCAHARVD3+ALCAHARVD4+ALCAHARVD5.log
1330.0 step3 runTheMatrix-results/1330.0_ZMM_13+ZMM_13+DIGIUP15+RECOUP15_L1TMuDQM+HARVESTUP15_L1TMuDQM/step3_ZMM_13+ZMM_13+DIGIUP15+RECOUP15_L1TMuDQM+HARVESTUP15_L1TMuDQM.log
10042.0 step3 runTheMatrix-results/10042.0_ZMM_13+ZMM_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017/step3_ZMM_13+ZMM_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017.log
10024.0 step3 runTheMatrix-results/10024.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017.log
25202.0 step3 runTheMatrix-results/25202.0_TTbar_13+TTbar_13+DIGIUP15_PU25+RECOUP15_PU25+HARVESTUP15_PU25/step3_TTbar_13+TTbar_13+DIGIUP15_PU25+RECOUP15_PU25+HARVESTUP15_PU25.log
11624.0 step3 runTheMatrix-results/11624.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2019_GenSimFull+DigiFull_2019+RecoFull_2019+HARVESTFull_2019+ALCAFull_2019/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2019_GenSimFull+DigiFull_2019+RecoFull_2019+HARVESTFull_2019+ALCAFull_2019.log
10824.0 step3 runTheMatrix-results/10824.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2018_GenSimFull+DigiFull_2018+RecoFull_2018+HARVESTFull_2018+ALCAFull_2018+NanoFull_2018/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2018_GenSimFull+DigiFull_2018+RecoFull_2018+HARVESTFull_2018+ALCAFull_2018+NanoFull_2018.log
10224.0 step3 runTheMatrix-results/10224.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017PU_GenSimFull+DigiFullPU_2017PU+RecoFullPU_2017PU+HARVESTFullPU_2017PU/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017PU_GenSimFull+DigiFullPU_2017PU+RecoFullPU_2017PU+HARVESTFullPU_2017PU.log
20034.0 step3 runTheMatrix-results/20034.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D17_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D17+RecoFullGlobal_2023D17+HARVESTFullGlobal_2023D17/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D17_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D17+RecoFullGlobal_2023D17+HARVESTFullGlobal_2023D17.log
21234.0 step3 runTheMatrix-results/21234.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D21_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D21+RecoFullGlobal_2023D21+HARVESTFullGlobal_2023D21/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D21_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D21+RecoFullGlobal_2023D21+HARVESTFullGlobal_2023D21.log
27434.0 step3 runTheMatrix-results/27434.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D35_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D35+RecoFullGlobal_2023D35+HARVESTFullGlobal_2023D35/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D35_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D35+RecoFullGlobal_2023D35+HARVESTFullGlobal_2023D35.log
250202.181 step4 runTheMatrix-results/250202.181_TTbar_13UP18+TTbar_13UP18+PREMIXUP18_PU25+DIGIPRMXLOCALUP18_PU25+RECOPRMXUP18_PU25+HARVESTUP18_PU25/step4_TTbar_13UP18+TTbar_13UP18+PREMIXUP18_PU25+DIGIPRMXLOCALUP18_PU25+RECOPRMXUP18_PU25+HARVESTUP18_PU25.log
I found errors in the following addon tests: cmsDriver.py TTbar_8TeV_TuneCUETP8M1_cfi --conditions auto:run1_mc --fast -n 100 --eventcontent AODSIM,DQM --relval 100000,1000 -s GEN,SIM,RECOBEFMIX,DIGI:pdigi_valid,L1,DIGI2RAW,L1Reco,RECO,EI,VALIDATION --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --datatier GEN-SIM-DIGI-RECO,DQMIO --beamspot Realistic8TeVCollision : FAILED - time: date Sat Dec 8 01:20:31 2018-date Sat Dec 8 01:19:40 2018 s - exit: 21248
Comparison not run due to runTheMatrix errors (RelVals and Igprof tests were also skipped)
Dear @slava77, I see that some of the "Matrix" and "AddOn" tests failed. By inspecting a few random log files, it seems to be related to the missing external.
Presumably I should ignore these for now?
Right, the run time failures were expected in this pass.
please test with cms-sw/cmsdist#4569
Hi @perrotta @slava77 @fabiocos,
First, I'm testing the timing requirements as requested by @perrotta; results to come.
Second, here is what I did for a rebase test. (I'm not a git expert, but I think the following is sensible.)
The following command yields no differences (i.e. empty output to stdout):
And this command:
yields the following (expected) differences:
If you are happy with this, how do I proceed? Do I force push to the ```bainbrid:cmssw/LowPtElectrons_10X``` branch?
Also, how will this affect the other PRs (from @gkaratha and @nancymarinelli) that depend on this branch?
Finally, should I also rebase onto CMSSW_10_2_X for the required backport to CMSSW_10_2? If so, do I open up a new PR with a ```LowPtElectrons_102X``` branch?
bainbrid <notifications@github.com> ha scritto:
If you are happy with this, how do I proceed? Do I force push to the
```bainbrid:cmssw/LowPtElectrons_10X``` branch?
As I said, I would avoid the extra complication of rebasing if not
needed. But I let @fabiocos answer here
Also, how will this affect the other PRs (from @gkaratha and
@nancymarinelli) that depend on this branch?
Those two PRs do not depend on your own, besides the fact that they
both need to be run together.
Thus, no effect is expected on them.
Finally, should I also rebase onto CMSSW_10_2_X for the required
back port to CMSSW_10_2? If so, then do I open up a new PR with a
```LowPtElectrons_102X``` branch?
You will open a new PR for 10_2, in a different branch.
… --
You are receiving this because you were mentioned.
Reply to this email directly or view it on GitHub:
#25455 (comment)
@bainbrid we have a cms-rebase-topic command documented in https://cms-sw.github.io/tutorial-resolve-conflicts.html, with some options that make life simpler when the old base is known. But as I said it is not a must; if @perrotta is concerned about possible drawbacks, let's move forward as it is.
I'm attempting to rebase onto 10_2_X.
and I see this output:
My changes do not touch anything in DataFormats, so why do I see this message and how should I react?
Hi @perrotta here are the latest timings from me, which appear to be consistent with your values.
@slava77 @perrotta @fabiocos I have rebased the
Thank you @bainbrid! The two rebased branches look correct: you can submit PRs with them. About your timing results, could you please tell us what has changed since your previous ones?
Do I make the 102X PR to cms-sw/master or cms-sw/CMSSW_10_2_X? I've already made the 105X PR to cms-sw/master. I hope this is correct?
bainbrid <notifications@github.com> ha scritto:
@perrotta
Do I make the 102X PR to cms-sw/master or cms-sw/CMSSW_10_2_X?
CMSSW_10_2_X
…
I've already made the 105X PR to cms-sw/master. I hope this is correct?
@bainbrid please answer the question in #25455 (comment), i.e.: "About your timing results, could you please tell us what has changed since your previous ones?"
This PR shouldn't be delayed any further if we want it and its descendants in 10_5_0_pre1.
Hi @perrotta,
The efficiency studies presented here in a RECO/AT were performed prior to, and independently of, the timing studies. These studies defined the WPs that have subsequently been used for e.g. the timing studies. The acceptance studies presented here depend solely on the pT threshold.
Concerning my old set of timing studies: they were performed with 100 events, but the _cfg is certainly different from the one I used yesterday to determine the latest set of timings. Perhaps I screwed up something with the old set, but it's not easy for me to say exactly what. What I can say is that the most recent set were based on a fresh working area and our numbers now agree. I have no reason to believe that there is any inconsistency between the efficiency and timing studies.
Finally, I note that the BDT models in cms-data will be replaced soon with new models trained on 10_2 MC samples. We expect the efficiencies to change, while the timing will remain the ~same. We can repeat these checks if necessary, but we hope this doesn't prevent/delay integration into the release.
Thank you @bainbrid. My concern was, of course, that the physics performance studies were performed on a different setup and working point than the one that will be merged in the release. My suggestion is to merge this PR as it is now, but to eventually redo the physics performance studies with exactly what is in this PR, just to verify that nothing got screwed up. The arrival of the new BDT models could be an appropriate time for making those further studies in a rather similar manner.
Moreover, it could be interesting to dig for the difference in the configuration that you used for your timing studies: if the timing improvement was due to some optimization not costly for the physics performance, it may be worth checking whether it can be implemented. Also this can happen after the integration of this PR in the release.
(removed the "+1" while waiting verification of the usage of reducedEcalRecHits)
-1
For the record, at the end of this thread:
This pull request concerns very low pT electron reconstruction, which is intended for use with the B Parking project. It contains a cms.Sequence configured to produce GsfTracks.
This is done with improved performance in the region 0.5 < pT < ~10 GeV with respect to the nominal EGamma chain.
Existing modules are cloned, a new EDProducer is provided, and new collections are made using existing DataFormats, to provide a fully standalone chain that does not touch the existing EGamma reconstruction.
The new LowPtGsfElectronSeedProducer relies on the output of two BDTs, which are provided via this pull request:
cms-data/RecoEgamma-ElectronIdentification#10
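The producer's constructor signature quoted above (taking a lowptgsfeleseed::HeavyObjectCache*) suggests CMSSW's global-cache pattern for stream modules, in which the BDT evaluators are built once and shared read-only across streams. A hypothetical class skeleton, assuming the usual edm::stream::EDProducer API (anything beyond the names quoted in this thread is an assumption):

```cpp
#include "FWCore/Framework/interface/stream/EDProducer.h"

// Sketch, not the PR's actual code.
class LowPtGsfElectronSeedProducer final
    : public edm::stream::EDProducer<edm::GlobalCache<lowptgsfeleseed::HeavyObjectCache>> {
public:
  explicit LowPtGsfElectronSeedProducer( const edm::ParameterSet&,
                                         const lowptgsfeleseed::HeavyObjectCache* );

  // Called once per job: load the BDT models into a cache shared by all streams.
  static std::unique_ptr<lowptgsfeleseed::HeavyObjectCache>
  initializeGlobalCache( const edm::ParameterSet& conf ) {
    return std::make_unique<lowptgsfeleseed::HeavyObjectCache>(conf);
  }
  static void globalEndJob( lowptgsfeleseed::HeavyObjectCache* ) {}

  void produce( edm::Event&, const edm::EventSetup& ) override;
};
```

This avoids duplicating the (heavy) BDT objects per stream while keeping event processing thread-safe, since the cache is only read after initialization.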
An additional pull request, which completes the electron reconstruction chain (adding SuperClusters, GsfElectronCores, GsfElectron, and ID), will be made on a short time scale.
Further PRs may define workflows for 1) a skim of the B parked data set, enriched in B->K(*)ll events, and 2) a data control sample of photon conversions to soft electrons.
@mverzett, @Sam-Harper, @nancymarinelli, and @gkaratha are interested in following this thread.
Cheers,
rob