
Low pT electrons (up to GsfTracks) #25455

Closed
wants to merge 19 commits

Conversation

bainbrid
Contributor

@bainbrid bainbrid commented Dec 7, 2018

This pull request concerns very low pT electron reconstruction, which is intended for use with the B Parking project. It contains a cms.Sequence configured to produce GsfTracks.

It provides improved performance in the region 0.5 < pT < ~10 GeV with respect to the nominal EGamma chain.

Existing modules are cloned, a new EDProducer is provided, and new collections are made using existing DataFormats, to provide a fully standalone chain that does not touch the existing EGamma reconstruction.
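The chain described above can be sketched as a CMSSW configuration fragment. This is only an illustration, not the PR's actual _cff: the module names are those that appear later in this thread, and the module definitions (clones of existing producers plus the new seed producer) are elided.

```python
import FWCore.ParameterSet.Config as cms

# Module definitions are elided here; the names follow this thread.
# The sequence is fully standalone and ends in GsfTracks, leaving the
# nominal EGamma reconstruction untouched.
lowPtGsfElectronSequence = cms.Sequence(
    lowPtGsfElePfTracks              # PFRecTracks for the new chain
    + lowPtGsfElectronSeeds          # BDT-based ElectronSeeds
    + lowPtGsfEleCkfTrackCandidates  # CKF track candidates from the seeds
    + lowPtGsfEleGsfTracks           # final GSF track fit
)
```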

The new LowPtGsfElectronSeedProducer relies on the output of two BDTs, which are provided via this pull request:
cms-data/RecoEgamma-ElectronIdentification#10

An additional pull request will be made on a short time scale that completes the electron reconstruction chain (adding SuperClusters, GsfElectronCores, GsfElectrons, and ID).

Further PRs may follow that define workflows for 1) a skim of the B-parked data set, enriched in B->K(*)ll events, and 2) a data control sample of photon conversions to soft electrons.

@mverzett , @Sam-Harper , @nancymarinelli , and @gkaratha are interested in following this thread.

Cheers,
rob

@cmsbuild
Contributor

cmsbuild commented Dec 7, 2018

The code-checks are being triggered in jenkins.

@cmsbuild
Contributor

cmsbuild commented Dec 7, 2018

@cmsbuild
Contributor

cmsbuild commented Dec 7, 2018

A new Pull Request was created by @bainbrid for master.

It involves the following packages:

Configuration/StandardSequences
RecoEgamma/Configuration
RecoEgamma/EgammaElectronProducers

@perrotta, @cmsbuild, @franzoni, @slava77, @fabiocos, @davidlange6 can you please review it and eventually sign? Thanks.
@varuns23, @Sam-Harper, @makortel, @felicepantaleo, @jainshilpi, @GiacomoSguazzoni, @rovere, @lgray, @Martin-Grunewald, @VinInn, @dgulhan this is something you requested to watch as well.
@davidlange6, @slava77, @fabiocos you are the release manager for this.

cms-bot commands are listed here

@slava77
Contributor

slava77 commented Dec 7, 2018

@cmsbuild please test

this will fail at run time due to the missing external.
The goal is to get all relevant compile/build time checks to be done (a large fraction of them is not run in a build with an external).

@cmsbuild
Contributor

cmsbuild commented Dec 7, 2018

The tests are being triggered in jenkins.
https://cmssdt.cern.ch/jenkins/job/ib-any-integration/32076/console Started: 2018/12/08 00:09

LowPtGsfElectronSeedProducer::LowPtGsfElectronSeedProducer( const edm::ParameterSet& conf,
const lowptgsfeleseed::HeavyObjectCache* ) :
kfTracks_{consumes<reco::TrackCollection>(conf.getParameter<edm::InputTag>("tracks"))},
pfTracks_{consumes<reco::PFRecTrackCollection>(conf.getParameter<edm::InputTag>("pfTracks"))},
Contributor

It would be better to consume "tracks" only if usePfTracks_ == false (and "pfTracks" if it is true).
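A minimal standalone sketch of the pattern being suggested here, with the edm consumes registry replaced by a toy stand-in so it compiles outside CMSSW (the types and labels are illustrative, not the framework's):

```cpp
#include <string>
#include <vector>

// Toy stand-in for the framework's consumes() registry, so the
// "consume only what you use" pattern can be shown self-contained.
struct MockConsumerBase {
  std::vector<std::string> registered;      // inputs declared to the framework
  int consumes(const std::string& label) {  // stand-in for edm::consumes<T>(...)
    registered.push_back(label);
    return static_cast<int>(registered.size()) - 1;  // fake token
  }
};

struct LowPtGsfElectronSeedProducer : MockConsumerBase {
  bool usePfTracks_;
  int kfTracks_ = -1;  // -1 marks "input not consumed"
  int pfTracks_ = -1;
  explicit LowPtGsfElectronSeedProducer(bool usePfTracks) : usePfTracks_(usePfTracks) {
    if (usePfTracks_)
      pfTracks_ = consumes("pfTracks");  // declare only the PF-track input...
    else
      kfTracks_ = consumes("tracks");    // ...or only the KF-track input
  }
};
```

With this arrangement the unused collection is never declared as an input, so the framework does not need to schedule its producer at all.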

Contributor Author

@bainbrid bainbrid Dec 10, 2018

Done, here


// PF tracks
edm::Handle<reco::PFRecTrackCollection> pfTracks;
event.getByToken(pfTracks_, pfTracks);
Contributor

Same here: it would be better to get only the collection that is needed.

Contributor Author

@bainbrid bainbrid Dec 10, 2018

Done, here


void produce( edm::Event&, const edm::EventSetup& ) override;

static void fillDescription( edm::ParameterSetDescription& );
Contributor

The proper signature is

Suggested change
static void fillDescription( edm::ParameterSetDescription& );
static void fillDescriptions( edm::ConfigurationDescriptions& );
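For reference, the plural entry point receives an edm::ConfigurationDescriptions and registers a per-module edm::ParameterSetDescription. A sketch of what it typically looks like (the parameter names and defaults here are illustrative, not necessarily the PR's):

```cpp
void LowPtGsfElectronSeedProducer::fillDescriptions(edm::ConfigurationDescriptions& descriptions) {
  edm::ParameterSetDescription desc;
  // Illustrative parameters only
  desc.add<edm::InputTag>("tracks", edm::InputTag("generalTracks"));
  desc.add<edm::InputTag>("pfTracks", edm::InputTag("lowPtGsfElePfTracks"));
  desc.add<bool>("usePfTracks", true);
  descriptions.add("lowPtGsfElectronSeeds", desc);
}
```

The framework uses this description to validate the python configuration and to generate a reference _cfi.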

Contributor Author

Done, here

@cmsbuild
Contributor

cmsbuild commented Dec 8, 2018

-1

Tested at: a556a68

You can see the results of the tests here:
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-25455/32076/summary.html

I found the following errors while testing this PR:

Failed tests: RelVals AddOn

  • RelVals:

When I ran the RelVals I found an error in the following workflows:
5.1 step1

runTheMatrix-results/5.1_TTbar+TTbarFS+HARVESTFS/step1_TTbar+TTbarFS+HARVESTFS.log

4.53 step3
runTheMatrix-results/4.53_RunPhoton2012B+RunPhoton2012B+HLTD+RECODR1reHLT+HARVESTDR1reHLT/step3_RunPhoton2012B+RunPhoton2012B+HLTD+RECODR1reHLT+HARVESTDR1reHLT.log

135.4 step1
runTheMatrix-results/135.4_ZEE_13+ZEEFS_13+HARVESTUP15FS+MINIAODMCUP15FS/step1_ZEE_13+ZEEFS_13+HARVESTUP15FS+MINIAODMCUP15FS.log

9.0 step3
runTheMatrix-results/9.0_Higgs200ChargedTaus+Higgs200ChargedTaus+DIGI+RECO+HARVEST/step3_Higgs200ChargedTaus+Higgs200ChargedTaus+DIGI+RECO+HARVEST.log

136.731 step3
runTheMatrix-results/136.731_RunSinglePh2016B+RunSinglePh2016B+HLTDR2_2016+RECODR2_2016reHLT_skimSinglePh_HIPM+HARVESTDR2/step3_RunSinglePh2016B+RunSinglePh2016B+HLTDR2_2016+RECODR2_2016reHLT_skimSinglePh_HIPM+HARVESTDR2.log

136.788 step3
runTheMatrix-results/136.788_RunSinglePh2017B+RunSinglePh2017B+HLTDR2_2017+RECODR2_2017reHLT_skimSinglePh_Prompt+HARVEST2017/step3_RunSinglePh2017B+RunSinglePh2017B+HLTDR2_2017+RECODR2_2017reHLT_skimSinglePh_Prompt+HARVEST2017.log

25.0 step3
runTheMatrix-results/25.0_TTbar+TTbar+DIGI+RECOAlCaCalo+HARVEST+ALCATT/step3_TTbar+TTbar+DIGI+RECOAlCaCalo+HARVEST+ALCATT.log

140.56 step2
runTheMatrix-results/140.56_RunHI2018+RunHI2018+RECOHID18+HARVESTDHI18/step2_RunHI2018+RunHI2018+RECOHID18+HARVESTDHI18.log

1306.0 step3
runTheMatrix-results/1306.0_SingleMuPt1_UP15+SingleMuPt1_UP15+DIGIUP15+RECOUP15+HARVESTUP15/step3_SingleMuPt1_UP15+SingleMuPt1_UP15+DIGIUP15+RECOUP15+HARVESTUP15.log

136.85 step3
runTheMatrix-results/136.85_RunEGamma2018A+RunEGamma2018A+HLTDR2_2018+RECODR2_2018reHLT_skimEGamma_Prompt_L1TEgDQM+HARVEST2018_L1TEgDQM/step3_RunEGamma2018A+RunEGamma2018A+HLTDR2_2018+RECODR2_2018reHLT_skimEGamma_Prompt_L1TEgDQM+HARVEST2018_L1TEgDQM.log

158.0 step3
runTheMatrix-results/158.0_HydjetQ_B12_5020GeV_2018_ppReco+HydjetQ_B12_5020GeV_2018_ppReco+DIGIHI2018PPRECO+RECOHI2018PPRECO+ALCARECOHI2018PPRECO+HARVESTHI2018PPRECO/step3_HydjetQ_B12_5020GeV_2018_ppReco+HydjetQ_B12_5020GeV_2018_ppReco+DIGIHI2018PPRECO+RECOHI2018PPRECO+ALCARECOHI2018PPRECO+HARVESTHI2018PPRECO.log

1000.0 step2
runTheMatrix-results/1000.0_RunMinBias2011A+RunMinBias2011A+TIER0+SKIMD+HARVESTDfst2+ALCASPLIT/step2_RunMinBias2011A+RunMinBias2011A+TIER0+SKIMD+HARVESTDfst2+ALCASPLIT.log

1001.0 step2
runTheMatrix-results/1001.0_RunMinBias2011A+RunMinBias2011A+TIER0EXP+ALCAEXP+ALCAHARVDSIPIXELCALRUN1+ALCAHARVD1+ALCAHARVD2+ALCAHARVD3+ALCAHARVD4+ALCAHARVD5/step2_RunMinBias2011A+RunMinBias2011A+TIER0EXP+ALCAEXP+ALCAHARVDSIPIXELCALRUN1+ALCAHARVD1+ALCAHARVD2+ALCAHARVD3+ALCAHARVD4+ALCAHARVD5.log

1330.0 step3
runTheMatrix-results/1330.0_ZMM_13+ZMM_13+DIGIUP15+RECOUP15_L1TMuDQM+HARVESTUP15_L1TMuDQM/step3_ZMM_13+ZMM_13+DIGIUP15+RECOUP15_L1TMuDQM+HARVESTUP15_L1TMuDQM.log

10042.0 step3
runTheMatrix-results/10042.0_ZMM_13+ZMM_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017/step3_ZMM_13+ZMM_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017.log

10024.0 step3
runTheMatrix-results/10024.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017_GenSimFull+DigiFull_2017+RecoFull_2017+HARVESTFull_2017+ALCAFull_2017.log

25202.0 step3
runTheMatrix-results/25202.0_TTbar_13+TTbar_13+DIGIUP15_PU25+RECOUP15_PU25+HARVESTUP15_PU25/step3_TTbar_13+TTbar_13+DIGIUP15_PU25+RECOUP15_PU25+HARVESTUP15_PU25.log

11624.0 step3
runTheMatrix-results/11624.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2019_GenSimFull+DigiFull_2019+RecoFull_2019+HARVESTFull_2019+ALCAFull_2019/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2019_GenSimFull+DigiFull_2019+RecoFull_2019+HARVESTFull_2019+ALCAFull_2019.log

10824.0 step3
runTheMatrix-results/10824.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2018_GenSimFull+DigiFull_2018+RecoFull_2018+HARVESTFull_2018+ALCAFull_2018+NanoFull_2018/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2018_GenSimFull+DigiFull_2018+RecoFull_2018+HARVESTFull_2018+ALCAFull_2018+NanoFull_2018.log

10224.0 step3
runTheMatrix-results/10224.0_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017PU_GenSimFull+DigiFullPU_2017PU+RecoFullPU_2017PU+HARVESTFullPU_2017PU/step3_TTbar_13+TTbar_13TeV_TuneCUETP8M1_2017PU_GenSimFull+DigiFullPU_2017PU+RecoFullPU_2017PU+HARVESTFullPU_2017PU.log

20034.0 step3
runTheMatrix-results/20034.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D17_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D17+RecoFullGlobal_2023D17+HARVESTFullGlobal_2023D17/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D17_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D17+RecoFullGlobal_2023D17+HARVESTFullGlobal_2023D17.log

21234.0 step3
runTheMatrix-results/21234.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D21_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D21+RecoFullGlobal_2023D21+HARVESTFullGlobal_2023D21/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D21_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D21+RecoFullGlobal_2023D21+HARVESTFullGlobal_2023D21.log

27434.0 step3
runTheMatrix-results/27434.0_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D35_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D35+RecoFullGlobal_2023D35+HARVESTFullGlobal_2023D35/step3_TTbar_14TeV+TTbar_14TeV_TuneCUETP8M1_2023D35_GenSimHLBeamSpotFull14+DigiFullTrigger_2023D35+RecoFullGlobal_2023D35+HARVESTFullGlobal_2023D35.log

250202.181 step4
runTheMatrix-results/250202.181_TTbar_13UP18+TTbar_13UP18+PREMIXUP18_PU25+DIGIPRMXLOCALUP18_PU25+RECOPRMXUP18_PU25+HARVESTUP18_PU25/step4_TTbar_13UP18+TTbar_13UP18+PREMIXUP18_PU25+DIGIPRMXLOCALUP18_PU25+RECOPRMXUP18_PU25+HARVESTUP18_PU25.log

  • AddOn:

I found errors in the following addon tests:

cmsDriver.py TTbar_8TeV_TuneCUETP8M1_cfi --conditions auto:run1_mc --fast -n 100 --eventcontent AODSIM,DQM --relval 100000,1000 -s GEN,SIM,RECOBEFMIX,DIGI:pdigi_valid,L1,DIGI2RAW,L1Reco,RECO,EI,VALIDATION --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --datatier GEN-SIM-DIGI-RECO,DQMIO --beamspot Realistic8TeVCollision : FAILED - time: date Sat Dec 8 01:20:31 2018-date Sat Dec 8 01:19:40 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake2,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_Fake2 --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2016 --processName=HLTRECO --filein file:RelVal_Raw_Fake2_DATA.root --fileout file:RelVal_Raw_Fake2_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:22:06 2018-date Sat Dec 8 01:19:46 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:PRef,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_PRef --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_PRef_DATA.root --fileout file:RelVal_Raw_PRef_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:23:38 2018-date Sat Dec 8 01:19:48 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:HIon,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_HIon --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018_pp_on_AA --processName=HLTRECO --filein file:RelVal_Raw_HIon_DATA.root --fileout file:RelVal_Raw_HIon_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:25:21 2018-date Sat Dec 8 01:19:49 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:GRun,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_GRun --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_GRun_MC.root --fileout file:RelVal_Raw_GRun_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:29:57 2018-date Sat Dec 8 01:19:54 2018 s - exit: 21248
cmsDriver.py TTbar_13TeV_TuneCUETP8M1_cfi --conditions auto:run2_mc_l1stage1 --fast -n 100 --eventcontent AODSIM,DQM --relval 100000,1000 -s GEN,SIM,RECOBEFMIX,DIGI:pdigi_valid,L1,DIGI2RAW,L1Reco,RECO,EI,VALIDATION --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --datatier GEN-SIM-DIGI-RECO,DQMIO --beamspot NominalCollision2015 --era Run2_25ns : FAILED - time: date Sat Dec 8 01:20:46 2018-date Sat Dec 8 01:19:56 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:PRef,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_PRef --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_PRef_MC.root --fileout file:RelVal_Raw_PRef_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:27:31 2018-date Sat Dec 8 01:19:58 2018 s - exit: 21248
cmsDriver.py TTbar_13TeV_TuneCUETP8M1_cfi --conditions auto:run2_mc --fast -n 100 --eventcontent AODSIM,DQM --relval 100000,1000 -s GEN,SIM,RECOBEFMIX,DIGI:pdigi_valid,L1,DIGI2RAW,L1Reco,RECO,EI,VALIDATION --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --datatier GEN-SIM-DIGI-RECO,DQMIO --beamspot NominalCollision2015 --era Run2_2016 : FAILED - time: date Sat Dec 8 01:22:13 2018-date Sat Dec 8 01:20:32 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:PIon,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_PIon --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_PIon_DATA.root --fileout file:RelVal_Raw_PIon_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:24:25 2018-date Sat Dec 8 01:20:35 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run1_mc_Fake --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --processName=HLTRECO --filein file:RelVal_Raw_Fake_MC.root --fileout file:RelVal_Raw_Fake_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:27:29 2018-date Sat Dec 8 01:20:48 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:HIon,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_HIon --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018_pp_on_AA --processName=HLTRECO --filein file:RelVal_Raw_HIon_MC.root --fileout file:RelVal_Raw_HIon_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:30:15 2018-date Sat Dec 8 01:22:13 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake2,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_Fake2 --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2016 --processName=HLTRECO --filein file:RelVal_Raw_Fake2_MC.root --fileout file:RelVal_Raw_Fake2_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:28:06 2018-date Sat Dec 8 01:22:17 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake1,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_Fake1 --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_25ns --processName=HLTRECO --filein file:RelVal_Raw_Fake1_MC.root --fileout file:RelVal_Raw_Fake1_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:28:53 2018-date Sat Dec 8 01:23:38 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run1_data_Fake --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --processName=HLTRECO --filein file:RelVal_Raw_Fake_DATA.root --fileout file:RelVal_Raw_Fake_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:27:32 2018-date Sat Dec 8 01:24:54 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:GRun,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_GRun --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_GRun_DATA.root --fileout file:RelVal_Raw_GRun_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:31:49 2018-date Sat Dec 8 01:25:25 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:Fake1,RAW2DIGI,L1Reco,RECO --data --scenario=pp -n 10 --conditions auto:run2_data_Fake1 --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_25ns --processName=HLTRECO --filein file:RelVal_Raw_Fake1_DATA.root --fileout file:RelVal_Raw_Fake1_DATA_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:29:11 2018-date Sat Dec 8 01:27:38 2018 s - exit: 21248
cmsDriver.py RelVal -s HLT:PIon,RAW2DIGI,L1Reco,RECO --mc --scenario=pp -n 10 --conditions auto:run2_mc_PIon --relval 9000,50 --datatier "RAW-HLT-RECO" --eventcontent FEVTDEBUGHLT --customise=HLTrigger/Configuration/CustomConfigs.L1THLT --era Run2_2018 --processName=HLTRECO --filein file:RelVal_Raw_PIon_MC.root --fileout file:RelVal_Raw_PIon_MC_HLT_RECO.root : FAILED - time: date Sat Dec 8 01:31:48 2018-date Sat Dec 8 01:27:42 2018 s - exit: 21248

@cmsbuild
Contributor

cmsbuild commented Dec 8, 2018

Comparison not run due to runTheMatrix errors (RelVals and Igprof tests were also skipped)

@bainbrid
Contributor Author

Dear @slava77, I see that some of the "Matrix" and "AddOn" tests failed. By inspecting a few random log files, it seems to be related to the missing external:

Exception Message:
edm::FileInPath unable to find file RecoEgamma/ElectronIdentification/data/LowPtElectrons/2018Nov01_improvedfullseeding_formatted.xml.gz anywhere in the search path.

Presumably I should ignore these for now?

@slava77
Contributor

slava77 commented Dec 10, 2018

Dear @slava77, I see that some of the "Matrix" and "AddOn" tests failed. By inspecting a few random log files, it seems to be related to the missing external:

Exception Message:
edm::FileInPath unable to find file RecoEgamma/ElectronIdentification/data/LowPtElectrons/2018Nov01_improvedfullseeding_formatted.xml.gz anywhere in the search path.

Presumably I should ignore these for now?

right, the run time failures were expected in this pass.
It's nice to see that the header checks and the static code analysis returned no issues.

@mrodozov
Contributor

please test with cms-sw/cmsdist#4569

@bainbrid
Contributor Author

Hi @perrotta @slava77 @fabiocos,

First, I'm testing the timing requirements as requested by @perrotta - results to come.

Second, here is what I did for a rebase test (I'm not a git expert, but I think the following is sensible):

cmsrel CMSSW_10_5_X_2019-01-14-2300
cd CMSSW_10_5_X_2019-01-14-2300/src
cmsenv
git cms-init
git remote add bainbrid git@github.com:bainbrid/cmssw.git
git cms-merge-topic bainbrid:LowPtElectrons_10X
cd $CMSSW_BASE/src
git checkout CMSSW_10_5_X_2019-01-14-2300 # puts me in detached HEAD state
git checkout -b CMSSW_10_5_X_2019-01-14-2300
git checkout bainbrid/LowPtElectrons_10X
git rebase --onto CMSSW_10_5_X_2019-01-14-2300 d74dd1812ca HEAD 
git checkout -b LowPtElectrons_10X_rebased

The following command yields no differences (i.e. empty output to stdout):

git diff --name-only from-CMSSW_10_5_X_2019-01-14-2300 LowPtElectrons_10X_rebased

And this command:

git diff --name-only CMSSW_10_5_X_2019-01-14-2300 LowPtElectrons_10X_rebased

yields the following (expected) differences:

Configuration/Eras/python/Modifier_bParking_cff.py
Configuration/StandardSequences/python/Eras.py
Configuration/StandardSequences/python/Reconstruction_cff.py
FastSimulation/Tracking/plugins/ElectronSeedTrackRefFix.cc
FastSimulation/Tracking/plugins/ElectronSeedTrackRefFix.h
FastSimulation/Tracking/python/ElectronSeedTrackRefFix_cfi.py
RecoEgamma/Configuration/python/RecoEgamma_EventContent_cff.py
RecoEgamma/EgammaElectronProducers/BuildFile.xml
RecoEgamma/EgammaElectronProducers/interface/LowPtGsfElectronSeedHeavyObjectCache.h
RecoEgamma/EgammaElectronProducers/plugins/BuildFile.xml
RecoEgamma/EgammaElectronProducers/plugins/LowPtGsfElectronSeedProducer.cc
RecoEgamma/EgammaElectronProducers/plugins/LowPtGsfElectronSeedProducer.h
RecoEgamma/EgammaElectronProducers/plugins/SealModule.cc
RecoEgamma/EgammaElectronProducers/python/lowPtGsfElectronSeeds_cfi.py
RecoEgamma/EgammaElectronProducers/python/lowPtGsfElectronSequence_cff.py
RecoEgamma/EgammaElectronProducers/src/LowPtGsfElectronSeedHeavyObjectCache.cc
RecoParticleFlow/PFTracking/python/trackerDrivenElectronSeeds_cfi.py

If you are happy with this, how do I proceed? Do I force push to the bainbrid:cmssw/LowPtElectrons_10X branch?

Also, how will this affect the other PRs (from @gkaratha and @nancymarinelli) that depend on this branch?

Finally, should I also rebase onto CMSSW_10_2_X for the required back port to CMSSW_10_2? If so, then do I open up a new PR with a LowPtElectrons_102X branch?

@perrotta
Contributor

perrotta commented Jan 15, 2019 via email

@fabiocos
Contributor

@bainbrid we have a cms-rebase-topic command, documented at https://cms-sw.github.io/tutorial-resolve-conflicts.html, with some options to make life simpler when the old base is known. But, as I said, it is not a must: if @perrotta is concerned about possible drawbacks, let's move forward as it is.

@bainbrid
Contributor Author

I'm attempting to rebase onto 10_2_X.
Here are the commands I've used:

cmsrel CMSSW_10_2_X_2019-01-15-1100
cd CMSSW_10_2_X_2019-01-15-1100/src
cmsenv
git cms-init
git remote add bainbrid git@github.com:bainbrid/cmssw.git
git cms-rebase-topic bainbrid:LowPtElectrons_10X

and I see this output:

First, rewinding head to replay your work on top of it...
Generating patches: 100% (2375/2375), done.
Applying: Added new classe HGCalTriggerSums
Applying: Added new classe HGCalTriggerSums
Using index info to reconstruct a base tree...
M	DataFormats/L1THGCal/src/classes.h
M	DataFormats/L1THGCal/src/classes_def.xml
.git/rebase-apply/patch:59: trailing whitespace.

warning: 1 line adds whitespace errors.
Falling back to patching base and 3-way merge...
Auto-merging DataFormats/L1THGCal/src/classes_def.xml
CONFLICT (content): Merge conflict in DataFormats/L1THGCal/src/classes_def.xml
Auto-merging DataFormats/L1THGCal/src/classes.h
CONFLICT (content): Merge conflict in DataFormats/L1THGCal/src/classes.h
error: Failed to merge in the changes.
Patch failed at 0002 Added new classe HGCalTriggerSums
Use 'git am --show-current-patch' to see the failed patch

Resolve all conflicts manually, mark them as resolved with
"git add/rm <conflicted_files>", then run "git rebase --continue".
You can instead skip this commit: run "git rebase --skip".
To abort and get back to the state before "git rebase", run "git rebase --abort".

My changes do not touch anything in DataFormats, so why do I see this message and how should I react?

@bainbrid
Contributor Author

Hi @perrotta here are the latest timings from me, which appear to be consistent with your values.

Tight WP, pT > 1 GeV:         Bainbridge        Perrotta
lowPtGsfElectronSeeds         (300.86) +3.68%   (266.42) +3.19%
lowPtGsfEleGsfTracks          ( 84.73) +1.04%   ( 77.35) +0.93%
lowPtGsfEleCkfTrackCandidates ( 11.32) +0.14%   ( 10.51) +0.13%
lowPtGsfElePfTracks           (  6.51) +0.08%   (  6.16) +0.07%
TOTAL                                  +4.93%            +4.32%

@slava77 @perrotta @fabiocos I have rebased the LowPtElectrons_10X branch onto 10_5_X here and 10_2_X here. The latter will be the basis of a new PR. Please can you instruct me on how to proceed?

@perrotta
Contributor

Thank you @bainbrid !

The two rebased branches look correct: you can submit PRs with them.

About your timing results, could you please tell us what has changed since your previous ones?

  • Did you use the very same test setup but a different number of events (10 vs 100)?
  • Did you use a different setup when you ran those tests previously? And if so, which setup was used for the validation results (efficiencies, etc.) that you presented last week at the reco meeting?

@bainbrid
Contributor Author

@perrotta

Do I make the 102X PR to cms-sw/master or cms-sw/CMSSW_10_2_X?

I've already made the 105X PR to cms-sw/master. I hope this is correct?

@perrotta
Contributor

perrotta commented Jan 16, 2019 via email

@perrotta
Contributor

@bainbrid please answer the question in #25455 (comment), i.e.:

"About your timing results, could you please tell us what has changed since your previous ones?

  • Did you use the very same test setup but a different number of events (10 vs 100)?
  • Did you use a different setup when you ran those tests previously? And if so, which setup was used for the validation results (efficiencies, etc.) that you presented last week at the reco meeting?"

This PR shouldn't be delayed any further if we want it and its descendants in 10_5_0_pre1.

@bainbrid
Contributor Author

Hi @perrotta,

The efficiency studies presented here in a RECO/AT were performed prior to, and independently of, the timing studies. These studies defined the WPs that have been subsequently used for e.g. the timing studies. The acceptance studies presented here depend solely on the pT threshold.

Concerning my old set of timing studies: they were performed with 100 events, but the _cfg is certainly different from the one I used yesterday to determine the latest set of timings. Perhaps I screwed up something with the old set, but it's not easy for me to say exactly what. What I can say is that the most recent set was based on a fresh working area, and our numbers now agree. I have no reason to believe that there is any inconsistency between the efficiency and timing studies.

Finally, I note that the BDT models in cms-data will be replaced soon with new models trained on 10_2 MC samples. We expect the efficiencies to change, while the timing will remain the ~same. We can repeat these checks if necessary, but we hope this doesn't prevent/delay integration into the release.

@perrotta
Contributor

Thank you @bainbrid

My concern was, of course, that the physics performance studies were performed on a different setup and working point than the one that will be merged in the release.

My suggestion is to merge this PR as it is now, but to eventually redo the physics performance studies with exactly what is in this PR, just to verify that nothing got screwed up. The arrival of the new BDT models could be an appropriate occasion for making those further studies on a rather similar model.

Moreover, it could be interesting to dig into the difference in the configuration that you used for your timing studies: if the timing improvement was due to some optimization that is not costly for the physics performance, it may be worth checking whether it can be implemented. This too can happen after the integration of this PR in the release.

@perrotta
Contributor

perrotta commented Jan 17, 2019

(removed the "+1" while awaiting verification of the usage of reducedEcalRecHits)

  • New GSF electrons, with higher efficiency expected at low pT, are defined here to be used in the reconstruction of the parked B dataset
  • The extra product will just be added to the Event, but it will not be used to build any further higher-level object (e.g. PF) for the moment
  • The loose working point to be used for the parked dataset implies a rather heavy CPU usage (it amounts to ~12.4% of the overall reco + miniAOD time). For the normal workflows a tighter working point was defined, whose CPU usage corresponds to ~4.3% of the total reco + miniAOD processing time; the efficiency for low pT electrons gets reduced by some 25% with respect to the looser working point, though.
  • The tests show that only the new objects are added to the output, while nothing else shows differences in reco or miniAOD: the jenkins tests report about 12 differences in the reco comparisons from the Phase2 workflows, and they all correspond to the additional warning messages issued by the PFTrackProducer:lowPtGsfElePfTracks module in step3, on top of the same warnings coming from the already existing PFTrackProducer:pfTrack and PFTrackProducer:uncleanedOnlyPfTrack modules, also present in the baseline.

@perrotta
Contributor

I removed the "+1" while awaiting verification of the usage of reducedEcalRecHits instead of the full ecalRecHits.
Since a rebased version of this PR is available, #25679, I suggest closing this one and continuing the discussion there: @bainbrid

@perrotta
Contributor

-1
(replaced by #25679)

@fabiocos
Contributor

@bainbrid @perrotta @slava77 while this PR will stay as documentation of the review, it is effectively superseded by #25679, so I suggest closing it to avoid useless duplication.

@fabiocos
Contributor

@bainbrid as #25679 has been merged, I would say that this PR may just be closed and kept as a record of the review.

@bainbrid
Contributor Author

For the record, at the end of this thread:

@bainbrid bainbrid closed this Jan 24, 2019
@bainbrid bainbrid deleted the LowPtElectrons_10X branch August 6, 2019 12:20