
2023LReco and 2023dev scenarios defined up to local reco #13923

Merged - 25 commits merged into cms-sw:CMSSW_8_1_X on Apr 14, 2016

Conversation

@boudoul (Contributor) commented Apr 4, 2016

This PR defines the workflows up to local reco for 2023LReco (local-reco workflows with Run 2 / the Phase 2 flat tracker, as defined in 2023sim) and 2023dev (Run 2 but with the Phase 2 tilted tracker, now running up to local reco).

Two comments for the integration (adding @kpedro88 FYI):

  1. At this point we are still using customise functions; migrating this to the era paradigm will come later.
  2. The local reco runs only the tracker local reco (RECO:trackerlocalreco), since there is a problem with the ECAL local reco (due to bunchSpacingProducer); I assume this will gradually be fixed as the other subdetectors enter the game (see the sketch after this list).
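As an illustration of point 2, a minimal sketch of a configuration restricted to the tracker local reconstruction could look as follows. This is not the actual workflow fragment from this PR; the process name, input file, and the omitted geometry/conditions loads are placeholders.

import FWCore.ParameterSet.Config as cms

process = cms.Process("LocalReco")

# Placeholder input: in the real matrix workflows this step consumes the digi output.
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring("file:step2.root"))

# Geometry, magnetic field and conditions loads are omitted here for brevity.
process.load("RecoLocalTracker.Configuration.RecoLocalTracker_cff")

# Run only the tracker local reco (the RECO:trackerlocalreco restriction),
# skipping the ECAL local reco that currently fails via bunchSpacingProducer.
process.localreco_step = cms.Path(process.trackerlocalreco)
process.schedule = cms.Schedule(process.localreco_step)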

The geometry has been fixed thanks to @ebrondol and the digitizer thanks to @suchandradutta.

How to test:
runTheMatrix.py --what upgrade -l wfs
where the workflow numbers (wfs) are defined as follows:
2023LReco (new WFs): 110XX
2023dev: 106XX (as already defined in #13671)
I also checked that the already defined 2023sim workflows still run fine (108XX).

@delaere, @atricomi FYI

@ianna (Contributor) commented Apr 13, 2016

@boudoul - I tried to run this PR with CMSSW_8_1_X_2016-04-13-1100:

git cms-merge-topic 13923
scram b -j 10
runTheMatrix.py -l 10600
processing relval_standard
processing relval_highstats
processing relval_pileup
processing relval_generator
processing relval_extendedgen
processing relval_production
processing relval_ged
ignoring relval_upgrade from default matrix
processing relval_2017
ignoring relval_identity from default matrix
processing relval_machine
processing relval_unsch
processing relval_premix
Running in 4 thread(s)
 tests passed,  failed


@kpedro88 (Contributor)

@ianna - there's an extra option to get the upgrade workflows:

runTheMatrix.py -w upgrade -l 10600

@ianna (Contributor) commented Apr 13, 2016

@kpedro88 - Thanks! I picked up the instructions from the discussion in PR #13996 posted on your slides :-)

@kpedro88 (Contributor)

Ah, I hadn't noticed the flag was missing there as well. @boudoul, can you double-check your test from #13996?

@ianna (Contributor) commented Apr 13, 2016

@boudoul and @kpedro88 - the crash is fixed in my branch ianna:2023dev-scenario-fix. I think it was due to overlaps in the 2023dev scenario. Now the workflow needs some conditions for RPC:

Begin processing the 1st record. Run 1, Event 1, LumiSection 1 at 13-Apr-2016 17:11:43.284 CEST
----- Begin Fatal Exception 13-Apr-2016 17:11:52 CEST-----------------------
An exception of category 'DataCorrupt' occurred while
   [0] Processing run: 1 lumi: 1 event: 1
   [1] Running path 'digitisation_step'
   [2] Calling event method for module RPCDigiProducer/'simMuonRPCDigis'
Exception Message:
Exception from RPCSimSetUp - no noise information for DetId 637567042
----- End Fatal Exception -------------------------------------------------

BTW, @boudoul - I could not push it to your branch, so you can see it here: #14053

@kpedro88 (Contributor)

@ianna - Thanks!

@boudoul - You can probably cherry-pick 9833a29 into #13923 and test it there...

@bsunanda - FYI

@ianna (Contributor) commented Apr 13, 2016

@kpedro88 - what was the other crash? The link on your slides does not work for me.

@kpedro88 (Contributor)

@ianna - both links were discussions of the same crash. Here they are without Powerpoint messing them up:
#13996 (comment)
#13992 (diff)

@calabria (Contributor)

For the RPC conditions we used this customization in the past: SLHCUpgradeSimulations/Configuration/fixMissingUpgradeGTPayloads.fixRPCConditions. For the moment this can help us keep running, but I think it has to be ported to the era paradigm too. I will contact the people responsible for RPC.
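For illustration only, a customisation of this kind typically appends the missing payload on top of the GlobalTag roughly as sketched below. This is not the actual content of fixMissingUpgradeGTPayloads.py; the tag and connect strings are placeholders.

import FWCore.ParameterSet.Config as cms

def fixRPCConditions(process):
    # Add an extra RPC noise payload on top of the GlobalTag so that
    # RPCSimSetUp finds noise information for every DetId.
    process.GlobalTag.toGet.extend([
        cms.PSet(
            record  = cms.string("RPCStripNoisesRcd"),
            tag     = cms.string("SomeUpgradeRPCNoiseTag_mc"),               # placeholder tag name
            connect = cms.string("frontier://FrontierProd/CMS_CONDITIONS"),  # placeholder connect string
        )
    ])
    return process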

@kpedro88 (Contributor)

@calabria thanks, I think it is important to get that into Eras quickly. Though, I admit, I'm not sure exactly how we want to put process.GlobalTag.toGet.extend into Eras. @Dr15Jones, any thoughts?

https://github.com/cms-sw/cmssw/blob/CMSSW_8_1_X/SLHCUpgradeSimulations/Configuration/python/fixMissingUpgradeGTPayloads.py
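One conceivable way to hook such a process-level change into the era machinery is sketched below, under the assumption that a suitable modifier exists (the name phase2_muon here is assumed, not confirmed by this PR) and using makeProcessModifier to run a function over the whole process.

import FWCore.ParameterSet.Config as cms
from Configuration.StandardSequences.Eras import eras

def _addRPCNoisePayload(process):
    # Same idea as the fixRPCConditions sketch above; tag name is a placeholder.
    process.GlobalTag.toGet.extend([
        cms.PSet(record = cms.string("RPCStripNoisesRcd"),
                 tag    = cms.string("SomeUpgradeRPCNoiseTag_mc"))
    ])
    return process

# makeProcessModifier registers a function that is applied to the whole
# process when the chosen era/modifier is active, so the GlobalTag change
# would happen without an explicit --customise flag.
modifyConditionsForUpgradeRPC = eras.phase2_muon.makeProcessModifier(_addRPCNoisePayload)

Whether this is the preferred pattern is exactly the open question raised above.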

@cvuosalo (Contributor)

+1

For #13923 75b2654

Defining workflows for 2023LReco and 2023dev up to local reco. There should be no change in standard workflows.

The code changes are satisfactory, and Jenkins tests against baseline CMSSW_8_1_X_2016-04-12-2300 show no significant differences, as expected.

@davidlange6 (Contributor)

Hi @boudoul - @Degano added a new data package for your txt files. Could you remove them and update this pull request? Then I think it's ready to go.

@cmsbuild (Contributor)

Pull request #13923 was updated. @civanch, @Dr15Jones, @cvuosalo, @ianna, @mdhildreth, @fabozzi, @cmsbuild, @srimanob, @franzoni, @slava77, @hengne, @davidlange6 can you please check and sign again.

@boudoul (Contributor, Author) commented Apr 14, 2016

Thanks @Degano! I removed the files.

@davidlange6 (Contributor)

@ianna - could you turn your commit into a PR instead?

@davidlange6 merged commit f1e04a3 into cms-sw:CMSSW_8_1_X on Apr 14, 2016
@ianna (Contributor) commented Apr 15, 2016

@davidlange6 - the PR is there, just waiting for IB to rebase it: #14053

@kpedro88 (Contributor)

@calabria - I just noticed that those RPC customizations are included in #13992 already (in SimMuon/Configuration and SimMuon/RPCDigitizer).

@calabria (Contributor)

@kpedro88 - ah, great! So that issue is already fixed.
