Include ZDC in L1T re-emulation workflows #43214
Comments
A new Issue was created by @missirol Marino Missiroli.

@Dr15Jones, @sextonkennedy, @smuzaffar, @rappoccio, @antoniovilela, @makortel can you please review it and eventually sign/assign? Thanks. cms-bot commands are listed here.

assign l1

New categories assigned: l1

@epalencia, @aloeliger you have been requested to review this Pull request/Issue and eventually sign. Thanks.
Might be useful to tag them here as well, if you know the GitHub accounts.
I have a first draft of a ZDC packer here (https://github.com/matt2275/cmssw/tree/ZDCUnpacker/EventFilter/L1TRawToDigi). I have not done any testing yet. One way of testing this would be to produce two sets of digis: one that was only unpacked once, and another that was unpacked, repacked, and unpacked again. If the two sets are identical, it's a good sign the packer is working properly. Unfortunately, I'm not sure how to go about doing that.
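The round-trip check described above can be sketched in plain Python. The toy `pack`/`unpack` functions below are stand-ins for the real ZDC packer/unpacker in `EventFilter/L1TRawToDigi`, and the 4-byte record format is purely illustrative, not the real ZDC raw-data format:

```python
# Toy round-trip check: unpack -> repack -> unpack, then compare digi sets.
# Record format here (16-bit channel id + 16-bit ADC) is an assumption.
import struct

def pack(digis):
    """Pack (channel, adc) pairs into raw bytes."""
    return b"".join(struct.pack("<HH", ch, adc) for ch, adc in digis)

def unpack(raw):
    """Unpack raw bytes back into (channel, adc) pairs."""
    return [struct.unpack_from("<HH", raw, i) for i in range(0, len(raw), 4)]

digis_once = [(0, 120), (1, 98), (2, 255)]    # digis "unpacked once"
digis_twice = unpack(pack(digis_once))        # unpacked -> repacked -> unpacked
assert [tuple(d) for d in digis_twice] == digis_once  # identical: packer self-consistent
```

In CMSSW terms, the same comparison would be done between two digi collections in the event, e.g. with a small analyzer that diffs them channel by channel.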
@cms-sw/l1-l2 What's the status of this issue?

What's the status of this issue?
IIUC, part of this was done at #44019. What about the other item?
tagging also @Michael-Krohn, @cfmcginn, @abdoulline |
I hope @Michael-Krohn can correct me, but my understanding of the caveat with ZDC TP emulation (#42818) is the following. Because: …
Yes, that is also my understanding. Assuming that the geometry issues are fixed by @bsunanda, are there still other changes that we need to make for this?
The example in [*] for … is below. Is the test in [*] with … set up correctly?

[*]

```bash
#!/bin/bash
# CMSSW_14_1_0_pre5
execmd="hltGetConfiguration /dev/CMSSW_14_0_0/HIon --no-prescale --no-output --max-events 100"
execmd+=" --paths HLTriggerFirstPath,HLTriggerFinalPath,HLTAnalyzerEndpath"
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseHLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  > hltData_ref.py && cmsRun hltData_ref.py &> hltData_ref.log
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseL1THLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  --eras Run3 --l1-emulator uGT \
  > hltData_L1uGT.py && cmsRun hltData_L1uGT.py &> hltData_L1uGT.log
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseL1THLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  --eras Run3 --l1-emulator Full \
  > hltData_L1Full.py && cmsRun hltData_L1Full.py &> hltData_L1Full.log
```
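To compare the three runs above, one can diff the per-path accept counts in the TrigReport summaries of the log files. A minimal sketch, assuming the standard cmsRun trigger-report column layout (the exact column positions are an assumption here):

```python
# Hedged sketch: extract per-path accept counts from cmsRun TrigReport lines
# and diff them between a reference run and a re-emulated run.
import re

def path_counts(log_text):
    """Extract {path name: events passed} from a cmsRun TrigReport summary."""
    counts = {}
    for line in log_text.splitlines():
        # Assumed columns: bit, prescale index, run, passed, failed, error, name.
        m = re.match(r"TrigReport\s+\d+\s+\d+\s+\d+\s+(\d+)\s+\d+\s+\d+\s+(\S+)\s*$", line)
        if m:
            counts[m.group(2)] = int(m.group(1))
    return counts

# Toy one-line excerpts standing in for the contents of the two log files:
ref = path_counts("TrigReport     1    0        100         42         58          0 HLTriggerFirstPath")
emu = path_counts("TrigReport     1    0        100         40         60          0 HLTriggerFirstPath")
diff = {p: (ref[p], emu[p]) for p in ref if emu.get(p) != ref[p]}
print(diff)  # {'HLTriggerFirstPath': (42, 40)}
```

In practice one would read hltData_ref.log and hltData_L1uGT.log (or hltData_L1Full.log) and diff the two dictionaries.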
Hi @missirol, I would assume that is still the case. Let me ping @aloeliger and @epalencia directly and hopefully they can have a look at this.
Hi all, Thanks for your help with this. Yes, it would be good to have a look and discuss as needed. We have some workforce that could help make the required changes, as long as we have a good understanding of what is needed. Please let me know if it would be helpful to discuss offline.
I'm not sure whether … The actual ZDC DB conditions contain Run1 legacy constants, and recently (in 14_0) ZDCDetId was changed.
I have not tried to run the ZDC TP emulation as-is (with the Run1 legacy DB conditions and the emulation in 14_0), but I am not surprised that there are zero events passing the ZDC trigger. In our current version of the condition HcalLutMetadata there are no ZDC channels. This condition is used to construct the output LUT for all channels, including the ZDC ones. Specifically, a calibration factor is grabbed there (https://github.com/cms-sw/cmssw/blob/master/CalibCalorimetry/HcalTPGAlgos/src/HcaluLUTTPGCoder.cc#L633) and multiplied into the output energy of each channel. With this value not set for the ZDC channels in the current conditions, I'm not sure what value is being multiplied here, but if it defaults to 0, then the output energy for every EM and HAD channel in the emulation would be 0.
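The suspected failure mode can be illustrated with a toy output-LUT builder: if the per-channel calibration factor from HcalLutMetadata silently defaults to 0 for channels absent from the condition, every output word for those channels is 0. The channel names and LUT size below are purely illustrative:

```python
# Toy output LUT: each channel's LUT is its input ADC scaled by a per-channel
# calibration factor. Channels missing from the condition fall back to 0,
# which zeroes the whole LUT -- the suspicion described above for ZDC.
def output_lut(channel, calib, lut_size=128):
    gain = calib.get(channel, 0.0)  # ZDC channel absent -> gain 0 (assumed default)
    return [int(round(adc * gain)) for adc in range(lut_size)]

calib = {"HB_ieta1_iphi1": 0.5}                  # condition knows HB/HE/HF channels only
assert any(output_lut("HB_ieta1_iphi1", calib))  # barrel channel: non-trivial LUT
assert not any(output_lut("ZDC_EM1", calib))     # ZDC channel: all zeros
```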
Afaiu, …
@eyigitba, I think that's indeed the case, see #45712. Note that even with #45712, the results of …
Okay, I finally have some time to work on this. @missirol, is this recipe (#43214 (comment)) still accurate for recreating the problem?
@missirol I am unable to recreate this particular check on lxplus for CMSSW_14_1_0_pre5. It fails with the error: …
This error means that … In the …
I think the issue is simpler. I'll have a look, and provide an updated recipe.
I think so. The ConfDB db->python converter has some logic to add the proper …

Yes.

That happens because …
The updated recipe is in [*], tested with CMSSW_14_1_1.

[*]

```bash
#!/bin/bash
# CMSSW_14_1_1
execmd="hltGetConfiguration /dev/CMSSW_14_1_0/HIon/V26 --no-prescale --no-output --max-events 100"
execmd+=" --paths HLTriggerFirstPath,HLTriggerFinalPath,HLTAnalyzerEndpath"
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseHLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  > hltData_ref.py && cmsRun hltData_ref.py &> hltData_ref.log
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseL1THLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  --eras Run3 --l1-emulator uGT \
  > hltData_L1uGT.py && cmsRun hltData_L1uGT.py &> hltData_L1uGT.log
${execmd} \
  --customise HLTrigger/Configuration/CustomConfigs.customiseL1THLTforHIonRepackedRAWPrime \
  --input file:/eos/cms/store/user/cmsbuild/store/hidata/HIRun2023A/HIPhysicsRawPrime0/RAW/v1/000/375/491/00000/de963321-c0a0-49fb-b771-1a312a69db03.root \
  --globaltag 140X_dataRun3_HLT_v3 \
  --data \
  --eras Run3 --l1-emulator Full \
  > hltData_L1Full.py && cmsRun hltData_L1Full.py &> hltData_L1Full.log
```
@missirol Okay, I see the issue, and the surface-level answer is both simple and unsatisfying; see Configuration/StandardSequences/python/SimL1EmulatorRepack_uGT_cff.py, lines 24 to 50 at 4a85623.
By nature, I am inclined to trust the thing with the fewest moving parts here, the …

Digging a little deeper, and looking at the two logs: there are inaccuracies in SingleJet seeds, surprisingly no inaccuracies in single or double EG seeds, and small inaccuracies in muon seeds (I saw 1 disagreement for …). The muon and EG thing is telling. The thing the worst performers have in common is that they are all ultimately fed by unpacked HCAL trigger primitives, and the re-emulation starts from there; see Configuration/StandardSequences/python/SimL1EmulatorRepack_Full_cff.py, lines 127 to 129 at 4a85623.
(Slightly curious to me here is that this is done in Configuration/StandardSequences/python/SimL1EmulatorRepack_Full_cff.py, lines 131 to 132 at 4a85623.)
It is not out of the realm of possibility that the calo triggers and the ZDC emulation both have emulation inaccuracies, but the common factor here is the HCAL trigger primitive unpacking, so I suspect that this is ultimately an HCAL unpacking bug. The main portion of the HCAL unpacker itself seems to have been unchanged for 4 years or so (https://github.com/cms-sw/cmssw/tree/master/EventFilter/HcalRawToDigi/plugins), so unless we've been suffering this for the last 4 years, we really need to get an HCAL expert to talk about what might have changed in the FED(s), or in the unpacking chain.
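For reference, the kind of unpacked-vs-re-emulated TP comparison being discussed can be sketched as a per-detid diff of the sample-of-interest (SOI) compressed Et; the ids and Et values below are made up for illustration:

```python
# Sketch: index each HCAL TP collection by detector id and report every id
# whose SOI compressed Et disagrees between the unpacked and re-emulated sets.
def tp_diff(unpacked, reemulated):
    """Inputs: {detid: soi_compressed_et}. Returns {detid: (unpacked, re-emulated)}."""
    ids = set(unpacked) | set(reemulated)
    return {i: (unpacked.get(i), reemulated.get(i))
            for i in sorted(ids) if unpacked.get(i) != reemulated.get(i)}

unpacked   = {0x4007: 12, 0x4008: 0, 0x4009: 7}
reemulated = {0x4007: 12, 0x4008: 3, 0x400A: 1}
print(tp_diff(unpacked, reemulated))  # {16392: (0, 3), 16393: (7, None), 16394: (None, 1)}
```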
Actually, that does have an effect, which is odd to me. Changing it to …
Tagging HCAL trigger experts @JHiltbrand and @Michael-Krohn so that they can take a look.
After discussing with the DPG, we'll try looking at simulated and unpacked TPs, to see if that makes sense to us.
Hi @aloeliger, @slaurila, Thanks for the discussion yesterday at the L1 SW meeting. Referring to @aloeliger's last message, the HCAL trigger primitive collection is not "assigned" an explicit instance name [1], whereas the ECAL trigger primitive collection is [2]. So it makes sense that removing the ECAL TP instance name crashes things, although it is then not clear why including a supposedly-non-existent HCAL TP instance name does not crash things, and does something... 🤔

As an aside, in HCAL TRG, when we compare unpacked to re-emulated HCAL TPs, we find quite good agreement. Thus, if there were an issue in the unpacker, it would have to affect the unpacking of TPs and QIE data frames (consumed for TP re-emulation) in such a way that the good agreement is not spoiled, hence I am skeptical of it being an unpacker issue. I would be happy to cross-check anything more related to HCAL, let me know.

[1] https://cmssdt.cern.ch/lxr/source/EventFilter/HcalRawToDigi/plugins/HcalRawToDigi.cc#0061
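The instance-name point can be illustrated with a toy model of framework product lookup: products are keyed by (module label, instance label), so a request with a mismatched instance label simply finds no product at lookup time rather than failing when the configuration is built. The labels mirror the HCAL [1] and ECAL [2] cases above; the dict itself is a deliberate simplification of the real EDM product registry:

```python
# Toy (module label, instance label) product registry. Collection type
# names are illustrative, not the exact CMSSW class names.
products = {
    ("hcalDigis", ""): "HcalTrigPrimDigis",                   # HCAL TPs: no instance name
    ("ecalDigis", "EcalTriggerPrimitives"): "EcalTrigPrimDigis",
}

def get_product(module, instance=""):
    # Wrong label: nothing is found here, with no error at "configure" time.
    return products.get((module, instance))

assert get_product("hcalDigis") is not None
assert get_product("ecalDigis") is None                  # empty instance name: no match
assert get_product("ecalDigis", "EcalTriggerPrimitives") is not None
```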
@JHiltbrand perhaps you and I should discuss more, because a quick re-emulation on … shows differences. I don't doubt you when you say HCAL has good reason to believe this is not broken, but perhaps there is some subtle configuration issue L1T is having, or some irrelevant-TP-discarding rule I may not be aware of, that causes these to look different.
Hi @aloeliger, Thanks for the plot. One thing with re-emulation vs unpacked TP comparisons is the effect of ZS. That is to say, when re-emulating HCAL TPs, the QIE data frames used as inputs nominally have some ZS "baked in": ZS that was applied in the DAQ readout pipeline, but is not applied in the L1T pipeline when forming the "hardware" or "packed" TPs. As a cross-check, you could rerun what you did above, but on an … [1]
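The ZS effect described here can be sketched with a toy model: TPs re-emulated from zero-suppressed QIE frames can only undershoot the hardware TPs formed before ZS. The threshold, SOI, and summing window below are illustrative numbers, not the real HCAL settings:

```python
# Toy model: zero suppression drops low samples from the stored QIE data
# frames, so a TP re-emulated offline can lose energy relative to the
# hardware TP formed online before ZS.
def apply_zs(samples, threshold=3):
    """DAQ-style zero suppression: zero out samples below the threshold."""
    return [s if s >= threshold else 0 for s in samples]

def tp_sum(samples, soi=2, window=2):
    """Toy TP energy: sum the samples in a window starting at the SOI."""
    return sum(samples[soi:soi + window])

qie = [1, 2, 5, 2, 1, 0]                 # QIE data frame (ADC counts)
hardware_tp = tp_sum(qie)                # formed online, before ZS -> 7
reemulated_tp = tp_sum(apply_zs(qie))    # formed offline, after ZS -> 5
assert reemulated_tp <= hardware_tp      # ZS can only lose energy
```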
Weeks ago, the L1-uGT emulator (#42634) and unpacker (#42733) were updated to make use of data from the Zero-Degree Calorimeter (ZDC), for the 2023 HIon run (and beyond). These updates were the ones necessary to correctly take data online with an L1T menu including algorithms that use ZDC data.

Other updates needed to properly test/re-emulate such L1T menus offline are still missing. From what I understand, this includes (at least) the following:

- adding ZDC to the L1T packer used in L1TDigiToRaw;
- adding ZDC to the re-emulation sequence in SimL1EmulatorRepack_Full_cff.

As long as these updates are missing, any re-emulation of the L1T results in standard workflows will return incorrect results for any L1T algorithms using ZDC (again, from what I understand).
FYI: @cms-sw/hlt-l2