
EMTF configures from Global Tag conditions in MC as well as data #29260

Merged 1 commit into cms-sw:master on Mar 23, 2020

Conversation

abrinke1
Contributor

"master" version of #29252
Intended for all future versions of EMTF emulator.

We have unfortunately uncovered another error in the EMTF emulation for 2016, which had been masked by the previous one. [1] As it turns out, the emulator was not accessing the 2016 or 2017 Global Tag conditions for most of the algorithm settings, creating an internally inconsistent mix of 2016 (or 2017) and 2018 logic. Efe measured the efficiency in the latest 2016 RelVal samples (see attached slides); due to this bug it is 10% lower in the negative endcap than in the positive one.
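
To make the failure mode concrete, here is a minimal Python sketch (all names and values are hypothetical, not the actual C++ emulator code): if only part of the configuration is read from the conditions while the remaining algorithm settings fall back to hard-coded defaults, MC ends up running an internally inconsistent mix of eras.

```python
# Minimal sketch of the bug pattern; all names/values are illustrative only.

DEFAULTS_2018 = {"pc_lut_ver": 3, "pt_lut_ver": 7}  # hard-coded "current" defaults

def configure(conditions, is_data):
    """Assemble the emulator settings for one job."""
    params = dict(DEFAULTS_2018)
    params["fw_ver"] = conditions["fw_ver"]  # firmware version is always read
    if is_data:
        # BUG: the remaining settings come from the Global Tag only for data,
        # so 2016/2017 MC silently keeps the 2018 defaults.
        params.update(conditions)
    return params

# 2016 MC: fw_ver says 2016, but the LUT versions stay at their 2018 defaults.
print(configure({"fw_ver": 2016, "pc_lut_ver": 1, "pt_lut_ver": 5}, is_data=False))
# -> {'pc_lut_ver': 3, 'pt_lut_ver': 7, 'fw_ver': 2016}
```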

Thankfully the fix is quite simple, as you can see in this pull request. Efe's plots confirm that the efficiency looks good when re-running the newest RelVals with this patch applied, and is consistent with the efficiency in 2016 data. [2]
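
In the same sketch notation, the gist of the fix is simply to apply the Global Tag conditions unconditionally (again with hypothetical names; the real change is in the C++ of L1Trigger/L1TMuonEndCap, see the diff):

```python
DEFAULTS_2018 = {"pc_lut_ver": 3, "pt_lut_ver": 7}  # as in the sketch above

def configure_fixed(conditions, is_data):
    """Fixed behaviour: is_data no longer gates the conditions."""
    params = dict(DEFAULTS_2018)
    params.update(conditions)  # FIX: always take the settings from the Global Tag
    return params

# 2016 MC now gets a consistent set of 2016 settings.
print(configure_fixed({"fw_ver": 2016, "pc_lut_ver": 1, "pt_lut_ver": 5}, is_data=False))
# -> {'pc_lut_ver': 1, 'pt_lut_ver': 5, 'fw_ver': 2016}
```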

For historical background on the likely cause of this bug: between 2016 and 2017 we completely overhauled the EMTF emulator, in part to handle the new RPC inputs. Unfortunately, both the CondDB code and the EMTF payloads were in disarray at that point (very long story), and accessing the conditions payload was crashing the MC RelVals. So (if my hazy memory serves) we decided to configure by firmware version only for data, and by "fake conditions" for MC, with three different "fake conditions" python config files. Sometime later the per-year "fake conditions" went away, so the MC emulation (again, for some but not all of the algorithm settings) used the default settings instead of the time-dependent firmware-version settings. This worked fine as long as the "default" settings were also the "current" settings (which was true when the 2017 and 2018 MC were produced), but it fails for 2016 re-emulation. Long story short, our workflows had never been fully validated for legacy MC, and now we're finding the bugs - for which we apologize.

[1] #29080
[2] https://twiki.cern.ch/twiki/bin/view/CMSPublic/L1TMuonPerformanceICHEP16
EMTF_UL16MC_validation_19.03.2020.pdf

Due to previous issues with CondDB / O2O payloads, the EMTF emulator configured itself from the firmware version only in data. Many of the settings in MC emulation defaulted to their 2018 versions, which would cause problems for 2016 and 2017 MC processing. This fixes the EMTF emulation so that it properly uses all 2016 or 2017 conditions for MC emulation.
@cmsbuild
Contributor

The code-checks are being triggered in jenkins.

@cmsbuild
Contributor

+code-checks

Logs: https://cmssdt.cern.ch/SDT/code-checks/cms-sw-PR-29260/14302

  • This PR adds an extra 16KB to the repository

@cmsbuild
Contributor

A new Pull Request was created by @abrinke1 for master.

It involves the following packages:

L1Trigger/L1TMuonEndCap

@cmsbuild, @rekovic, @benkrikler can you please review it and eventually sign? Thanks.
@Martin-Grunewald, @thomreis this is something you requested to watch as well.
@davidlange6, @silviodonato, @fabiocos you are the release manager for this.

cms-bot commands are listed here

@rekovic
Contributor

rekovic commented Mar 21, 2020

please test

@cmsbuild
Contributor

cmsbuild commented Mar 21, 2020

The tests are being triggered in jenkins.
https://cmssdt.cern.ch/jenkins/job/ib-run-pr-tests/5299/console Started: 2020/03/21 07:36

@cmsbuild
Contributor

+1
Tested at: bf941e6
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-36ec38/5299/summary.html
CMSSW: CMSSW_11_1_X_2020-03-20-1100
SCRAM_ARCH: slc7_amd64_gcc820

The following merge commits were also included on top of IB + this PR after doing git cms-merge-topic:

You can see more details here:
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-36ec38/5299/git-log-recent-commits
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-36ec38/5299/git-merge-result

@cmsbuild
Contributor

Comparison job queued.

@cmsbuild
Contributor

Comparison is ready
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-36ec38/5299/summary.html

Comparison Summary:

  • No significant changes to the logs found
  • Reco comparison results: 4 differences found in the comparisons
  • DQMHistoTests: Total files compared: 34
  • DQMHistoTests: Total histograms compared: 2692493
  • DQMHistoTests: Total failures: 147
  • DQMHistoTests: Total nulls: 0
  • DQMHistoTests: Total successes: 2692027
  • DQMHistoTests: Total skipped: 319
  • DQMHistoTests: Total Missing objects: 0
  • DQMHistoSizes: Histogram memory added: 0.0 KiB (33 files compared)
  • Checked 147 log files, 16 edm output root files, 34 DQM output files

@silviodonato
Contributor

silviodonato commented Mar 23, 2020

@rekovic do you have any objection about merging #29260 and #29252?

@silviodonato
Contributor

urgent
see #29252

@silviodonato
Contributor

merge
in order to have this PR in CMSSW_11_1_X_2020-03-23-1100 tests

@cmsbuild merged commit 5e3f992 into cms-sw:master on Mar 23, 2020
@silviodonato
Contributor

@abrinke1 are the changes in 250202.181 TTbar_13UP18+PREMIXUP18_PU25+DIGIPRMXLOCALUP18_PU25+RECOPRMXUP18_PU25+HARVESTUP18_PU25 expected?

@rekovic
Contributor

rekovic commented Mar 23, 2020

> @rekovic do you have any objection about merging #29260 and #29252?

The PR is simple and straightforward, so I am OK with it.
Let's wait for the comment of @abrinke1 regarding the changes in 250202.181.
