
Replay to test addition of BeamSpotOnline tags to Prompt GT #4817

Closed

Conversation


@francescobrivio commented May 9, 2023

Replay Request

Requestor
AlCaDB (FYI @mmusich as ORM)

Describe the configuration

  • Release: CMSSW_13_0_5_patch2
  • Run: 367100 (Collisions 2023 - 1200b - 2.5h long - all components IN)
  • GTs:
    • expressGlobalTag: 130X_dataRun3_Express_v2 (unchanged)
    • promptrecoGlobalTag: 130X_dataRun3_Prompt_v3 (NEW!)
  • Additional changes:
    • New Prompt GT which contains BeamSpotOnline tags
    • Run only Prompt reco on the ZB streams (ignore everything else)
      • @germanfgv I have used the specifyStreams feature, but I'm not sure it is used correctly. Can you check? (A sketch of the relevant config lines follows below the list.)
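
For context, the stream selection lives in the Tier0 replay configuration file. Below is a minimal sketch of the relevant ReplayOfflineConfiguration.py lines; the setInjectRuns/setInjectStreams setter names are assumptions based on the dmwm/T0 Tier0Config API and should be verified against the actual file:

# Sketch of a ReplayOfflineConfiguration.py fragment (setter names are
# assumptions; verify against the dmwm/T0 Tier0Config API)
from T0.RunConfig.Tier0Config import createTier0Config, setInjectRuns, setInjectStreams

tier0Config = createTier0Config()

# Global tags under test
expressGlobalTag = "130X_dataRun3_Express_v2"    # unchanged
promptrecoGlobalTag = "130X_dataRun3_Prompt_v3"  # new, includes the BeamSpotOnline tags

# Replay only the run and stream of interest
setInjectRuns(tier0Config, [367100])
setInjectStreams(tier0Config, ["PhysicsZeroBias"])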

Purpose of the test
This Replay is to test the addition of the BeamSpotOnline tags to the Prompt GT, following the discussion in cms-sw/cmssw#41459. Adding these new tags will allow us to:

  • reduce LogErrors and LogWarnings in the Tier0 Prompt logs
  • compare the online and offline beamspot in Prompt DQM (this feature existed in Run 2, but has not been available in Run 3 since the removal of SCAL)

Following recent discussions at the JointOps meeting, we prepared an ad hoc GT for this replay:

  • 130X_dataRun3_Prompt_v3
    • with infinite snapshot
    • diff from current Prompt GT here

T0 Operations cmsTalk thread
https://cms-talk.web.cern.ch/t/replay-for-testing-addition-of-beamspotonline-tags-to-prompt-gt/23848

@francescobrivio
Contributor Author

test syntax please

@francescobrivio
Contributor Author

test syntax please

@germanfgv
Contributor

Once we agree on a configuration, @LinaresToine please manually deploy this replay. You can use vocms0500.

@francescobrivio
Contributor Author

test syntax please

@germanfgv
Contributor

@LinaresToine tried to start the replay, but run 367102 does not include PhysicsZeroBias* streamers:
https://dmytro.web.cern.ch/dmytro/cmsprodmon/tier0_details.php?run=367102

@francescobrivio other suggestion?

@francescobrivio
Contributor Author

@LinaresToine tried to start the replay, but run 367102 does not include PhysicsZeroBias* streamers: https://dmytro.web.cern.ch/dmytro/cmsprodmon/tier0_details.php?run=367102

@francescobrivio other suggestion?

We need to find another run with ZB for which we still have the streamers.
My first suggestion would be to go back to 367100 as I originally requested. It's a long-ish run, but we run only on ZB and only Prompt reco in this replay, so overall it shouldn't take too much time/resources.

Also, I think the suggestion of keeping this run instead of 367100 (see https://cms-talk.web.cern.ch/t/streamers-to-keep-for-2023-1200b-run/23845/3) is wrong. FYI @mmusich

@germanfgv
Contributor

367100 does not have PhysicsZeroBias either

@mmusich
Contributor

mmusich commented May 9, 2023

My first suggestion would be to go back to 367100 as I originally requested. It's a long-ish run, but we run only on ZB and only Prompt reco in this replay, so overall it shouldn't take too much time/resources.

Just to test this, any PD will work really...

@francescobrivio
Contributor Author

367100 does not have PhysicsZeroBias either

OK, the ZB dataset is actually there, let me just figure out the correct stream name :)

@mmusich
Contributor

mmusich commented May 9, 2023

Also, I think the suggestion of keeping this run instead of 367100 ... is wrong

Why?

@francescobrivio
Contributor Author

OK, scratch everything.
Run 367102 is good: the ZB dataset comes from the PhysicsCommissioning stream. So we just need to update the specifyStreams command, and we can run on run 367102 as originally planned.
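
Concretely, the fix amounts to swapping the stream list in the replay configuration, using the same (assumed) setter name as in the sketch above:

setInjectStreams(tier0Config, ["PhysicsCommissioning"])  # the ZB dataset arrives via this stream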

@francescobrivio
Contributor Author

@germanfgv @LinaresToine can you take care of it? I'll be offline for a bit now

@LinaresToine
Contributor

LinaresToine commented May 9, 2023 via email

Update run to 367102 and specifyStreams to PhysicsCommissioning
Fix name error on PhysicsCommissioning
@francescobrivio
Contributor Author

I see from the Grafana monitoring that all jobs finished successfully, and we (together with Marco) checked the output DQM plots: e.g. see the difference between this plot from the replay and the same plot from standard Prompt:

  • the replay plot has more entries (due to BeamSpotOnline having more IOVs)
  • the mean is slightly different (expected, since different beamspot values are compared)
  • Note to self: these DQM plots need some more improvements (to be done in a follow-up PR in CMSSW)

@francescobrivio
Contributor Author

I have also run a check on the LogErrorMonitor skim from this replay, with an adjusted version of the recipe suggested by Slava in cms-sw/cmssw#41456 (comment).
Create a print_warnings.py as:

import ROOT

# Append one line per LogError/LogWarning summary entry to a text file
f = open("zb.ews.pyscan.txt", "a")
en = ROOT.TChain("Events")
n = en.Add("root://cms-xrd-global.cern.ch//store/backfill/1/data/Tier0_REPLAY_2023/ZeroBias/USER/LogErrorMonitor-PromptReco-v9185423/000/367/102/00000/b564e4d9-2faf-4bf8-b125-68ad3de62dc8.root")
for ev in range(en.GetEntries()):
    n = en.GetEntry(ev)
    # edm::ErrorSummaryEntry collection produced by the logErrorHarvester
    mv = en.edmErrorSummaryEntrys_logErrorHarvester__RECO.product()
    e = en.EventAuxiliary
    for mm in mv:
        print(e.run(), e.luminosityBlock(), e.event(),
              mm.severity.getName().data(), mm.count, mm.module, mm.category, file=f)
f.close()

then:

python3 print_warnings.py
grep "BeamSpotFromDB" zb.ews.pyscan.txt

This results in 0 occurrences of warnings/errors from the BeamSpot module.
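
Beyond grepping for a single module, a quick tally of all message categories in the scan output gives a fuller picture. A small helper sketch (it assumes the 7-field line format written by print_warnings.py above):

# Tally message categories from zb.ews.pyscan.txt (assumes the 7-field
# format produced by print_warnings.py: run lumi event severity count module category)
from collections import Counter

counts = Counter()
with open("zb.ews.pyscan.txt") as f:
    for line in f:
        fields = line.split()
        if len(fields) >= 7:
            counts[fields[6]] += 1  # category is the last field

for category, n in counts.most_common():
    print(category, n)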

@LinaresToine
Contributor

The logs of the requested replay may be found in the path

/afs/cern.ch/user/c/cmst0/public/BeamSpotOnline

@francescobrivio
Contributor Author

/afs/cern.ch/user/c/cmst0/public/BeamSpotOnline

Thank you @LinaresToine!
I took a look at a few of the logs, and also here there are 0 occurrences of BeamSpotFromDB in the logs.
@mmusich I would consider this successful and let you handle the deployment in production of the new Prompt GT 130X_dataRun3_Prompt_v3, is that ok?

@mmusich
Contributor

mmusich commented May 10, 2023

I would consider this successful and let you handle the deployment in production of the new Prompt GT 130X_dataRun3_Prompt_v3, is that ok?

well, OK, but this then brings back the question of deployment:

  • do we need another (full-fledged) replay, or can we just update the configuration directly? @germanfgv please advise.

@francescobrivio
Contributor Author

No, because in this replay we are already using the v3 GT, right?

@mmusich
Contributor

mmusich commented May 10, 2023

No, because in this replay we are already using the v3 GT, right?

I posted a question here

@germanfgv
Contributor

Taking into account the minor differences between this tag and v2:
https://cms-conddb.cern.ch/cmsDbBrowser/diff/Prod/gts/130X_dataRun3_Prompt_v3/130X_dataRun3_Prompt_v2

I don't see any reason to perform another replay. I'll prepare the PR for production.

@francescobrivio
Contributor Author

Closing since the new GT was deployed online:
https://cms-talk.web.cern.ch/t/t0-production-prompt-gt-moved-to-130x-datarun3-prompt-v3/23924
