kimsw9284/SUSYBSMAnalysis-HSCP
Heavy Stable Charged Particle

Table of Contents

  1. Setup working area
  2. Run the code
  3. Compute Lumi

Setup working area

export SCRAM_ARCH=slc7_amd64_gcc700
cmsrel CMSSW_10_6_30
cd CMSSW_10_6_30/src/
cmsenv

For the following step you should have an SSH key associated with your GitHub account. For more information, see GitHub's documentation on connecting to GitHub with SSH.

git clone -b master git@github.com:Christinaw97/SUSYBSMAnalysis-HSCP.git SUSYBSMAnalysis

To compile the code, run

cd SUSYBSMAnalysis
scram b -j

Run the code

Steps:

  • Step 0: Produce EDM-tuples from AOD
  • Step 1: Produce bare ROOT-tuples (histograms and trees) from Step 0
  • Step 2: Estimate background using histograms from Step 1
  • Step 3: Make plots
  • Step 4: Compute limits

Step 0

Get the main script first

cp HSCP/test/HSCParticleProducer_cfg.py .

Have a look at HSCParticleProducer_cfg.py to see all available options.

Get a proxy

voms-proxy-init --voms cms -valid 192:00

Run locally

Before running locally:

# Does the file exist? Note that sites of type "TAPE" have no user access
dasgoclient -query="site file=/store/data/Run2017B/MET/AOD/09Aug2019_UL2017_rsb-v1/00000/AA1FC1E6-1E88-204D-B867-4637AEAC4BEA.root"
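Since tape-only sites cannot be read directly, it can help to filter them out of the site list returned by the query. A minimal sketch, using placeholder site names in place of real dasgoclient output:

```shell
# Illustrative site list standing in for real dasgoclient output.
sites='T1_US_FNAL_Tape
T2_FR_IPHC
T2_DE_DESY'
# Keep only disk sites: drop any site whose name marks tape-only storage.
disk_sites=$(printf '%s\n' "$sites" | grep -vi 'tape')
echo "$disk_sites"
```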

How to run

cmsRun HSCParticleProducer_cfg.py LUMITOPROCESS=HSCP/test/Cert_294927-306462_13TeV_UL2017_Collisions17_GoldenJSON.txt inputFiles=root://cms-xrd-global.cern.ch//store/data/Run2017B/MET/AOD/09Aug2019_UL2017_rsb-v1/00000/AA1FC1E6-1E88-204D-B867-4637AEAC4BEA.root

You can also use inputFiles_load=input.txt, where input.txt contains a list of files.
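The file passed via inputFiles_load is plain text with one input file per line. A minimal sketch building such a file, reusing the example file from above:

```shell
# input.txt for inputFiles_load: one file per line.
cat > input.txt <<'EOF'
root://cms-xrd-global.cern.ch//store/data/Run2017B/MET/AOD/09Aug2019_UL2017_rsb-v1/00000/AA1FC1E6-1E88-204D-B867-4637AEAC4BEA.root
EOF
wc -l < input.txt   # number of input files listed
```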

Don't forget to copy needed files:

cp HSCP/data/CorrFact*Pix*.txt .
cp HSCP/data/template*.root .
cp HSCP/data/MuonTimeOffset.txt .

Run on Grid using crab

Setup CRAB environment

source /cvmfs/cms.cern.ch/common/crab-setup.sh  

Get CRAB configuration file to run on Data:

cp HSCP/test/submitToCrab.py .
python submitToCrab.py -h # for help

Important: Replace config.Site.storageSite = 'T2_FR_IPHC' with a site where you have permission to write. You can check write permission by running: crab checkwrite --site=<site-name>.

You can now run on a dataset (or on input files):

python submitToCrab.py --dataset <Dataset> --name <request-name> --sample <isData> --lumiToProcess <JSON file>

The following directory will be created: crab_projects/crab_<request-name>.

  • To get status: crab status -d crab_projects/crab_<request-name>
  • To resubmit (killed and failed jobs): crab resubmit -d crab_projects/crab_<request-name>
  • To retrieve the output: crab getoutput -d crab_projects/crab_<request-name> [--jobids id1,id2]
  • To get report (processed lumi json): crab report -d crab_projects/crab_<request-name>
  • How to get corresponding integrated lumi, see section Compute Lumi.
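The bookkeeping commands above can be scripted over all submitted tasks. A minimal sketch that only prints the crab status command for each task directory (the task name here is illustrative), so nothing is executed by accident:

```shell
# Illustrative task directory, as created by submitToCrab.py.
mkdir -p crab_projects/crab_demo_task
# Print (rather than execute) the status command for every task:
# a safe dry run that can be piped to sh once it looks right.
for d in crab_projects/crab_*; do
  echo "crab status -d $d"
done
```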

Important: Running crab with the --dryrun option provides

  1. the job splitting
  2. estimates of runtime and memory consumption

To show only the splitting results, add the --skip-estimates option. You can then run crab proceed -d crab_projects/crab_<request-name>.

Step 1

Analysis Type:

  • Type 0: Tk only
  • Type 1: Tk+Muon
  • Type 2: Tk+TOF
  • Type 3: TOF only
  • Type 4: Q>1
  • Type 5: Q<1

EDAnalyzer on top of EDM files (created during the previous step)

Main script: cp Analyzer/test/HSCParticleAnalyzer_cfg.py .

cmsRun HSCParticleAnalyzer_cfg.py inputFiles=file:HSCP.root maxEvents=100

Or, if there are many files:

ls HSCP*.root | sed 's/^/file:/' > list.txt
cmsRun HSCParticleAnalyzer_cfg.py inputFiles_load=list.txt

Production of EDM files (on the fly) and running the EDAnalyzer

Main scripts:

cp Analyzer/test/HSCParticleProducerAnalyzer_cfg.py .
cp Analyzer/test/crabConfigProdAnalyzer_Data.py .
# just like previously...
cmsRun HSCParticleProducerAnalyzer_cfg.py LUMITOPROCESS=HSCP/test/Cert_294927-306462_13TeV_UL2017_Collisions17_GoldenJSON.txt inputFiles=root://cms-xrd-global.cern.ch//store/data/Run2017B/MET/AOD/09Aug2019_UL2017_rsb-v1/00000/AA1FC1E6-1E88-204D-B867-4637AEAC4BEA.root

To use crab:

python crabConfigProdAnalyzer_Data.py

Checking the EDAnalyzer (comparison with the old workflow)

Copy the script:

cp Analyzer/test/compareRootFiles.py .

For more details, run python compareRootFiles.py -h. This script takes two ROOT files (set inside the script) and compares their histograms with a Kolmogorov test. Any differences are saved in:

Analyzer/test/differences.txt

Step 2

Background prediction

List your ROOT files in a single text file, e.g. input.txt.

Run locally

BackgroundPrediction -h # for help
BackgroundPrediction -f input.txt

Run on HTCondor

Compute Lumi

From a new terminal

  1. Setup the CMS environment
  2. Install brilws (in your home directory) as follows
    export PATH=$HOME/.local/bin:/cvmfs/cms-bril.cern.ch/brilconda/bin:$PATH
    pip install --user brilws
    Some quick checks:
    brilcalc --version # to check the installation
    brilcalc lumi --help # for help
  3. Compute lumi
    brilcalc lumi -c web -i Cert_294927-306462_13TeV_UL2017_Collisions17_GoldenJSON.txt --output-style csv -u /pb > result.txt
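To get the total integrated luminosity from the CSV output, the recorded column can be summed. A minimal sketch, assuming the recorded(/pb) value is the sixth comma-separated field (check the header line of your own result.txt; the sample rows below are made up):

```shell
# Illustrative result.txt in brilcalc's CSV output style.
cat > result.txt <<'EOF'
#run:fill,time,nls,ncms,delivered(/pb),recorded(/pb)
297050:5838,06/16/17 10:00:00,100,100,12.5,12.0
297056:5839,06/16/17 12:00:00,80,80,8.5,8.0
EOF
# Skip comment lines and sum the assumed recorded(/pb) column.
awk -F, '!/^#/ {sum += $6} END {printf "%.1f\n", sum}' result.txt
```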

About

HSCP analysis code
