README

Introduction

This is the primary analysis code for the full Run-2 **Mono-Higgs (bb)** (Analysis TWiki) search for dark matter.

The aim is to implement CP recommendations and perform analysis selections to create histograms/trees from which final results are synthesized. This code is based on XAMPPbase (XAMPP TWiki).

If this is your first time working with the XAMPP framework, have a look into the tutorial.

Setup

Note: it is recommended to use the convenience wrapper scripts. They extract the installation/setup/update instructions from this README file and save you some error-prone typing or copying of commands.

Installing XAMPPmonoH (initial setup)

Start by setting up git and your kerberos credentials if on lxplus (NOTE : <USERNAME> is your personal username):

setupATLAS
lsetup git
kinit <USERNAME>@CERN.CH

Next, check out the code recursively to include the submodules (NOTE: This uses the ssh authentication URL, but you can use https, ssh, or krb5, as you prefer):

# prepare shell by sourcing ATLAS local setup, set up git, create directory structure
source /cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase/user/atlasLocalSetup.sh                           #!
lsetup git
mkdir analysis
cd analysis
# recursively clone analysis code with all submodules
git clone ssh://git@gitlab.cern.ch:7999/atlas-mpp-xampp/XAMPPmonoH.git --recursive source -b master
cd source
# set up ATLAS software (analysis release is specified in this line)
asetup AthAnalysis,$(grep FROM Dockerfile | cut -d : -f 2),here                                       #!
cd .. && mkdir -p build run                                                                           #!
# compile project
cd build                                                                                              #!
cmake ../source && make                                                                               #!
cd ../source                                                                                          #!
# export environment variables of project
source ../build/x86*/setup.sh                                                                         #!

The source folder then acts as your XAMPPmonoH project (this is what the --recursive source arguments of the clone command define) and contains the .git directory of XAMPPmonoH.

In this way you keep the recommended run/source/build directory structure (see https://twiki.cern.ch/twiki/bin/view/AtlasComputing/SoftwareTutorialxAODAnalysisInCMake). Note that, in contrast to the example shown in the link, the TestArea will be created in the top-level directory analysis and not in the build directory analysis/build.
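The `$(grep FROM Dockerfile | cut -d : -f 2)` part of the asetup line above simply reads the analysis release tag from the FROM line of the Dockerfile. A standalone sketch of the mechanism (the Dockerfile contents and release number below are made-up placeholders, not the actual project values):

```shell
# demonstrate the release-tag extraction on a hypothetical Dockerfile
tmp=$(mktemp -d) && cd "$tmp"
printf 'FROM atlas/athanalysis:21.2.51\n' > Dockerfile  # hypothetical image:tag
# this is the expression embedded in the asetup command:
release=$(grep FROM Dockerfile | cut -d : -f 2)
echo "$release"   # prints 21.2.51
```

Because the release is read from the Dockerfile, the interactive setup and the CI Docker build stay in sync automatically.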

Recommended install script: you can also use the convenience install script. Follow these instructions instead:

# prepare shell by sourcing ATLAS local setup, set up git, create directory structure
source /cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase/user/atlasLocalSetup.sh
lsetup git
mkdir analysis
cd analysis
# recursively clone analysis code with all submodules
git clone ssh://git@gitlab.cern.ch:7999/atlas-mpp-xampp/XAMPPmonoH.git --recursive source -b master
cd source
# execute install script
source XAMPPmonoH/scripts/install.sh

Setting up XAMPPmonoH (after the initial setup)

After the initial setup, you can prepare your session by following these instructions:

# prepare shell by sourcing ATLAS local setup
cd analysis/source
# set up ATLAS software
source /cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase/user/atlasLocalSetup.sh                           #=
asetup AthAnalysis,$(grep FROM Dockerfile | cut -d : -f 2),here                                       #=
# export environment variables of project
source ../build/x86*/setup.sh                                                                         #=

Recommended setup script: you can also use the convenience setup script. Follow these instructions instead:

cd analysis/source
source XAMPPmonoH/scripts/setup.sh

Updating the XAMPPmonoH code (after the initial setup)

Updating the code after the initial setup can be done following these instructions:

cd analysis/source
cd ..                                                                                                #&
# prepare shell by sourcing ATLAS local setup, set up git, set up ATLAS software
source /cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase/user/atlasLocalSetup.sh                          #&
lsetup git                                                                                           #&
cd source                                                                                            #&
asetup AthAnalysis,$(grep FROM Dockerfile | cut -d : -f 2),here                                      #&
# update git code and submodules
git pull && git submodule init && git submodule update --recursive                                   #&
# compile project
cd ../build                                                                                          #&
cmake ../source && make                                                                              #&
cd ../source                                                                                         #&
# export environment variables of project
source ../build/x86*/setup.sh                                                                        #&

Recommended update script: you can also use the convenience update script. Follow these instructions instead:

cd analysis/source
source XAMPPmonoH/scripts/update.sh

Submodules

This package uses submodules to check out the direct contents of this package (XAMPPmonoH) together with its package dependencies.

When you check out this single project recursively, you are checking out the entire set of code (aside from the athena release) necessary for the project. In particular, you are checking out a stable version of each submodule project, specified by the revision hash.
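As a minimal local illustration of this pinning mechanism (all repository names and paths below are made up, not the actual XAMPP packages), a superproject records the exact commit of each submodule:

```shell
# Toy demonstration of submodule pinning (all names are hypothetical).
set -e
tmp=$(mktemp -d)
# create a standalone "dependency" repository with one commit
git init -q "$tmp/dep"
git -C "$tmp/dep" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "dep: first commit"
# create a "superproject" and pin the dependency as a submodule
git init -q "$tmp/top"
cd "$tmp/top"
git -c protocol.file.allow=always submodule add -q "$tmp/dep" dep
git -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "pin dep at its current revision hash"
# the superproject now records the exact revision hash of dep,
# analogous to XAMPPbase@8a9e45a1 in XAMPPmonoH
status=$(git submodule status)
echo "$status"
```

The `protocol.file.allow=always` override is only needed for this local file-based demo; cloning over ssh/https as in the instructions above does not require it.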

Using a different submodule version: if you wish to use a different submodule version in the context of this analysis, follow these steps:

  • Set up the code as described in [Installing XAMPPmonoH](https://gitlab.cern.ch/atlas-mpp-xampp/XAMPPmonoH/blob/master/README.md#installing-xamppmonoh-initial-setup)
  • Remove the subproject that you wish to develop (e.g. AnalysisTools)
    • `rm -rf AnalysisTools`
  • Check out the desired version (branch or revision) of that project, noting that you must clone it recursively to ensure you obtain all of its submodules
    • `git clone --recursive https://:@gitlab.cern.ch:8443/MonoHbb/2017/AnalysisTools.git`
    • `cd AnalysisTools`
    • `git checkout <branch or revision>`

The next time you perform a `cmake ../source && make` compilation routine, this version of the code will be included, and you can develop on it, push to it, and submit merge requests as desired. If you want to get back the version associated with a certain hash of XAMPPmonoH:

  • `git submodule update --init`

Running the Code

Interactive Jobs

An interactive job can be run using the runAthena.py script (https://gitlab.cern.ch/atlas-mpp-xampp/XAMPPmonoH/blob/master/XAMPPmonoH/python/runAthena.py):

cd run/
python ../source/XAMPPmonoH/python/runAthena.py --jobOption XAMPPmonoH/runMonoH.py --testJob --filesInput <MyInfile.root>

Where:

  • --jobOption XAMPPmonoH/runMonoH.py is the jobOptions file to use, in this case for the zero-lepton analysis.
  • --filesInput <MyInfile.root> is the input file, e.g.: root://eoshome.cern.ch//eos/user/x/xmonoh/ci/mc16_13TeV.410470.PhPy8EG_A14_ttbar_hdamp258p75_nonallhad.deriv.DAOD_EXOT24.e6337_e5984_s3126_r9364_r9315_p3563/DAOD_EXOT24.14183669._000678.pool.root.1

Grid Jobs

A single grid job (or multiple jobs) can be submitted using the XAMPPbase/python/SubmitToGrid.py (XAMPPmonoH/python/SubmitToGrid.py) script. Before using this script, it is necessary to set up the proper grid authentication tools:

localSetupRucioClients
localSetupPandaClient
voms-proxy-init -voms atlas

After this, you can use the script as (before: cd source):

python XAMPPbase/XAMPPbase/python/SubmitToGrid.py --jobOptions XAMPPmonoH/runMonoH.py -i <inputfile or list> --outDS <some name or version of output DS>

Or (this requires a list of samples, see for example this link):

python ${TestArea}/XAMPPmonoH/python/SubmitToGrid.py --jobOptions XAMPPmonoH/runMonoH.py --list <listToGrid.txt> --production <versionString>

For more information about run arguments, do: python ${TestArea}/XAMPPbase/XAMPPbase/python/SubmitToGrid.py --help

Before launching a new production

Please have a look at the checklist!

There are also some tips and good practices for monitoring a grid production.

Bookkeeping

List of Samples

A full list of samples (data and MC) can be found here.

The derivation EXOT27 (https://gitlab.cern.ch/atlas/athena/blob/21.2/PhysicsAnalysis/DerivationFramework/DerivationFrameworkExotics/share/EXOT27.py) is used.

Creating new sample lists

To facilitate the error-prone task of creating new sample lists, a script can be used which checks the status of samples according to rucio and AMI and composes a sample list using a specified tag for the derivations.

To execute the script, first set up your TestArea using asetup, then set up panda, AMI and rucio while requesting a grid proxy:

localSetupRucioClients
localSetupPandaClient
voms-proxy-init -voms atlas

Then execute the script using

python ${TestArea}/XAMPPmonoH/scripts/CreateSampleList.py

The necessary configuration settings are maintained in two files:

  • Sample list: XAMPPmonoH/data/samplelist.txt. This file documents all DSIDs of background processes considered in our search, together with a human-readable name. In addition, options can be written on the same line as a DSID, e.g. to write the sample to the list commented out, or to write it only to the lists for certain derivations.
  • Configuration file: XAMPPmonoH/data/samplelist_config.json. This file configures the required derivations, r-tags and p-tags, as well as other settings.
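For illustration only, a configuration of this kind could look roughly as follows. All keys and values here are hypothetical assumptions based on the description above (the tags are taken from examples elsewhere in this README), not the actual contents of samplelist_config.json:

```json
{
  "derivations": ["EXOT24", "EXOT27"],
  "rtags": ["r9364"],
  "ptags": ["p3563"]
}
```

Consult the file itself in XAMPPmonoH/data/ for the real schema and values.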

Making Results

Produce a CutFlow

First, make sure you set SetupAlgorithm().RunCutFlow = True in your job options file.

To get the numbers printed on your screen:

cd run/
python ${TestArea}/XAMPPbase/XAMPPbase/python/printCutFlow.py -i <INPUTFILE generated above, e.g. MyOutFile.root> -a <ANALYSIS region, e.g. 0L_SR_Resolved or 0L_SR_Merged>

Tree Slimming

In order to reduce the output size of the trees, the following cuts are applied during the event selection (see here):

  • MET trigger fired (0 lepton) or MET OR Single Muon Trigger fired (1 lepton)
  • MET/METnoMu/pt(ll) (0-/1-/2-lepton channels) > 150 GeV (resolved) or 500 GeV (merged)
  • Lepton requirement (veto for 0 lepton, single muon for 1 lepton)
  • 2 central small-R jets and 1 or more b-tags || 1 large-R jet
  • tau and extended tau vetoes
  • mjj/mJ > 40 GeV

The tree slimming can be activated by setting SetupMonoHAnaConfig().doProduction = True (and deactivated by setting it to False).

Development

Standard Development

If you would like to include your work in the codebase, then follow this workflow:

  • Make a branch of XAMPPmonoH in the web browser : XAMPPmonoH/branches
  • Clone the repository as you normally would
    • git clone --recursive ...
  • Go into the repo and checkout the branch you will develop on
    • git checkout [YOUR_BRANCH]
  • Develop as you normally would.
    • The submodules inside the project will not be affected.
    • You are compiling from the build/ directory so this will not affect the code you are developing in the top level XAMPPmonoH directory
  • When you are finished developing on your branch commit and push the changes
    • git add FILES_TO_ADD
    • git commit FILES_TO_ADD -m "Descriptive commit message"
    • git push
  • Go back to the branch browser (XAMPPmonoH/branches) and after verifying that your branch indeed does contain the changes you have made (look at the commit message), submit a merge request with the “Merge Request” button
    • At this point, the Continuous Integration job will be triggered by the .gitlab-ci.yml file
    • You can watch your CI job by following the link on the MR page
  • If the CI job finishes successfully, your MR can be accepted by one of the people responsible for the project. Feel free to notify them on Mattermost/MonoHbb if they are not responsive

If your development takes more than a few minutes, it is likely that the branch from which you created your branch (the master branch) will have had changes made on top of it. It is necessary to incorporate these changes into your code before you submit your merge request, or there will be conflicts. As such, during your development, if you know of such changes (perhaps by querying the commit log), perform a pull to fold them in:

  • git pull origin master

Dependence Iteration

The XAMPP package depends on code from two main sources:

  • Athena Version : The athena release is the main codebase of ATLAS and the version currently being used can be found in the top level CMakeLists.txt file
  • External Packages : These are included as submodules with a name like XAMPPbase@8a9e45a1; XAMPPbase itself is the base package of the XAMPP framework.

From time to time, these two packages need to be iterated and updated to a newer version and this requires some care.

Athena Version

The version of athena referenced in the XAMPPmonoH framework must be coherent with the version used by all other packages, most notably the XAMPPbase code. Therefore, this change should only occur when necessary, and should be done coherently with the base version of the code, as it requires modifying multiple packages to preserve CI functionality. When this happens, the following two places must be changed for **each and every package on which XAMPPmonoH depends which has CI integrated**:

  • CMakeLists.txt : It is necessary to change the **top level CMakeLists.txt** file which is what the CI job will reference when performing the build
  • .gitlab-ci.yml : It is necessary to change the Docker image with which the CI job is run, as this image is what provides the full athena working environment (as opposed to CVMFS)

External Dependencies

From time to time, new functionality will be added to the code external to athena on which XAMPPmonoH depends. This results in a new revision hash for that code (e.g. 8a9e45a162a65cf4b49135dc9010c2bebda32f50 for a particular commit of XAMPPbase). It is precisely these revision hashes which are specified in the submodule names (XAMPPbase@8a9e45a1), so to include the new functionality we want to “bump up” the versions of these packages. The procedure to do this is simple but not obvious; it is well documented here (look for “Updating Submodules”).

  • Start by cloning the repository (*BUT not recursively this time*)
    • git clone ssh://git@gitlab.cern.ch:7999/atlas-mpp-xampp/XAMPPmonoH.git source
  • Go into the repository and init and update the submodules
    • cd source
    • git submodule init
    • git submodule update
  • Go into the submodule of interest (e.g. AnalysisTools) and checkout the desired revision and pull the changes
    • cd AnalysisTools
    • git checkout ca9f929a9d341173a2e2409eb88c400d1aa3b304
  • Go back to the top level directory and then add, commit, and push the changes
    • cd ..
    • git add AnalysisTools (or whichever submodule you bumped up)
    • git commit -m "Descriptive commit message"
    • git push

And unless you have high-level permissions, all of this will have to be done on a branch, as in standard development practice.

When everything has worked, go to your working area and update the submodules to get the latest changes (hashes):

  • cd source/
  • git pull
  • git submodule update --recursive

Alternatively, you can update all your submodules to the current master by doing:

  • git submodule foreach git pull origin master

But this is not recommended; you should always use a stable, pinned version!

Continuous Integration

This framework uses Continuous Integration to ensure that new commits with modifications to the analysis code don’t break its functionality.

For every commit a pipeline (image below) is triggered.

ci screenshot

The pipeline consists of different stages. For each stage, a set of jobs is defined in the .gitlab-ci.yml file (https://gitlab.cern.ch/atlas-mpp-xampp/XAMPPmonoH/blob/master/.gitlab-ci.yml).

Currently the following checks are done within the pipeline:

  1. Is the code nicely formatted? If not, the current pipeline will fail and the automatic formatting tool will be used to enforce the style guide; a new pipeline will then be triggered by the service account.
  2. Does the code compile to a Docker image?
  3. (only master branch) Build Doxygen documentation of the C++ code.
  4. Does XAMPPmonoH run on data ntuples with 0, 1 and 2 lepton selections?
  5. Does XAMPPplotting run on the output XAMPP ntuples of the previous step and produce cutflows?
  6. Do the cutflows produced in the previous step match the reference cutflows?
  7. (only master branch) Build html documentation and upload it to http://xamppmonoh-doxygen.web.cern.ch/
  8. (only master branch) Rename docker image for master branch from master to latest
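As an illustration of how stages and jobs relate in such a configuration (the stage and job names below are hypothetical, not the project's actual .gitlab-ci.yml), a minimal sketch could look like:

```yaml
# hypothetical sketch, not the real XAMPPmonoH CI configuration
stages:
  - build
  - run

compile:
  stage: build
  script:
    - cmake ../source && make

run_0lepton:
  stage: run
  script:
    - python XAMPPmonoH/python/runAthena.py --jobOption XAMPPmonoH/runMonoH.py --testJob

doxygen:
  stage: build
  only:
    - master          # steps 3, 7 and 8 above run only on the master branch
  script:
    - doxygen
```

Jobs in the same stage run in parallel; a stage starts only once all jobs of the previous stage have succeeded.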

Docker images and RECAST

This project has a Docker registry: https://gitlab.cern.ch/atlas-mpp-xampp/XAMPPmonoH/container_registry

It is assumed that you have already installed Docker and cvmfs on your machine.

How to connect cvmfs modules on your laptop:

echo "Connecting CVMFS on /cvmfs/..."
sudo service autofs stop
sudo mkdir -p /cvmfs/atlas.cern.ch
sudo mkdir -p /cvmfs/atlas-condb.cern.ch
sudo mkdir -p /cvmfs/atlas-nightlies.cern.ch
sudo mkdir -p /cvmfs/sft.cern.ch
sudo mount -t cvmfs atlas.cern.ch /cvmfs/atlas.cern.ch
sudo mount -t cvmfs atlas-condb.cern.ch /cvmfs/atlas-condb.cern.ch
sudo mount -t cvmfs atlas-nightlies.cern.ch /cvmfs/atlas-nightlies.cern.ch
sudo mount -t cvmfs sft.cern.ch /cvmfs/sft.cern.ch
echo "Connections opened."

Make sure you have set up cvmfs on your local machine at this point. If this is not the case, please remove -v /cvmfs:/cvmfs from the following commands:

  • docker pull gitlab-registry.cern.ch/atlas-mpp-xampp/xamppmonoh:latest
  • docker run --rm -it -v $PWD:$PWD -v /cvmfs:/cvmfs -w /xampp/XAMPPmonoH gitlab-registry.cern.ch/atlas-mpp-xampp/xamppmonoh:latest (get the Docker file)
  • source /home/atlas/release_setup.sh
  • source XAMPPmonoH/scripts/prepare_framework.sh

RECAST

The XAMPPmonoH analysis code is RECAST-ready. Click here to learn more about RECAST.

The scripts for running a signal through RECAST are located in XAMPPmonoH/recast/.

In order to process a signal with RECAST:

bash XAMPPmonoH/recast/recast_run.sh <path to DxAOD root file of signal> <DSID of signal>

The signal will be processed with a dummy cross-section of 1 fb, which needs to be scaled to the correct cross-section in a subsequent step.
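That subsequent rescaling is just a multiplicative weight. A minimal sketch (the true cross-section value below is a made-up placeholder, not a real signal cross-section):

```shell
# Hypothetical rescaling sketch: events were normalised to sigma_dummy = 1 fb,
# so every event weight (or the final yield) must be multiplied by
# sigma_true / sigma_dummy.
sigma_true_fb=0.35   # placeholder value, not a real signal cross-section
sigma_dummy_fb=1
scale=$(awk -v t="$sigma_true_fb" -v d="$sigma_dummy_fb" 'BEGIN { printf "%g", t / d }')
echo "$scale"   # prints 0.35
```

The same factor applies whether you scale per-event weights in the ntuples or the final yields in the histograms.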