Workflow 11986, Stage 1
| Field | Value |
|---|---|
| Workflow | 11986 |
| Campaign | 384 |
| Priority | 50 |
| Processors | 1 |
| Wall seconds | 14400 |
| Image | /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest |
| RSS bytes | 4194304000 (4000 MiB) |
| Max distance for inputs | 0.0 |
| Enabled input RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled output RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled sites | BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin |
| Scope | usertests |
| Events for this stage | |
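The resource requests above are round numbers once converted; a quick bash check (assuming MiB = 1024*1024 bytes):

echo $(( 4000 * 1024 * 1024 ))   # 4194304000 bytes, the RSS request above
echo $(( 14400 / 3600 ))         # 4 hours of requested wall time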
Output patterns
| | Destination | Pattern | Lifetime (s) | For next stage | RSE expression |
|---|---|---|---|---|---|
| 1 | Rucio usertests:fnal-w11986s1p1 | *_gen.root | 86400 | True | |
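As a sketch of how this pattern is applied: justIN scans the job's workspace for files whose names match the pattern and uploads them to the Rucio dataset named in Destination. The values below are example values for illustration, not taken from the workflow:

genType="mpvmpr"                       # derived from mpvmpr_gen_1x2x6.fcl in the jobscript
did="monte-carlo-011986-000028"        # hypothetical counter DID for illustration
OUTFILE="${genType}_${did}_gen.root"
case "$OUTFILE" in
  *_gen.root) echo "$OUTFILE matches *_gen.root and would be uploaded to usertests:fnal-w11986s1p1" ;;
  *)          echo "$OUTFILE matches no output pattern and would be ignored" ;;
esac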
Environment variables
| Name | Value |
|---|---|
| JOB_FHICL_FILE | mpvmpr_gen_1x2x6.fcl |
| NEVENTS | 20 |
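A minimal defensive sketch (not part of the jobscript below) for consuming these variables, failing fast if the workflow did not set them:

: "${JOB_FHICL_FILE:?JOB_FHICL_FILE must be provided by the workflow}"
: "${NEVENTS:=20}"     # fall back to the value configured above if unset
echo "Will generate ${NEVENTS} events with ${JOB_FHICL_FILE}"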
File states
| Total files | Finding | Unallocated | Allocated | Outputting | Processed | Not found | Failed |
|---|---|---|---|---|---|---|---|
| 50 | 0 | 0 | 0 | 0 | 50 | 0 | 0 |
Job states
| Total | Submitted | Started | Processing | Outputting | Finished | Notused | Aborted | Stalled | Jobscript error | Outputting failed | None processed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 75 | 0 | 0 | 0 | 0 | 75 | 0 | 0 | 0 | 0 | 0 | 0 |
RSEs used
| Name | Inputs | Outputs |
|---|---|---|
| MONTECARLO | 50 | 0 |
| NIKHEF | 0 | 19 |
| DUNE_ES_PIC | 0 | 14 |
| DUNE_FR_CCIN2P3_DISK | 0 | 10 |
| QMUL | 0 | 3 |
| DUNE_UK_MANCHESTER_CEPH | 0 | 2 |
| DUNE_US_BNL_SDCC | 0 | 1 |
| RAL_ECHO | 0 | 1 |
Stats of processed input files and of uploaded output files are available for download as CSV or JSON (up to 10000 files included).
Jobscript
#!/bin/bash
#set -euo pipefail
echo "=== GEN job starting ==="
echo "Host: $(hostname)"
echo "PWD(cmd): $(/bin/pwd -P || pwd || true)"
echo "PWD(var): ${PWD:-unset}"
echo "JUSTIN_JOB_INDEX=${JUSTIN_JOB_INDEX:-unset}"
echo "JUSTIN_PATH=${JUSTIN_PATH:-unset}"
# ----------------------------------------------------------------------
# Get the allocated Monte Carlo "counter file" and remember it
# ----------------------------------------------------------------------
if [[ -z "${JUSTIN_PATH:-}" ]]; then
echo "ERROR: JUSTIN_PATH is not set (job not running under justIN wrapper?)"
exit 2
fi
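# justin-get-file prints one line for the allocated input, "<did> <pfn> <rse>",
# or prints nothing at all when the workflow has no more inputs to hand out.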
alloc="$("$JUSTIN_PATH/justin-get-file")"
if [[ -z "$alloc" ]]; then
echo "No more inputs allocated (workflow complete). Exiting cleanly."
exit 0
fi
did="$(echo "$alloc" | awk '{print $1}')"
pfn="$(echo "$alloc" | awk '{print $2}')"
rse="$(echo "$alloc" | awk '{print $3}')"
echo "Allocated input: did=$did pfn=$pfn rse=$rse"
# Report the environment variables supplied by the workflow definition
echo "# events: ${NEVENTS}"
echo "fhicl file: ${JOB_FHICL_FILE}"
# Work in the sandbox
cd "$PWD"
# ----------------------------------------------------------------------
# Grab the tarball that has our fhicl files we'll need
# ----------------------------------------------------------------------
FCL_TGZ_URL="https://raw.githubusercontent.com/SFBayLaser/dune-justin/main/bundles/fhicl_bundle.tgz"
curl -fL -o fhicl_bundle.tgz "$FCL_TGZ_URL" || { echo "ERROR: failed to download $FCL_TGZ_URL"; exit 1; }
tar xzf fhicl_bundle.tgz || { echo "ERROR: failed to unpack fhicl_bundle.tgz"; exit 1; }
# ----------------------------------------------------------------------
# Environment setup
# ----------------------------------------------------------------------
echo "Sourcing DUNE setup..."
# setup_dune.sh can reference unset variables; that is why 'set -u' is left disabled
# (see the commented-out 'set -euo pipefail' at the top of this script)
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
setup dunesw v10_16_00d00 -q e26:prof
ups active
lar --version
# ----------------------------------------------------------------------
# Point fhicl to our local files
# ----------------------------------------------------------------------
export FHICL_FILE_PATH="$PWD/fhicl:${FHICL_FILE_PATH:-.}"
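# Optional sanity check (assumes the bundle unpacks the fcl files directly into ./fhicl):
# warn early if the requested fcl is not where we expect it.
test -f "fhicl/${JOB_FHICL_FILE}" || echo "WARNING: ${JOB_FHICL_FILE} not found under ./fhicl"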
# ----------------------------------------------------------------------
# Create the starting run/subrun/event number
# ----------------------------------------------------------------------
did="$(printf '%s' "$did" | tr -d '\r' | xargs)"
jobnum_str="${did##*-}" # everything after the last '-': 000028
jobnum=$((10#$jobnum_str))   # force base-10 so leading zeros are not read as octal
RUN_BASE=200000
RUN=$((RUN_BASE + jobnum))
SUBRUN=0
EVENT=1
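# Example with the numbers above: counter suffix 000028 gives jobnum=28,
# so RUN=200028, SUBRUN=0, EVENT=1 -- each job starts from a unique run number.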
# ----------------------------------------------------------------------
# Run GEN
# ----------------------------------------------------------------------
FCL=${JOB_FHICL_FILE}
# Recover the generator type from the first token of the fcl name
# (e.g. "mpvmpr" from mpvmpr_gen_1x2x6.fcl)
genType=$(echo "$FCL" | awk -F"_" '{print $1}')
echo "GEN stage type: $genType"
OUTFILE="${genType}_${did}_gen.root"
echo "Running lar GEN:"
echo " FCL: ${FCL}"
echo " Events: ${NEVENTS}"
echo " Output: ${OUTFILE}"
lar -c "${FCL}" -n "${NEVENTS}" -e "${RUN}:${SUBRUN}:${EVENT}" -o "${OUTFILE}"
larExit=$?
if [ "$larExit" -ne 0 ]; then echo "ERROR: lar exited with status $larExit"; exit "$larExit"; fi
# ----------------------------------------------------------------------
# Sanity checks
# ----------------------------------------------------------------------
test -f "${OUTFILE}" || { echo "ERROR: ${OUTFILE} not produced"; exit 1; }
ls -lh "${OUTFILE}"
# ----------------------------------------------------------------------
# Tell justIN we successfully processed the allocated input
# ----------------------------------------------------------------------
echo "$did" >> justin-processed-dids.txt
# (Alternative: echo "$pfn" >> justin-processed-pfns.txt)
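# justIN reads justin-processed-dids.txt when the jobscript exits and records the
# listed DIDs as processed; an input never listed here is not counted as done.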
echo "Input fcl file: ${JOB_FHICL_FILE}"
echo "genType: ${genType}"
echo "Wrote processed DID to justin-processed-dids.txt: $did"
echo "=== GEN job completed successfully ==="