Workflow 11839, Stage 2
| Parameter | Value |
|---|---|
| Workflow | 11839 |
| Campaign | 262 |
| Priority | 50 |
| Processors | 1 |
| Wall seconds | 28800 |
| Image | /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest |
| RSS bytes | 8388608000 (8000 MiB) |
| Max distance for inputs | 0.0 |
| Enabled input RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled output RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled sites | BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin |
| Scope | usertests |
| Events for this stage | |
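The RSS figure in the table above can be cross-checked with shell arithmetic: 8388608000 bytes divided by one MiB (1048576 bytes) gives exactly the 8000 MiB shown.

```shell
# Cross-check the RSS request from the workflow table:
# 8388608000 bytes / 1048576 bytes-per-MiB = 8000 MiB
echo $(( 8388608000 / 1048576 ))
```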
Output patterns
| | Destination | Pattern | Lifetime (seconds) | For next stage | RSE expression |
|---|---|---|---|---|---|
| 1 | Rucio usertests:fnal-w11839s2p1 | g4_*.root | 86400 | True | |
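The `g4_*.root` entry is a shell-style glob, so any file the jobscript names `g4_<something>.root` is picked up for upload. A minimal sketch, using a hypothetical filename to illustrate the match:

```shell
# Check a hypothetical output name against the stage's declared pattern g4_*.root
OUTFILE="g4_gen_00042.root"   # hypothetical; real names depend on the input file
case "${OUTFILE}" in
    g4_*.root) echo "matches output pattern" ;;
    *)         echo "does not match" ;;
esac
```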
File states
| Total files | Finding | Unallocated | Allocated | Outputting | Processed | Not found | Failed |
|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Job states
| Total | Submitted | Started | Processing | Outputting | Finished | Notused | Aborted | Stalled | Jobscript error | Outputting failed | None processed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Stats of processed input files and of uploaded output files are available for download as CSV or JSON (up to 10000 files included).
Jobscript
#!/bin/bash
#set -euo pipefail
echo "=== G4 job starting ==="
echo "Host: $(hostname)"
echo "PWD: $(pwd)"
# ----------------------------------------------------------------------
# Sanity: justIN-provided input
# ----------------------------------------------------------------------
if [[ -z "${JUSTIN_INPUT_FILE:-}" ]]; then
    echo "ERROR: JUSTIN_INPUT_FILE is not set"
    exit 1
fi
INPUT_GEN="${JUSTIN_INPUT_FILE}"
echo "GEN input file: ${INPUT_GEN}"
# ----------------------------------------------------------------------
# Environment setup
# ----------------------------------------------------------------------
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
setup dunesw v10_16_00d00 -q e26:prof
ups active
lar --version
# ----------------------------------------------------------------------
# Run G4
# ----------------------------------------------------------------------
FCL=standard_g4_dune10kt_1x2x6.fcl
# Output filename derived from input (important!)
BASE=$(basename "${INPUT_GEN}" .root)
OUTFILE="g4_${BASE}.root"
echo "Running lar G4:"
echo " FCL: ${FCL}"
echo " Input: ${INPUT_GEN}"
echo " Output: ${OUTFILE}"
lar \
    -c "${FCL}" \
    -s "${INPUT_GEN}" \
    -o "${OUTFILE}"
# With set -e left disabled above, a lar failure would otherwise go unnoticed
# until the output-file check; catch it explicitly here.
LAR_RC=$?
if [[ ${LAR_RC} -ne 0 ]]; then
    echo "ERROR: lar exited with status ${LAR_RC}"
    exit "${LAR_RC}"
fi
# ----------------------------------------------------------------------
# Sanity checks
# ----------------------------------------------------------------------
if [[ ! -f "${OUTFILE}" ]]; then
    echo "ERROR: Output file ${OUTFILE} not produced!"
    exit 1
fi
ls -lh "${OUTFILE}"
echo "=== G4 job completed successfully ==="
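The output-name derivation in the jobscript can be exercised in isolation. The path below is hypothetical (real inputs are delivered by justIN via `JUSTIN_INPUT_FILE`), but the `basename`-with-suffix step is exactly the one used above, and it is what guarantees the result matches the stage's `g4_*.root` output pattern.

```shell
# Walk through the jobscript's output-name derivation with a hypothetical input
INPUT_GEN="/srv/work/gen_protodune_00042.root"   # hypothetical path
BASE=$(basename "${INPUT_GEN}" .root)            # strips directory and .root suffix
OUTFILE="g4_${BASE}.root"
echo "${OUTFILE}"   # g4_gen_protodune_00042.root
```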