Workflow 11935, Stage 2
| Setting | Value |
|---|---|
| Workflow | 11935 |
| Campaign | 336 |
| Priority | 50 |
| Processors | 1 |
| Wall seconds | 28800 (8 h) |
| Image | /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest |
| RSS bytes | 8388608000 (8000 MiB) |
| Max distance for inputs | 0.0 |
| Enabled input RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled output RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
| Enabled sites | BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin |
| Scope | usertests |
| Events for this stage |
Output patterns
| | Destination | Pattern | Lifetime (s) | For next stage | RSE expression |
|---|---|---|---|---|---|
| 1 | Rucio usertests:fnal-w11935s2p1 | *_g4.root | 86400 | True | |
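The `*_g4.root` entry in the table above is a shell-style glob: files the jobscript leaves in the sandbox whose names end in `_g4.root` are the ones uploaded to the Rucio dataset `usertests:fnal-w11935s2p1`. A minimal sketch of the matching, using hypothetical filenames:

```shell
# Hypothetical sandbox contents; only names ending in "_g4.root"
# match the stage's output pattern "*_g4.root".
touch prodgenie_001_gen.root prodgenie_001_g4.root fhicl_bundle.tgz
matched=""
for f in *_g4.root; do
  matched="$matched $f"
done
echo "matched:$matched"
# prints: matched: prodgenie_001_g4.root
```

Because the pattern is matched against whatever the script writes, the `OUTFILE="${BASE}_g4.root"` naming in the jobscript below is what ties its output to this upload rule.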
Environment variables
| Name | Value |
|---|---|
| JOB_FHICL_FILE | standard_g4_dune10kt_1x2x6.fcl |
File states
| Total files | Finding | Unallocated | Allocated | Outputting | Processed | Not found | Failed |
|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Job states
| Total | Submitted | Started | Processing | Outputting | Finished | Notused | Aborted | Stalled | Jobscript error | Outputting failed | None processed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Stats of processed input files, and of uploaded output files, are available as CSV or JSON downloads (up to 10000 files included).
Jobscript
#!/bin/bash
set -e -o pipefail  # fail fast on errors; nounset (-u) is deliberately omitted because the UPS setup scripts reference unset variables
echo "=== G4 job starting ==="
echo "Host: $(hostname)"
echo "PWD(cmd): $(/bin/pwd -P || pwd || true)"
echo "PWD(var): ${PWD:-unset}"
echo "JUSTIN_PATH=${JUSTIN_PATH:-unset}"
echo "JUSTIN_INPUT_FILE=${JUSTIN_INPUT_FILE:-unset}"
# ----------------------------------------------------------------------
# Allocate the input file via justIN (preferred, robust)
# ----------------------------------------------------------------------
if [[ -z "${JUSTIN_PATH:-}" ]]; then
echo "ERROR: JUSTIN_PATH is not set (not running under justIN wrapper?)"
exit 2
fi
alloc="$("$JUSTIN_PATH/justin-get-file")"
if [[ -z "$alloc" ]]; then
echo "No more inputs allocated for this stage; exiting cleanly."
exit 0
fi
# Allocation is a single whitespace-separated line: DID, PFN, RSE
read -r did pfn rse <<< "$alloc"
echo "Allocated input: did=$did pfn=$pfn rse=$rse"
# Prefer the PFN from allocation; fall back to JUSTIN_INPUT_FILE if needed
INPUT_GEN="${pfn:-${JUSTIN_INPUT_FILE:-}}"
if [[ -z "${INPUT_GEN}" ]]; then
echo "ERROR: no input file path available (pfn empty and JUSTIN_INPUT_FILE unset)"
exit 3
fi
echo "GEN input file: ${INPUT_GEN}"
# Report the expected input environment variables
echo "fhicl file: ${JOB_FHICL_FILE:-unset}"
# The justIN wrapper already starts the job in its sandbox directory
# ----------------------------------------------------------------------
# Grab the tarball that has our fhicl files we'll need
# ----------------------------------------------------------------------
FCL_TGZ_URL="https://raw.githubusercontent.com/SFBayLaser/dune-justin/main/bundles/fhicl_bundle.tgz"
curl -fsSL -o fhicl_bundle.tgz "$FCL_TGZ_URL" || { echo "ERROR: failed to download ${FCL_TGZ_URL}"; exit 4; }
tar xzf fhicl_bundle.tgz || { echo "ERROR: failed to unpack fhicl_bundle.tgz"; exit 4; }
# ----------------------------------------------------------------------
# Environment setup
# ----------------------------------------------------------------------
echo "Sourcing DUNE setup..."
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
setup dunesw v10_16_00d00 -q e26:prof
ups active
lar --version
# ----------------------------------------------------------------------
# Point fhicl to our local files
# ----------------------------------------------------------------------
export FHICL_FILE_PATH="$PWD/fhicl:${FHICL_FILE_PATH:-.}"
# ----------------------------------------------------------------------
# Run G4
# ----------------------------------------------------------------------
FCL="${JOB_FHICL_FILE:?JOB_FHICL_FILE is not set}"
# Output filename derived from input
BASE=$(basename "${INPUT_GEN}" .root)
OUTFILE="${BASE}_g4.root"
echo "Running lar G4:"
echo " FCL: ${FCL}"
echo " Input: ${INPUT_GEN}"
echo " Output: ${OUTFILE}"
lar -c "${FCL}" -s "${INPUT_GEN}" -o "${OUTFILE}" -n -1 || { rc=$?; echo "ERROR: lar exited with code $rc"; exit $rc; }
# ----------------------------------------------------------------------
# Sanity checks
# ----------------------------------------------------------------------
test -f "${OUTFILE}" || { echo "ERROR: Output file ${OUTFILE} not produced!"; exit 1; }
ls -lh "${OUTFILE}"
# ----------------------------------------------------------------------
# Tell justIN we successfully processed the allocated input
# ----------------------------------------------------------------------
echo "$did" >> justin-processed-dids.txt
echo "Wrote processed DID: $did"
echo "=== G4 job completed successfully ==="