
Workflow 11846, Stage 1

Workflow                 11846
Campaign                 267
Priority                 50
Processors               1
Wall seconds             14400
Image                    /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes                4194304000 (4000 MiB)
Max distance for inputs  0.0
Enabled input RSEs       CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs      CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites            BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope                    usertests
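
These settings are the ones given when the workflow was submitted. As a rough sketch only (the exact option names should be checked against the justIN documentation, and the jobscript filename here is hypothetical), a Monte Carlo workflow with this configuration might be created with the justin command-line client along the lines of:

justin simple-workflow \
  --monte-carlo 5 \
  --jobscript gen_muminus.jobscript \
  --rss-mb 4000 \
  --wall-seconds 14400 \
  --scope usertests \
  --output-pattern 'gen_*.root'

The MONTECARLO entry under "Enabled input RSEs" is the pseudo-RSE that supplies the Monte Carlo counter files which justin-get-file hands out to jobs.
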
Events for this stage

Output patterns

   Destination                      Pattern     Lifetime  For next stage  RSE expression
1  Rucio usertests:fnal-w11846s1p1  gen_*.root  86400     True

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
5            0        1            0          0           0          0          4

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
41     0          0        0           0           14        0        25       1        0                1                  0

RSEs used

Name        Inputs  Outputs
MONTECARLO  27      0

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)

File reset events, by site

Site          Allocated  Outputting
NL_SURFsara   11         0
US_UCSD       3          1
UK_RAL-Tier1  2          0
NL_NIKHEF     2          0
UK_RAL-PPD    2          0
FR_CCIN2P3    1          0
CA_Victoria   1          0

Jobscript

#!/bin/bash
set -euo pipefail

echo "=== GEN job starting ==="
echo "Host: $(hostname)"
echo "PWD(cmd): $(/bin/pwd -P || pwd || true)"
echo "PWD(var): ${PWD:-unset}"
echo "JUSTIN_JOB_INDEX=${JUSTIN_JOB_INDEX:-unset}"
echo "JUSTIN_PATH=${JUSTIN_PATH:-unset}"

# ----------------------------------------------------------------------
# Get the allocated Monte Carlo "counter file" and remember it
# ----------------------------------------------------------------------
if [[ -z "${JUSTIN_PATH:-}" ]]; then
  echo "ERROR: JUSTIN_PATH is not set (job not running under justIN wrapper?)"
  exit 2
fi

# justin-get-file prints "<did> <pfn> <rse>" for the allocated input, or
# nothing if there is no work left; tolerate a non-zero exit so the empty
# case below still exits cleanly under errexit
alloc="$("$JUSTIN_PATH/justin-get-file" || true)"
if [[ -z "$alloc" ]]; then
  echo "No more inputs allocated (workflow complete). Exiting cleanly."
  exit 0
fi

did="$(echo "$alloc" | awk '{print $1}')"
pfn="$(echo "$alloc" | awk '{print $2}')"
rse="$(echo "$alloc" | awk '{print $3}')"
echo "Allocated input: did=$did pfn=$pfn rse=$rse"

# Use the job index for naming (or derive from pfn if you prefer)
JOB_INDEX="${JUSTIN_JOB_INDEX:-0}"
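
# Hypothetical alternative (not used here): instead of the job index, the
# name could be derived from the allocated counter file itself, assuming the
# MONTECARLO DID ends in a numeric counter after the last "-":
#   counter="${did##*-}"
#   OUTFILE="gen_${counter}.root"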

# ----------------------------------------------------------------------
# Environment setup
# ----------------------------------------------------------------------
echo "Sourcing DUNE setup..."
# setup_dune.sh can reference unset vars, so disable nounset only while sourcing

source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

setup dunesw v10_16_00d00 -q e26:prof

ups active
lar --version

# ----------------------------------------------------------------------
# Run GEN
# ----------------------------------------------------------------------
FCL="prod_muminus_0.1-5.0GeV_isotropic_dune10kt_1x2x6.fcl"
NEVENTS=20
OUTFILE="gen_${JOB_INDEX}.root"

echo "Running lar GEN:"
echo "  FCL: ${FCL}"
echo "  Events: ${NEVENTS}"
echo "  Output: ${OUTFILE}"

lar -c "${FCL}" -n "${NEVENTS}" -o "${OUTFILE}"

# ----------------------------------------------------------------------
# Sanity checks
# ----------------------------------------------------------------------
test -f "${OUTFILE}" || { echo "ERROR: ${OUTFILE} not produced"; exit 1; }
ls -lh "${OUTFILE}"

# ----------------------------------------------------------------------
# Tell justIN we successfully processed the allocated input
# ----------------------------------------------------------------------
echo "$did" >> justin-processed-dids.txt
# (Alternative: echo "$pfn" >> justin-processed-pfns.txt)

echo "Wrote processed DID to justin-processed-dids.txt: $did"
echo "=== GEN job completed successfully ==="

justIN time: 2026-02-06 03:32:03 UTC       justIN version: 01.06.00