
Workflow 12672, Stage 2

Workflow: 12672
Campaign: 1009
Priority: 50
Processors: 1
Wall seconds: 7200
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 60.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

#  Destination                      Pattern    Lifetime  For next stage  RSE expression
1  Rucio usertests:fnal-w12672s2p1  *_g4.root  86400     True
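The Pattern column is a shell-style glob that justIN matches against files the jobscript leaves in its workspace to decide which ones to upload to the destination dataset. A minimal sketch of the same glob match (the filenames below are made up for illustration):

```shell
#!/bin/bash
# Sketch: match candidate output files against the stage's pattern,
# the way a glob selects *_g4.root for upload and skips everything else.
pattern='*_g4.root'
for f in gen_000123_g4.root gen_000123.root RootOutput.log; do
  case "$f" in
    $pattern) echo "$f -> upload" ;;   # unquoted $pattern enables glob matching
    *)        echo "$f -> ignore" ;;
  esac
done
```

Note that only files ending in `_g4.root` are selected; a plain `.root` input or a log file would not be uploaded.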

Environment variables

Name            Value
JOB_FHICL_FILE  standard_g4_dune10kt_1x2x6.fcl

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
20           0        0            0          0           20         0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Not used  Aborted  Stalled  Jobscript error  Outputting failed  None processed
35     0          0        0           0           31        0         0        4        0                0                  0
[Chart: "Files processed", number per bin vs bin start time (Feb-03 09:00 to 11:00), by site: UK_QMUL, UK_Manchester, UK_RAL-Tier1, NL_NIKHEF, UK_RAL-PPD, ES_PIC, UK_Imperial]
[Chart: "Replicas per RSE": DUNE_UK_MANCHESTER_CEPH (70%), RAL-PP (30%)]

RSEs used

Name                     Inputs  Outputs
DUNE_UK_MANCHESTER_CEPH  17      10
RAL-PP                   6       4
NIKHEF                   0       4
DUNE_ES_PIC              0       1
QMUL                     0       1

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)
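The downloadable stats can be summarized with standard tools. A sketch using awk, assuming a CSV with a header row and a per-file processing-time column (the column names and layout below are assumptions, not taken from the actual export):

```shell
#!/bin/bash
# Hypothetical stats CSV: header row, processing time in field 3.
cat > stats.csv <<'EOF'
file,site,processing_seconds
a_gen.root,UK_QMUL,1200
b_gen.root,ES_PIC,1500
EOF

# Skip the header, sum field 3, and report count and mean.
awk -F, 'NR > 1 { s += $3; n++ } END { printf "files=%d mean=%.0f\n", n, s/n }' stats.csv
```

With the JSON variant, the same summary could be done with a JSON-aware tool instead of awk.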

File reset events, by site

Site     Allocated  Outputting
ES_PIC   2          0
UK_QMUL  1          0

Jobscript

#!/bin/bash
set -e  # exit on first error; -u is deliberately omitted, as UPS setup scripts reference unset variables

echo "=== G4 job starting ==="
echo "Host: $(hostname)"
echo "PWD(cmd): $(/bin/pwd -P || pwd || true)"
echo "PWD(var): ${PWD:-unset}"
echo "JUSTIN_PATH=${JUSTIN_PATH:-unset}"
echo "JUSTIN_INPUT_FILE=${JUSTIN_INPUT_FILE:-unset}"

# ----------------------------------------------------------------------
# Allocate the input file via justIN (preferred, robust)
# ----------------------------------------------------------------------
if [[ -z "${JUSTIN_PATH:-}" ]]; then
  echo "ERROR: JUSTIN_PATH is not set (not running under justIN wrapper?)"
  exit 2
fi

alloc="$("$JUSTIN_PATH/justin-get-file")"
if [[ -z "$alloc" ]]; then
  echo "No more inputs allocated for this stage; exiting cleanly."
  exit 0
fi

# justin-get-file emits three whitespace-separated fields: DID, PFN, RSE
read -r did pfn rse <<< "$alloc"
echo "Allocated input: did=$did pfn=$pfn rse=$rse"

# Prefer the PFN from allocation; fall back to JUSTIN_INPUT_FILE if needed
INPUT_GEN="${pfn:-${JUSTIN_INPUT_FILE:-}}"
if [[ -z "${INPUT_GEN}" ]]; then
  echo "ERROR: no input file path available (pfn empty and JUSTIN_INPUT_FILE unset)"
  exit 3
fi

echo "GEN input file: ${INPUT_GEN}"

# Output value of expected input environment variables
echo "fhicl file: ${JOB_FHICL_FILE}"

# justIN starts the jobscript in its own scratch sandbox, so no cd is needed

# ----------------------------------------------------------------------
# Grab the tarball that has our fhicl files we'll need
# ----------------------------------------------------------------------
FCL_TGZ_URL="https://raw.githubusercontent.com/SFBayLaser/dune-justin/main/bundles/fhicl_bundle.tgz"
curl -fSL -o fhicl_bundle.tgz "$FCL_TGZ_URL" \
  || { echo "ERROR: failed to download fhicl bundle from $FCL_TGZ_URL"; exit 4; }
tar xzf fhicl_bundle.tgz \
  || { echo "ERROR: failed to unpack fhicl_bundle.tgz"; exit 4; }

# ----------------------------------------------------------------------
# Environment setup
# ----------------------------------------------------------------------
echo "Sourcing DUNE setup..."
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

setup dunesw v10_16_00d00 -q e26:prof

ups active
lar --version

# ----------------------------------------------------------------------
# Point fhicl to our local files
# ----------------------------------------------------------------------
export FHICL_FILE_PATH="$PWD/fhicl:${FHICL_FILE_PATH:-.}"

# ----------------------------------------------------------------------
# Run G4
# ----------------------------------------------------------------------
FCL=${JOB_FHICL_FILE}

# Output filename derived from input
BASE=$(basename "${INPUT_GEN}" .root)
OUTFILE="${BASE}_g4.root"

echo "Running lar G4:"
echo "  FCL:     ${FCL}"
echo "  Input:   ${INPUT_GEN}"
echo "  Output:  ${OUTFILE}"

lar -c "${FCL}" -s "${INPUT_GEN}" -o "${OUTFILE}" -n -1

# ----------------------------------------------------------------------
# Sanity checks
# ----------------------------------------------------------------------
test -f "${OUTFILE}" || { echo "ERROR: Output file ${OUTFILE} not produced!"; exit 1; }
ls -lh "${OUTFILE}"

# ----------------------------------------------------------------------
# Tell justIN we successfully processed the allocated input
# ----------------------------------------------------------------------
echo "$did" >> justin-processed-dids.txt
echo "Wrote processed DID: $did"

echo "=== G4 job completed successfully ==="
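The allocation-parsing step at the top of the script can be exercised without a justIN wrapper by stubbing `justin-get-file`. A hypothetical local smoke test (the DID, PFN, and RSE values below are invented for illustration):

```shell
#!/bin/bash
# Stub JUSTIN_PATH with a fake justin-get-file so the did/pfn/rse
# split can be checked locally.
tmp="$(mktemp -d)"
cat > "$tmp/justin-get-file" <<'EOF'
#!/bin/sh
echo "usertests:gen_000123.root root://eospublic.cern.ch//eos/path/gen_000123.root CERN_PDUNE_EOS"
EOF
chmod +x "$tmp/justin-get-file"
export JUSTIN_PATH="$tmp"

# Same parsing as the jobscript: one line, three whitespace-separated fields.
alloc="$("$JUSTIN_PATH/justin-get-file")"
read -r did pfn rse <<< "$alloc"
echo "did=$did rse=$rse"

rm -rf "$tmp"
```

Running this prints the parsed DID and RSE, confirming the field split before submitting a real workflow.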
justIN time: 2026-02-04 04:26:08 UTC       justIN version: 01.06.00