Workflow 1917, Stage 1
| Priority | 50 | 
| Processors | 1 | 
| Wall seconds | 80000 | 
| Image | /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest | 
| RSS bytes | 4194304000 (4000 MiB) | 
| Max distance for inputs | 3000.0 | 
| Enabled input RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC | 
| Enabled output RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC | 
| Enabled sites | BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin | 
| Scope | usertests | 
| Events for this stage | 
Output patterns
|   | Destination | Pattern | Lifetime | For next stage | RSE expression | 
|---|---|---|---|---|---|
| 1 | https://fndcadoor.fnal.gov:2880/dune/scratch/users/ichong/fnal/01917/1 | *_caf.root |   |   |   | 
Environment variables
| Name | Value | 
|---|---|
| CODE_TAR_DIR_LOCAL | /cvmfs/fifeuser3.opensciencegrid.org/sw/dune/2bb739fa8328d5424945e0da9e26ee51730d3245 | 
| DUNE_QUALIFIER | e26:prof | 
| DUNE_VERSION | v10_10_00d00 | 
| FCL_FILE | /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/240fabb25ac9c9fd4d5f1b1a3e02a99c0623590d/atm-truth-vtx-reco.fcl | 
| FCL_SECONDARY | cafmaker_atmos_dune10kt_1x2x6_runreco-nuenergy-nuangular_geov5.fcl | 
| NUM_EVENTS | 200 | 
| XML_MASTER | /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/240fabb25ac9c9fd4d5f1b1a3e02a99c0623590d/PandoraSettings_Master_Atmos_DUNEFD_MC.xml | 
| XML_NEUTRINO | /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/240fabb25ac9c9fd4d5f1b1a3e02a99c0623590d/PandoraSettings_Neutrino_Atmos_DUNEFD_MC.xml | 
File states
| Total files | Finding | Unallocated | Allocated | Outputting | Processed | Not found | Failed | 
|---|---|---|---|---|---|---|---|
| 1000 | 0 | 0 | 0 | 0 | 967 | 0 | 33 | 
Job states
| Total | Submitted | Started | Processing | Outputting | Finished | Notused | Aborted | Stalled | Jobscript error | Outputting failed | None processed | 
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1738 | 0 | 0 | 0 | 0 | 1294 | 0 | 186 | 132 | 87 | 39 | 0 | 
 
 
RSEs used
| Name | Inputs | Outputs | 
|---|---|---|
| PRAGUE | 728 | 0 | 
| RAL_ECHO | 151 | 0 | 
| QMUL | 127 | 0 | 
| RAL-PP | 116 | 0 | 
| SURFSARA | 104 | 0 | 
| NIKHEF | 103 | 0 | 
| DUNE_ES_PIC | 51 | 0 | 
| DUNE_FR_CCIN2P3_DISK | 29 | 0 | 
| None | 0 | 967 | 
Stats of processed input files and of uploaded output files are available as CSV or JSON downloads (up to 10000 files included)
File reset events, by site
| Site | Allocated | Outputting | 
|---|---|---|
| CERN | 69 | 1 | 
| UK_RAL-PPD | 37 | 3 | 
| UK_Manchester | 31 | 5 | 
| UK_QMUL | 28 | 1 | 
| CZ_FZU | 28 | 3 | 
| ES_PIC | 27 | 3 | 
| US_FNAL-FermiGrid | 19 | 2 | 
| UK_RAL-Tier1 | 11 | 1 | 
| US_UChicago | 9 | 0 | 
| UK_Lancaster | 9 | 0 | 
| US_FNAL-T1 | 6 | 0 | 
| NL_SURFsara | 6 | 1 | 
| BR_CBPF | 4 | 0 | 
| US_UCSD | 3 | 1 | 
| UK_Oxford | 3 | 0 | 
| NL_NIKHEF | 2 | 0 | 
| UK_Glasgow | 2 | 8 | 
| UK_Bristol | 2 | 0 | 
| UK_Edinburgh | 1 | 0 | 
| UK_Durham | 1 | 0 | 
| US_PuertoRico | 1 | 1 | 
| FR_CCIN2P3 | 0 | 1 | 
Jobscript
#!/bin/bash
:<<'EOF'
This jobscript runs truth-vertex reconstruction on input reco2 ROOT files
using a custom LArSoft setup, followed by an optional CAF-making step.
Required environment variables:
  - FCL_FILE
  - CODE_TAR_DIR_LOCAL
  - DUNE_VERSION
  - DUNE_QUALIFIER
  - XML_MASTER
  - XML_NEUTRINO
  - NUM_EVENTS (optional)
  - FCL_SECONDARY (optional)
EOF
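# Optional sanity check (sketch only, assuming a plain bash environment):
# fail fast if any of the required variables listed above is unset,
# instead of failing later inside lar with a less obvious error.
for required_var in FCL_FILE CODE_TAR_DIR_LOCAL DUNE_VERSION DUNE_QUALIFIER XML_MASTER XML_NEUTRINO; do
  if [ -z "${!required_var}" ]; then
    echo "Required environment variable $required_var is not set; exiting." >&2
    exit 1
  fi
done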
# === DUNE software version and qualifier are taken from the workflow environment ===
# === Number of events option ===
if [ -n "$NUM_EVENTS" ]; then
  events_option="-n $NUM_EVENTS"
fi
# === Get a file from justIN ===
did_pfn_rse=$($JUSTIN_PATH/justin-get-file)
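# justin-get-file returns the allocated file as a space-separated "DID PFN RSE" line,
# or an empty string if no file is available for this job.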
if [ -z "$did_pfn_rse" ]; then
  echo "No file assigned. Exiting jobscript."
  exit 0
fi
# === Track input DID for MetaCat ===
echo "$did_pfn_rse" | cut -f1 -d' ' >> all-input-dids.txt
# === Extract the PFN (second field of the did/pfn/rse line) ===
pfn=$(echo "$did_pfn_rse" | cut -d' ' -f2)
echo "Input PFN = $pfn"
# === Setup DUNE software ===
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
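# setup_dune.sh initialises the DUNE software environment so that 'setup dunesw ...' is available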
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
# === Mirror CODE_TAR_DIR_LOCAL ===
INPUT_TAR_DIR_LOCAL="$CODE_TAR_DIR_LOCAL"
echo "INPUT_TAR_DIR_LOCAL = $INPUT_TAR_DIR_LOCAL"
# === Setup custom code ===
if [ -n "$CODE_TAR_DIR_LOCAL" ]; then
  echo "Using local products from $CODE_TAR_DIR_LOCAL"
  source "$CODE_TAR_DIR_LOCAL/larsoft_truth_vtx_1010/localProducts_larsoft_v10_10_00_e26_prof/setup-grid"
  mrbslp
fi
# === Generate a common timestamp for output naming ===
timestamp=$(date -u +"%Y-%m-%dT_%H%M%SZ")
# === Output file naming ===
fname=$(basename "$pfn" .root)
outFile="${fname}_truthvtx_${timestamp}.root"
logFile="truthvtx_${timestamp}_vtx.log"
# === Set FW search path ===
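# FW_SEARCH_PATH is searched by LArSoft/Pandora when resolving the XML settings filenames given in the FHiCL configuration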
XML_DIR_MASTER=$(dirname "$XML_MASTER")
XML_DIR_NEUTRINO=$(dirname "$XML_NEUTRINO")
export FW_SEARCH_PATH="$XML_DIR_MASTER:$XML_DIR_NEUTRINO:$FW_SEARCH_PATH"
echo "FW_SEARCH_PATH = $FW_SEARCH_PATH"
# === Run lar (primary) ===
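# Preloading the XRootD POSIX shim lets lar open root:// PFNs through ordinary POSIX I/O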
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "Running LArSoft with FCL: $FCL_FILE"
lar -c "$FCL_FILE" $events_option -o "$outFile" -s "$pfn" > "$logFile" 2>&1
larExit=$?
###################run caf ###################
if [ -n "$FCL_SECONDARY" ]; then
  caf_timestamp=$(date -u +"%Y-%m-%dT_%H%M%SZ")
  secondary_out="caf_${fname}_truthvtx_${caf_timestamp}.root"
  secondary_log="caf_${fname}_truthvtx_${caf_timestamp}.log"
  echo "Running CAF step with FCL_SECONDARY: $FCL_SECONDARY using reco output: $outFile"
  lar -c "$FCL_SECONDARY" $events_option -o "$secondary_out" -s "$outFile" > "$secondary_log" 2>&1
  # === Rename CAF output root file to include timestamp if needed ===
  new_caf_out="caf_truthvtx_${caf_timestamp}_caf.root"
  mv caf.root "$new_caf_out"
  echo "Renamed caf.root -> $new_caf_out"
fi
# === Show primary lar log tail ===
echo '=== Start last 100 lines of lar log file ==='
tail -100 "$logFile"
echo '=== End last 100 lines of lar log file ==='
# === Show CAF log tail, if the CAF step ran ===
if [ -n "$FCL_SECONDARY" ] && [ -f "$secondary_log" ]; then
  echo '=== Start last 100 lines of CAF log file ==='
  tail -100 "$secondary_log"
  echo '=== End last 100 lines of CAF log file ==='
fi
# === Mark processed ===
if [ $larExit -eq 0 ]; then
  echo "$pfn" > justin-processed-pfns.txt
  jobscriptExit=0
else
  jobscriptExit=1
fi
# === Package logs ===
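# The '@' in JUSTIN_JOBSUB_ID is replaced with '_' so the tarball name contains no special characters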
tar zcf "${JUSTIN_JOBSUB_ID//[@]/_}.logs.tgz" *.log
# === Display output summary ===
echo "=== Generated output files ==="
ls -1 *.* 2>/dev/null | grep -v 'all-input-dids.txt' || echo "No output files found."
exit $jobscriptExit