
Workflow 12034, Stage 1

Workflow: 12034
Campaign: 433
Priority: 50
Processors: 1
Wall seconds: 80000
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 0.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
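
For reference, a stage with parameters like these is normally defined when the workflow is submitted with the justin command-line client. The sketch below is only illustrative: the option names, the MQL dataset query, the jobscript file name, and the destination placeholder are assumptions, not values taken from this page, and should be checked against justin --help for the client version in use.

# Hypothetical submission sketch; option names and the MQL query are assumptions
justin simple-workflow \
  --mql "files from usertests:SOME_INPUT_DATASET" \
  --jobscript-file trigger-ana.jobscript \
  --env DUNE_VERSION=v10_14_01d00 \
  --env NUM_EVENTS=100 \
  --rss-mb 4000 \
  --wall-seconds 80000 \
  --max-distance 0.0 \
  --scope usertests \
  --output-pattern 'triggerAna_*.ntuple.root:<webdav destination URL>'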
Events for this stage

Output patterns

1  Destination: https://fndcadoor.fnal.gov:2880/dune/scratch/users/jierans/vd_1x8x14_radiologicals_with_pds_20260118/ntuple/fnal/12034/1
   Pattern: triggerAna_*.ntuple.root
   Lifetime: -   For next stage: -   RSE expression: -
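
The Pattern column is a filename glob that justIN matches against files left in the job's working directory to decide what to upload to the Destination. The triggerAna_*.ntuple.root pattern is written to match the anaFile name built in the jobscript below; a minimal check of that match, using a made-up timestamp, looks like this:

# Hypothetical example: the timestamp is invented, only the name format matters
anaFile=triggerAna_12034_2026-01-18T_043000Z.ntuple.root
case "$anaFile" in
  triggerAna_*.ntuple.root) echo "matches the stage output pattern" ;;
  *) echo "would not be picked up as output" ;;
esac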

Environment variables

Name                 Value
DUNE_VERSION         v10_14_01d00
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser4.opensciencegrid.org/sw/dune/02ee0029a45e954a9b96a00f1f61b0693e0f0c49
NUM_EVENTS           100
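
justIN injects these variables into the job's environment before the jobscript runs, and the jobscript reads them with ${VAR:-default} fallbacks. To exercise the jobscript by hand (assuming an interactive node with CVMFS mounted), the same values can simply be exported first:

# Reproduce the stage's environment variables for a local test of the jobscript
export DUNE_VERSION=v10_14_01d00
export INPUT_TAR_DIR_LOCAL=/cvmfs/fifeuser4.opensciencegrid.org/sw/dune/02ee0029a45e954a9b96a00f1f61b0693e0f0c49
export NUM_EVENTS=100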

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        996          0          0           4          0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
2435   0          0        0           0           941       1494     0        0        0                0                  0
[Chart: Files processed per time bin, Jan-18 04:00 to 07:00 UTC; contributing sites: CERN, UK_QMUL, UK_RAL-Tier1]
[Chart: Replicas per RSE: FNAL_DCACHE 49%, DUNE_US_FNAL_DISK_STAGE 49%; DUNE_US_BNL_SDCC, RAL_ECHO, QMUL, DUNE_CERN_EOS, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH 0% each]

RSEs used

Name           Inputs  Outputs
RAL_ECHO       2       0
DUNE_CERN_EOS  1       0
QMUL           1       0
None           0       4

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)

Jobscript

#!/bin/bash
# fcl file and DUNE software version/qualifier to be used
# FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/my_code/fcls/my_reco.fcl}
DUNE_VERSION=${DUNE_VERSION:-v10_14_01d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
echo "sw version: ${DUNE_VERSION} ${DUNE_QUALIFIER}"
echo "input tarball location:" ${INPUT_TAR_DIR_LOCAL}

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ]; then
  events_option="-n $NUM_EVENTS"
fi

# First get an unprocessed file from this stage
did_pfn_rse=$($JUSTIN_PATH/justin-get-file)

if [ "$did_pfn_rse" = "" ]; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# justin-get-file returns one space-separated "DID PFN RSE" line
did=$(echo "$did_pfn_rse" | cut -f1 -d' ')
pfn=$(echo "$did_pfn_rse" | cut -f2 -d' ')
rse=$(echo "$did_pfn_rse" | cut -f3 -d' ')
echo "Input DID = $did"
echo "Input PFN = $pfn"
echo "Input RSE = $rse"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
export PRODUCTS="$(echo ${INPUT_TAR_DIR_LOCAL}/dunesw-${DUNE_VERSION}/localProducts_larsoft*_prof/):$PRODUCTS"
# Then we can set up our local products
setup duneana "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"

# Build timestamped output file names (fname is derived from the input $pfn but not used below)
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=$(echo $pfn | awk -F/ '{print $NF}')
fname=$(echo $Ffname | awk -F. '{print $1}')
campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"

outFile=tpg_${JUSTIN_WORKFLOW_ID}_${now}.root
anaFile=triggerAna_${JUSTIN_WORKFLOW_ID}_${now}.ntuple.root

# Here is where the LArSoft command is called
(
  # Do the scary preload stuff in a subshell so LD_PRELOAD does not leak out
  export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
  echo "$LD_PRELOAD"
  # TPG - PDS + Ana
  lar -c triggersim_triggerana_dunefdvd_1x8x14_pds.fcl $events_option -s "$pfn" -o "$outFile" -T "$anaFile"
)

# Subshell exits with exit code of last command
larExit=$?
echo "lar exit code $larExit"

if [ $larExit -eq 0 ]; then
  # Success !
  echo "$pfn" >justin-processed-pfns.txt
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi

# Create compressed tar file with all log files
tar zcf $(echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g') *.log
exit $jobscriptExit
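
The only justIN-specific dependency in the script is $JUSTIN_PATH/justin-get-file, which prints a single "DID PFN RSE" line, or nothing when no unprocessed file is available. A rough way to dry-run the jobscript outside justIN (assuming a machine with CVMFS, and with the script saved locally as jobscript.sh, a hypothetical name) is to point JUSTIN_PATH at a stub; the DID, PFN, and RSE below are placeholders, not real files:

# Hypothetical dry run: stub out justin-get-file with a fake "DID PFN RSE" line
mkdir -p stub
cat > stub/justin-get-file <<'EOF'
#!/bin/sh
echo "usertests:example.root root://eospublic.cern.ch//eos/example.root DUNE_CERN_EOS"
EOF
chmod +x stub/justin-get-file

JUSTIN_PATH=$PWD/stub \
JUSTIN_WORKFLOW_ID=12034 JUSTIN_STAGE_ID=1 JUSTIN_JOBSUB_ID=local.test.0 \
NUM_EVENTS=10 \
  bash jobscript.sh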

justIN time: 2026-02-04 04:27:46 UTC       justIN version: 01.06.00