
Workflow 9880, Stage 1

Workflow: 9880
Priority: 50
Processors: 1
Wall seconds: 14600
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
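As a quick sanity check on the resource numbers above (a minimal sketch in plain bash; the values are copied from the table):

# RSS limit: 4000 MiB expressed in bytes matches the "RSS bytes" value shown above
echo $(( 4000 * 1024 * 1024 ))                             # 4194304000
# Wall-time limit, roughly, in hours and minutes
echo "$(( 14600 / 3600 ))h $(( (14600 % 3600) / 60 ))m"    # about 4h 3m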
Events for this stage

Output patterns

   Destination                                                              Pattern     Lifetime  For next stage  RSE expression
1  https://fndcadoor.fnal.gov:2880/dune/scratch/users/dmunoz/fnal/09880/1  *.root
2  https://fndcadoor.fnal.gov:2880/dune/scratch/users/dmunoz/fnal/09880/1  *.logs.tgz
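The Pattern column holds shell-style globs; justIN matches them against the files left in the job's working directory and uploads the matches to the Destination URL. A minimal illustration with made-up filenames in the style the jobscript below produces:

# Made-up example filenames only
for f in run5834_calo_2025-11-07T_020000Z.root 12345.0_justin.logs.tgz calibration.log ; do
  case "$f" in
    *.root)     echo "$f -> uploaded via pattern 1 (*.root)" ;;
    *.logs.tgz) echo "$f -> uploaded via pattern 2 (*.logs.tgz)" ;;
    *)          echo "$f -> not uploaded" ;;
  esac
done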

Environment variables

Name     Value
fcl_dir  /cvmfs/fifeuser4.opensciencegrid.org/sw/dune/441105b7476dff4b40bea289993f7b86e1e284d9
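justIN exports each name/value pair into the job's environment, which is how the jobscript below sees fcl_dir; a minimal sketch of the fallback idiom it uses (FCL_GEN and caloinfo.fcl are the jobscript's own names):

# fcl_dir comes from the environment-variable table above
FCL_GEN=${FCL_GEN:-$fcl_dir/caloinfo.fcl}    # default to caloinfo.fcl inside fcl_dir
echo "Path to fcl: $FCL_GEN"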

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
10           0        0            0          0           10         0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
15     0          0        0           0           14        0        0        1        0                0                  0
[Chart: Files processed per bin (bin start times Nov-07 02:00 to Nov-07 04:00), by site: NL_SURFsara, UK_QMUL, UK_Manchester, FR_CCIN2P3, ES_PIC]

[Chart: Replicas per RSE - FNAL_DCACHE (33%), DUNE_ES_PIC (33%), SURFSARA (10%), DUNE_UK_GLASGOW (6%), DUNE_FR_CCIN2P3_DISK (3%), NIKHEF (3%), RAL_ECHO (3%), RAL-PP (3%), DUNE_CERN_EOS (3%)]

RSEs used

Name             Inputs  Outputs
DUNE_ES_PIC      3       0
DUNE_UK_GLASGOW  2       0
SURFSARA         2       0
RAL_ECHO         1       0
RAL-PP           1       0
DUNE_CERN_EOS    1       0
None             0       20

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)
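A quick way to sanity-check the downloads against the tables above (a minimal sketch; the filenames are placeholders for wherever you save the CSVs, and a one-line header row is assumed):

head -1 processed-input-files.csv                  # column names
tail -n +2 processed-input-files.csv | wc -l       # input-file records (10 for this stage)
tail -n +2 uploaded-output-files.csv | wc -l       # output-file records (20 for this stage)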

Jobscript

#!/bin/bash

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi

# First get an unprocessed file from this stage
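# justin-get-file prints one line per allocated file, "did pfn rse" separated by spaces;
# empty output means there is nothing left to process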
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

if [ "$did_pfn_rse" = "" ] ; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
if [ "$pfn" = "" ] || [ "$pfn" = "000001" ]; then
  echo "No valid input file found. Exiting."
  exit 0
fi

# Setup DUNE environment

DUNE_VERSION=${DUNE_VERSION:-v09_91_03d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}

source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
# export PRODUCTS="${INPUT_TAR_DIR_LOCAL}/localProducts_larsoft_v09_91_03d00_e26_prof/:$PRODUCTS" #working
#setup protoduneana "$DUNE_VERSION" -q "$DUNE_QUALIFIER" # working
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
setup justin
setup metacat

# mrbslp

FCL_GEN=${FCL_GEN:-$fcl_dir/caloinfo.fcl}   # FCL file for this stage; set FCL_GEN to override
echo "Path to fcl: "
echo $FCL_GEN

# Construct outFile from input $pfn 
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"

outFile_GEN=${fname}_calo_${now}.root

# Here is where the LArSoft command is called
(
# Do the scary preload stuff in a subshell!
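# The XRootD POSIX preload library lets ordinary POSIX I/O open remote (e.g. root://) replicas;
# exporting it only inside this subshell keeps the preload from affecting later commands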
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"


# Honour NUM_EVENTS via $events_option if it was set; otherwise process all events (-n -1)
lar -c $FCL_GEN ${events_option:--n -1} -o temp.root -s $pfn > ${fname}_calo_${now}.log 2>&1
larStatus=$?   # remember the lar exit code before the rename steps below

if [ -f "temp.root" ]; then
    echo "Found temp.root - renaming..."
    mv temp.root "$outFile_GEN"
elif [ -f "caloinfo_output_b.root" ]; then
    echo "Found caloinfo_output_b.root - renaming..."
    mv caloinfo_output_b.root "$outFile_GEN"
else
    echo "No matching root file found!"
fi

# Propagate the lar exit code as this subshell's exit status
exit $larStatus
)

# The subshell's exit status is the lar exit code captured above
larExit=$?
echo "lar exit code $larExit"

echo '=== Start last 50 lines of lar log file ==='
tail -50 ${fname}_calo_${now}.log
echo '=== End last 50 lines of lar log file ==='

if [ $larExit -eq 0 ] ; then
  # Success !
  echo "$pfn" > justin-processed-pfns.txt
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi

# Create compressed tar file with all log files 
# tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
tar zcf "$(echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g')" *.log *-pnfs.txt
exit $jobscriptExit
