
Workflow 1035, Stage 1

Priority 50
Processors 1
Wall seconds 80000
Image /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes 2097152000 (2000 MiB)
Max distance for inputs 0.0
Enabled input RSEs CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope usertests
Events for this stage

Output patterns

  Destination                                                               Pattern     Lifetime  For next stage  RSE expression
1 https://fndcadoor.fnal.gov:2880/dune/scratch/users/ayankele/fnal/01035/1  *eco*.root

Environment variables

Name                 Value
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser4.opensciencegrid.org/sw/dune/7610af3dffce908cd92377912a9d2dbbc399c94f
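The value points at a tarball of locally built code published to the RCDS cvmfs area. As a minimal sketch of how such a value is typically produced at workflow-creation time (the upload command and --env flag spellings follow the justIN tutorial but should be treated as assumptions; check the justIN docs):

  # Publish the local build to cvmfs via RCDS and capture the resulting path
  INPUT_TAR_DIR_LOCAL=$(justin-cvmfs-upload my_local_products.tar)

  # Hand it to the jobscript when creating the workflow
  justin simple-workflow ... --env INPUT_TAR_DIR_LOCAL="$INPUT_TAR_DIR_LOCAL"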

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        0            0          0           1000       0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
1536   0          0        0           0           1408     0        0        128      0                0                  0
[Chart: Files processed. Number of files processed per bin vs. bin start time, Aug-16 00:00 to 02:00 UTC, broken down by site: NL_SURFsara, UK_QMUL, UK_RAL-Tier1, UK_RAL-PPD, ES_PIC, CZ_FZU]
[Chart: Replicas per RSE: FNAL_DCACHE (38%), PRAGUE (38%), NIKHEF (4%), RAL_ECHO (4%), SURFSARA (4%), RAL-PP (3%), QMUL (3%), DUNE_FR_CCIN2P3_DISK (1%), DUNE_ES_PIC (0%)]

RSEs used

Name          Inputs  Outputs
PRAGUE        566     0
RAL_ECHO      110     0
SURFSARA      109     0
RAL-PP        103     0
QMUL          101     0
DUNE_ES_PIC   11      0
None          0       977

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)
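The stats downloads are plain CSV/JSON, so they can be summarized with standard tools. A minimal sketch, assuming the input-file stats were saved as input-stats.csv with a header row (the "site" column name below is hypothetical; check the actual header of your download):

  # Count rows per value of a named column, e.g. a hypothetical "site" column
  awk -F, -v col=site '
    NR==1 { for (i = 1; i <= NF; i++) if ($i == col) c = i; next }
    { n[$c]++ }
    END { for (v in n) print v, n[v] }
  ' input-stats.csv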

Jobscript

#!/bin/bash
:<<'EOF'

To use this jobscript to process 5 files from the dataset
fardet-hd__fd_mc_2023a_reco2__full-reconstructed__v09_81_00d02__standard_reco2_dune10kt_nu_1x2x6__prodgenie_nu_dune10kt_1x2x6__out1__validation,
put the output in the $USER namespace (MetaCat), and save the output in
/scratch, use this command to create the workflow:

justin simple-workflow \
--mql \
"files from fardet-hd:fardet-hd__fd_mc_2023a__hit-reconstructed__v09_78_01d01__standard_reco1_dune10kt_1x2x6__prodgenie_nu_dune10kt_1x2x6__out1__v1_official limit 5  ordered"\
--jobscript submit_ana.jobscript --rss-mb 4000 \
--scope higuera --output-pattern '*_myreco2_*.root:$FNALURL/$USERF' 

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER 

EOF

# fcl file and DUNE software version/qualifier to be used
FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/recoenergys.fcl}
DUNE_VERSION=${DUNE_VERSION:-v10_03_01d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
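# (Assumption: these defaults are typically overridden when the workflow/stage
#  is created, e.g. via justIN's --env option: --env FCL_FILE=myreco.fcl
#  --env NUM_EVENTS=10. Check "justin --help" for the exact spelling.)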

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi

# First get an unprocessed file from this stage
#did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

#if [ "$did_pfn_rse" = "" ] ; then
#  echo "Nothing to process - exit jobscript"
#  exit 0
#fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
#echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
#pfn=`echo $did_pfn_rse | cut -f2 -d' '`
#echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
export PRODUCTS="${INPUT_TAR_DIR_LOCAL}/localProducts_larsoft_v10_03_01_e26_prof/:$PRODUCTS"
# Then we can set up our local products
setup duneana "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"

campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"

# Exit code for the whole jobscript: assume failure until at least one
# input file has been processed successfully
jobscriptExit=1

for nf in {1..15}; do
    DID_PFN_RSE=$($JUSTIN_PATH/justin-get-file)

    # Stop if no file was returned: there is nothing left to process
    if [ "${DID_PFN_RSE}" = "" ]; then
        echo "justin-get-file returned nothing - no more files to process"
        break
    fi

    pfn=$(echo ${DID_PFN_RSE} | cut -f2 -d' ')
    DID=$(echo ${DID_PFN_RSE} | cut -f1 -d' ')
    
    echo ${DID} >> did.list
    echo ${pfn} >> file.list

    # Construct outFile from the input $pfn
    now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
    Ffname=$(basename "$pfn")
    fname=${Ffname%%.*}
    outFile=${fname}_reco2_${now}.root
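    # Example with a hypothetical input name:
    #   root://host//path/myfile_123.root -> myfile_123_reco2_2025-08-16T_140000Z.root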

    # Here is where the LArSoft command is called
    (
    # Do the scary preload stuff in a subshell!
    export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
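    # libXrdPosixPreload intercepts POSIX file calls so that xrootd (root://)
    # paths passed via -s work with code doing plain open()/read(); running
    # in a subshell keeps LD_PRELOAD from leaking into the rest of the script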
    echo "$LD_PRELOAD"

    source ${INPUT_TAR_DIR_LOCAL}/localProducts_larsoft_v10_03_01_e26_prof/setup-grid
    mrbslp
    ln -s ${INPUT_TAR_DIR_LOCAL}/build_slf7.x86_64/dunereco/fcl/* .
    export CET_PLUGIN_PATH=${INPUT_TAR_DIR_LOCAL}/build_slf7.x86_64/dunereco/slf7.x86_64.e26.prof/lib:${CET_PLUGIN_PATH}
    
    lar -c $FCL_FILE $events_option -s ${pfn} > ${fname}_ana_${now}.log 2>&1
    )
    # Capture the subshell's exit code (lar's exit code, since lar is the
    # last command in the subshell) before any other command overwrites $?
    larExit=$?
    echo "lar exit code $larExit"

    echo '=== Start last 20 lines of lar log file ==='
    tail -20 ${fname}_ana_${now}.log
    echo '=== End last 20 lines of lar log file ==='

    if [ $larExit -eq 0 ] ; then
      # Success !
      echo "$pfn" &>> justin-processed-pfns.txt
      jobscriptExit=0
    fi
done

# Create compressed tar file with all log files 
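# (jobsub IDs contain "@", which sed replaces with "_" to keep the filename
#  filesystem-friendly, e.g. 12345.0@jobsub01.fnal.gov ->
#  12345.0_jobsub01.fnal.gov.logs.tgz)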
tar zcf $(echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g') *.log
exit $jobscriptExit