
Workflow 203, Stage 1

Priority: 50
Processors: 1
Wall seconds: 3600
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

#  Destination                                                                Pattern             Lifetime  For next stage  RSE expression
1  https://fndcadoor.fnal.gov:2880/dune/scratch/users/lwhite86/fnal/00203/1   trainingFile*.root  -         -               -

Environment variables

Name                 Value
INPUT_CODE           /cvmfs/fifeuser4.opensciencegrid.org/sw/dune/312db13e9faabde6e04103f62be51522e0a67928
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/92d206ee2f0bcfa890f8c4180a10185439a23475

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        0            0          0           991        0          9

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
2927   0          0        0           0           2861      0        1        0        65               0                  0
[Plot: "Files processed" - number of files processed per time bin (bin start times Aug-01 09:00 to Aug-01 12:00), broken down by site: US_PuertoRico, NL_SURFsara, UK_QMUL, UK_Manchester, UK_RAL-Tier1, UK_Lancaster, NL_NIKHEF, UK_RAL-PPD, US_UChicago, US_Wisconsin, CZ_FZU, US_FNAL-FermiGrid, US_UCSD, US_FNAL-T1]
[Pie chart: "Replicas per RSE" - DUNE_US_FNAL_DISK_STAGE (44%), FNAL_DCACHE (44%), RAL-PP (3%), RAL_ECHO (2%), QMUL (1%), NIKHEF (0%), PRAGUE (0%), DUNE_ES_PIC (0%), DUNE_UK_LANCASTER_CEPH (0%), SURFSARA (0%), DUNE_FR_CCIN2P3_DISK (0%)]

RSEs used

Name                     Inputs  Outputs
DUNE_US_FNAL_DISK_STAGE  781     0
RAL-PP                   74      0
DUNE_ES_PIC              59      0
RAL_ECHO                 52      0
QMUL                     29      0
PRAGUE                   24      0
NIKHEF                   20      0
DUNE_UK_LANCASTER_CEPH   7       0
SURFSARA                 7       0
DUNE_FR_CCIN2P3_DISK     3       0
None                     0       991

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)
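
A minimal command-line sketch for a quick look at the downloaded stats, assuming the processed-input CSV has been saved locally as wf203-stage1-inputs.csv (a hypothetical filename; the exact columns depend on the justIN version):

  head -1 wf203-stage1-inputs.csv              # show the column headers
  tail -n +2 wf203-stage1-inputs.csv | wc -l   # count the files listed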

Jobscript

#!/bin/bash
:<<'EOF'

To use this jobscript to process 10 files from the dc4-vd-coldbox-bottom
data, putting the output into the usertests namespace (MetaCat) and scope
(Rucio) and into the usertests:output-test-01 dataset in both MetaCat and
Rucio, use this command to create the workflow:

justin simple-workflow \
--mql \
"files from dune:all where core.run_type='dc4-vd-coldbox-bottom' and dune.campaign='dc4' limit 10" \
--jobscript dc4-vd-coldbox-bottom.jobscript --max-distance 30 --rss-mb 4000 \
--scope usertests --output-pattern '*_reco_data_*.root:output-test-01' 

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER 

EOF
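
# For example (a sketch assuming the "--env NAME=VALUE" option of the justin
# command, as used in the DUNE justIN tutorials), the optional variables listed
# above could be set when the workflow is created:
#
#   justin simple-workflow ... \
#     --env NUM_EVENTS=10 \
#     --env DUNE_VERSION=v10_02_02d00 --env DUNE_QUALIFIER=e26:prof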

# fcl file and DUNE software version/qualifier to be used
FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/runPandora.fcl}
DUNE_VERSION=${DUNE_VERSION:-v10_02_02d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
FW_SEARCH_PATH=$FW_SEARCH_PATH:$INPUT_TAR_DIR_LOCAL
echo $FW_SEARCH_PATH

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi
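# e.g. NUM_EVENTS=10 gives events_option="-n 10", which is passed straight to
# lar below; if NUM_EVENTS is unset, all events in the input file are processed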

# First get an unprocessed file from this stage
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

if [ "$did_pfn_rse" = "" ] ; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

# Use the local larpandoracontent build shipped via INPUT_CODE: copy it into
# the job directory and prepend that directory to PRODUCTS so it takes
# precedence over the version in the dunesw release
#cp -r $INPUT_CODE/* .
#ls -lhrt 
#ls $INPUT_CODE
#echo $PRODUCTS
#PRODUCTS=$INPUT_CODE:${PRODUCTS}
cp -r $INPUT_CODE/larpandoracontent .
PRODUCTS=`pwd`:$PRODUCTS
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
echo $PRODUCTS
ups active
echo "LArPandoraContent: ${LARPANDORACONTENT_DIR}"
export OMP_NUM_THREADS=${JUSTIN_PROCESSORS} 
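
# (Sketch, not part of the original flow) A defensive check here could confirm
# that the dunesw setup actually provided lar, so a broken setup is reported as
# a clear jobscript error rather than a confusing failure later on:
#   if ! command -v lar >/dev/null 2>&1 ; then
#     echo "dunesw setup failed - lar not found" ; exit 1
#   fi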

# Derive the base filename (fname) from the input $pfn; it is used below to
# name the training output file
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
#outFile=${fname}_reco_data_${now}.root

campaign="justIN.r${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"
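
# Note: $now and $campaign are only referenced by the commented-out
# output-naming and metadata-extraction steps below.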

(
# Do the scary preload stuff in a subshell!
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"

#lar -c $FCL_FILE $events_option -o $outFile "$pfn" > ${fname}_reco_${now}.log 2>&1
lar -c $FCL_FILE $events_option "$pfn" 
#> ${fname}_training_${now}.log 2>&1
)

# Subshell exits with exit code of last command
larExit=$?
echo "lar exit code $larExit"
echo "$pfn" > justin-processed-pfns.txt
mv eventClassificationTraining.root trainingFile_${fname}.root
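
# A more defensive variant (sketch only; not what this stage does) would mark
# the PFN as processed and rename the output only if lar succeeded and the
# expected training file exists:
#   if [ $larExit -eq 0 -a -f eventClassificationTraining.root ] ; then
#     echo "$pfn" > justin-processed-pfns.txt
#     mv eventClassificationTraining.root trainingFile_${fname}.root
#   fi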

# LEIGH: No ART files so the below isn't needed?
#jobscriptExit=1
#if [ $larExit -eq 0 ] ; then
#  # write metadata file if lar succeeded
#  extractor_prod.py --infile "$outFile" --no_crc --appname reco \
#    --appversion ${DUNE_VERSION} --appfamily art \
#    --campaign ${campaign} > $outFile.ext.json  
#  extractorExit=$?
#  echo "extractor_prod.py exit code $extractorExit"
#
#  # Run pdjson2meta. THIS SHOULD MOVE TO SOMEWHERE LIKE duneutil ?
#  /cvmfs/dune.opensciencegrid.org/products/dune/justin/pro/NULL/jobutils/pdjson2metadata \
#     $outFile.ext.json all-input-dids.txt > $outFile.json
#  p2mExit=$?
#  echo "pdjson2metadata exit code $p2mExit"
#
#  if [ $extractorExit -eq 0 -a $p2mExit -eq 0 ] ; then
#    echo "Metadata extraction succeeds"
#    echo "$pfn" > justin-processed-pfns.txt
#    echo "===Metadata JSON==="
#    cat $outFile.json
#    echo
#    echo "==================="
#    jobscriptExit=0
#  fi
#fi

ls -l

# Create compressed tar file with all log files 
tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
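# Note: with the lar stdout/stderr redirect above left commented out, no *.log
# files are produced, so tar has nothing matching *.log to archive; uncommenting
# the redirect on the lar line restores per-job logs in this tarball.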
#exit $jobscriptExit
exit $larExit

justIN time: 2025-08-04 14:07:54 UTC       justIN version: 01.04.00