
Workflow 12137, Stage 1

Workflow: 12137
Campaign: 531
Priority: 50
Processors: 1
Wall seconds: 80000
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

   Destination                                                                                     Pattern     Lifetime  For next stage  RSE expression
1  https://fndcadoor.fnal.gov:2880/dune/scratch/users/galli/my_atm_prod/justin_jobs/fnal/12137/1   *ana*.root
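The single pattern uploads any job output matching *ana*.root to the dune scratch destination URL shown; no lifetime, next-stage flag, or RSE expression is set, which is consistent with the RSEs used table further down, where all 177 outputs are recorded under "None" rather than a Rucio-managed RSE.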

Environment variables

Name                Value
CODE_TAR_DIR_LOCAL  /cvmfs/fifeuser3.opensciencegrid.org/sw/dune/f2aa0933d6b0302a87f8b92969147394234f3bf2
DUNE_QUALIFIER      e26:prof
DUNE_VERSION        v09_91_04d00
FCL_FILE            /cvmfs/fifeuser4.opensciencegrid.org/sw/dune/dbbb86d6cbf5ec721b2c78b37b41564b2cfad69b/scripts/run_MyAnalysis.fcl
HAS_ART_OUTPUT      false
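These values override the defaults in the jobscript below. As a rough sketch, they could be supplied when the workflow is created, assuming justin simple-workflow accepts repeated --env NAME=VALUE options (that flag is an assumption here; the remaining options are copied from the jobscript's own example, and my-analysis.jobscript is a placeholder name):

justin simple-workflow \
  --mql "files from justin-tutorial:justin-tutorial-2024 limit 10" \
  --jobscript my-analysis.jobscript --max-distance 30 --rss-mb 4000 \
  --scope usertests \
  --env FCL_FILE=/cvmfs/fifeuser4.opensciencegrid.org/sw/dune/dbbb86d6cbf5ec721b2c78b37b41564b2cfad69b/scripts/run_MyAnalysis.fcl \
  --env DUNE_VERSION=v09_91_04d00 \
  --env DUNE_QUALIFIER=e26:prof \
  --env HAS_ART_OUTPUT=false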

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        822          0          0           177        0          1
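The per-state counts are consistent with the total: 822 unallocated + 177 processed + 1 failed = 1000 files.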

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
192    0          0        0           0           177       4        0        0        11               0                  0
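Likewise for jobs: 177 finished + 4 not used + 11 jobscript errors = 192 in total; the 11 jobscript errors reflect repeated attempts on input files, of which only 1 file ultimately ended up in the Failed column above.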
[Chart: Files processed per bin (number per bin vs bin start time, Jan-20 19:00 to Jan-21 07:00), broken down by site: NL_SURFsara, UK_Oxford, UK_Sheffield, UK_RAL-Tier1, UK_Lancaster, NL_NIKHEF, UK_RAL-PPD, UK_Imperial, CZ_FZU, CERN, IT_CNAF, UK_QMUL, UK_Brunel, ES_PIC]
[Chart: Replicas per RSE: PRAGUE (28%), FNAL_DCACHE (28%), DUNE_ES_PIC (28%), NIKHEF (3%), SURFSARA (3%), RAL_ECHO (2%), QMUL (2%), RAL-PP (2%), DUNE_FR_CCIN2P3_DISK (0%)]

RSEs used

Name                  Inputs  Outputs
PRAGUE                76      0
SURFSARA              22      0
RAL_ECHO              21      0
RAL-PP                21      0
NIKHEF                20      0
QMUL                  20      0
DUNE_ES_PIC           6       0
DUNE_FR_CCIN2P3_DISK  2       0
None                  0       177
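The "None" row accounts for the 177 output files, which were uploaded to the scratch URL in the output pattern rather than to a Rucio-managed RSE. On the input side, 188 replicas were fetched in total, matching the 177 files processed plus the 11 jobs that obtained a file but ended in a jobscript error.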

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)

Jobscript

#!/bin/bash
:<<'EOF'

To use this jobscript to process 10 files from the dc4-vd-coldbox-bottom
data and put the output in the usertests namespace (MetaCat) and 
scope (Rucio), and in the usertests:output-test-01 dataset in MetaCat and
Rucio, use this command to create the workflow:

justin simple-workflow \
--mql "files from justin-tutorial:justin-tutorial-2024 limit 10" \
--jobscript dc4-vd-coldbox-bottom.jobscript --max-distance 30 --rss-mb 4000 \
--scope usertests --output-pattern '*_reco_data_*.root:output-test-01' \
--lifetime-days 1

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER,
HAS_ART_OUTPUT, CODE_TAR_DIR_LOCAL

EOF

# fcl file and DUNE software version/qualifier to be used
FCL_FILE=${FCL_FILE:-run_MyAnalysis.fcl}
DUNE_VERSION=${DUNE_VERSION:-v09_91_04d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
HAS_ART_OUTPUT=${HAS_ART_OUTPUT:-true}

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi
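# e.g. NUM_EVENTS=10 gives events_option="-n 10"; if NUM_EVENTS is unset,
# events_option stays empty and lar processes all events in the input file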

# First get an unprocessed file from this stage
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

if [ "$did_pfn_rse" = "" ] ; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

# the xroot lib for streaming non-root files is in testproducts, 
# so add it to the start of the path
export PRODUCTS=/cvmfs/dune.opensciencegrid.org/products/dune/testproducts:${PRODUCTS}
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
export INPUT_TAR_DIR_LOCAL=${CODE_TAR_DIR_LOCAL}

if [ ! -z "$CODE_TAR_DIR_LOCAL" ]; then
	echo "Using custom sources from $CODE_TAR_DIR_LOCAL"
	source ${CODE_TAR_DIR_LOCAL}/*/localProducts*/setup-grid
	mrbslp
fi
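# (mrbslp above sets up the localProducts area from the uploaded tarball, so
# its libraries and fcl files take precedence over the base dunesw release)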

# Construct outFile from input $pfn 
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
outFile=${fname}_reco_data_${now}.root
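# e.g. an input PFN ending in .../myinput.root (hypothetical name) gives
# outFile=myinput_reco_data_YYYY-MM-DDT_hhmmssZ.root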

campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"
if [ "$HAS_ART_OUTPUT" = true ];then
  OUTPUT_CMD="-o $outFile"
else
  OUTPUT_CMD="-T ${fname}_ana_${now}.root"
fi
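# lar's -o option names the art/ROOT output file, while -T names the
# TFileService file that holds analysis-only histograms and trees; with
# HAS_ART_OUTPUT=false only the *_ana_*.root TFile is named here, matching
# the *ana*.root output pattern of this stage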

(
# Do the scary preload stuff in a subshell!
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"

lar -c $FCL_FILE $events_option $OUTPUT_CMD "$pfn" > ${fname}_reco_${now}.log 2>&1
)

# Subshell exits with exit code of last command
larExit=$?
echo "lar exit code $larExit"

echo '=== Start last 100 lines of lar log file ==='
tail -100 ${fname}_reco_${now}.log
echo '=== End last 100 lines of lar log file ==='

if [ $larExit -eq 0 ] ; then
  # Success !
  echo "$pfn" > justin-processed-pfns.txt
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi

ls -lRS

# Create compressed tar file with all log files 
tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
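# (the sed replaces the '@' in $JUSTIN_JOBSUB_ID with '_' so the archive
# name contains no '@' character)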
exit $jobscriptExit