
Workflow 298, Stage 1

Priority: 50
Processors: 1
Wall seconds: 28800
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

 #  Destination                                                              Pattern            Lifetime  For next stage  RSE expression
 1  Rucio usertests:H4_v34b_3GeV_-27.7_merged_100_g4_Stage2-fnal-w298s1p1    *g4_Stage2_*.root  86400     False
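The pattern column selects output files by shell-style wildcard. A quick standalone check of what `*g4_Stage2_*.root` would select (the filename is hypothetical, and plain shell glob semantics are assumed here):

```shell
#!/bin/sh
# Sketch: matching a candidate output filename against the workflow's
# output pattern using shell glob semantics. Filename is hypothetical.
f='pdhd_g4_Stage2_2025-08-04T_010203Z.root'

case "$f" in
  *g4_Stage2_*.root) echo "match" ;;    # selected for upload
  *)                 echo "no match" ;; # ignored
esac
```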

Environment variables

Name                 Value
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/f8b182cbad0b2687070d89050509db799fe2a2f7
nskip                0
NUM_EVENTS           1
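These variables are consumed by the jobscript below. A minimal sketch of how NUM_EVENTS and nskip become lar command-line options, with the values from this table:

```shell
#!/bin/sh
# Minimal sketch: building lar's event options from the workflow's
# environment variables (values taken from the table above).
NUM_EVENTS=1
nskip=0

events_option=""
if [ -n "$NUM_EVENTS" ]; then
  events_option="-n $NUM_EVENTS --nskip $nskip"
fi

echo "$events_option"   # prints: -n 1 --nskip 0
```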

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1            0        0            0          0           1          0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Not used  Aborted  Stalled  Jobscript error  Outputting failed  None processed
2      0          0        0           0           2         0         0        0        0                0                  0
[Histogram: files processed per bin, Aug-04 01:00 to 03:00, site NL_NIKHEF]
[Pie chart: replicas per RSE, RAL_ECHO (100%)]

RSEs used

Name      Inputs  Outputs
RAL_ECHO  1       0

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)

Jobscript

#!/bin/bash

:<<'EOF'
Jobscript to run the full sim/reco chain using a fixed ROOT file (INPUT_TFILE).

This script:
- Uses the ROOT file specified in INPUT_TFILE as direct input to the first lar stage.
- Generates pdhd_gen.root, then runs runG4Stage1.fcl and runG4Stage2.fcl sequentially.
- Creates timestamped logs for each step.
- Sets up the dunesw environment from the DUNE_VERSION and DUNE_QUALIFIER variables.
EOF

# --- Configurable variables ---

FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/prod_beam_cosmics_7GeV_protodunehd.fcl}
DUNE_VERSION=${DUNE_VERSION:-v10_07_00d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}

# Export search paths for fhicl files and plugins
export FHICL_FILE_PATH=${INPUT_TAR_DIR_LOCAL}:${FHICL_FILE_PATH}
export FW_SEARCH_PATH=${INPUT_TAR_DIR_LOCAL}:${FW_SEARCH_PATH}

# Number of events
if [ "$NUM_EVENTS" != "" ]; then
  events_option="-n $NUM_EVENTS --nskip $nskip"
fi

# First get an unprocessed file from this stage
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

if [ "$did_pfn_rse" = "" ] ; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
echo "Input PFN = $pfn"

# Set up the DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"

# Prepare names for logs and output
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
inputFile=pdhd_gen.root
#outFile=pdhd_g4_Stage1_${now}.root
outFile=pdhd_g4_Stage2_${now}.root
campaign="justIN.w${JUSTIN_WORKFLOW_ID}"


# Run the lar pipeline
(
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"
lar -c "$FCL_FILE" $events_option -o pdhd_gen.root "$pfn" > ${fname}_pdhd_gen_${now}.log 2>&1 \
&& lar -c $INPUT_TAR_DIR_LOCAL/runG4Stage1.fcl -s pdhd_gen.root -o pdhd_g4_Stage1.root > ${fname}_pdhd_g4_Stage1_${now}.log 2>&1
#&& lar -c $INPUT_TAR_DIR_LOCAL/runG4Stage2.fcl -s pdhd_g4_Stage1.root -o $outFile > ${fname}_pdhd_g4_Stage2_${now}.log 2>&1
)

# Capture the pipeline exit code immediately, before echo/tail overwrite $?
larExit=$?
echo "[INFO] lar exit code: $larExit"

# Show the last lines of the final log for debugging
# (Stage 2 is commented out above, so the Stage 1 log is currently the final one)
echo '=== last 100 lines of the final log ==='
tail -100 ${fname}_pdhd_g4_Stage1_${now}.log
#tail -100 ${fname}_pdhd_g4_Stage2_${now}.log
echo '======================================='

if [ $larExit -eq 0 ] ; then
  # Success !
  echo "$pfn" > justin-processed-pfns.txt
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi

# Create compressed tar file with all log files 
#tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
exit $jobscriptExit
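Several text-manipulation steps in the jobscript above can be exercised standalone. The "DID PFN RSE" triple format is inferred from the variable name and the two `cut` calls; all sample values below are hypothetical:

```shell
#!/bin/sh
# Standalone exercise of three text-manipulation steps from the jobscript
# above. All sample values are hypothetical.

# 1) Parse the space-separated "DID PFN RSE" triple from justin-get-file.
did_pfn_rse='usertests:sample.root root://eospublic.cern.ch//eos/dune/sample.root RAL_ECHO'
did=$(echo "$did_pfn_rse" | cut -f1 -d' ')
pfn=$(echo "$did_pfn_rse" | cut -f2 -d' ')
echo "DID=$did"      # prints: DID=usertests:sample.root

# 2) Derive the log-file stem from the PFN: strip directories, then extension.
Ffname=$(echo "$pfn" | awk -F/ '{print $NF}')
fname=$(echo "$Ffname" | awk -F. '{print $1}')
echo "stem=$fname"   # prints: stem=sample

# 3) Sanitize JUSTIN_JOBSUB_ID for use as a tar filename ('@' replaced by '_'),
#    as the commented-out tar line does.
JUSTIN_JOBSUB_ID='12345.0@justin.example'
logname=$(echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g')
echo "$logname"      # prints: 12345.0_justin.example.logs.tgz
```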
justIN time: 2025-08-04 12:14:15 UTC       justIN version: 01.04.00