Workflow 2857, Stage 1
Priority | 50 |
Processors | 1 |
Wall seconds | 7200 |
Image | /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest |
RSS bytes | 2097152000 (2000 MiB) |
Max distance for inputs | 100.0 |
Enabled input RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
Enabled output RSEs | CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC |
Enabled sites | BR_CBPF, CA_SFU, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Glasgow, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin |
Scope | usertests |
Events for this stage |
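(For reference, the RSS value above is simply the MiB figure converted to bytes; a minimal shell check, nothing justIN-specific:)
echo $(( 2000 * 1024 * 1024 ))   # prints 2097152000, matching the RSS bytes row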
Output patterns
| | Destination | Pattern | Lifetime | For next stage | RSE expression |
|---|---|---|---|---|---|
| 1 | https://fndcadoor.fnal.gov:2880/dune/scratch/users/fandrian/fnal/02857/1 | scint_*.root | | | |
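(For orientation: the pattern column is a shell-style glob, and matching files are copied to the destination URL. A minimal local sketch of what would be picked up, assuming justIN matches the pattern against files left in the job's working directory; the dest variable and loop are illustrative only, not justIN code:)
dest="https://fndcadoor.fnal.gov:2880/dune/scratch/users/fandrian/fnal/02857/1"
for f in scint_*.root ; do
  [ -e "$f" ] && echo "$f -> $dest/$f"
done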
Environment variables
| Name | Value |
|---|---|
| INPUT_TAR_DIR | /cvmfs/fifeuser1.opensciencegrid.org/sw/dune/7c4727f72cabe959f50f85edc579a11da53e2e68 |
| NUM_EVENTS | 100 |
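(These variables are set with --env options when the workflow is created and simply appear in the job's environment; a minimal illustration of how the jobscript below consumes them, nothing more:)
echo "fcl from tarball: $INPUT_TAR_DIR/to_grid_0/light_generator_pdvd_0.fcl"
echo "events per job:   ${NUM_EVENTS:-as defined by the fcl}"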
File states
| Total files | Finding | Unallocated | Allocated | Outputting | Processed | Not found | Failed |
|---|---|---|---|---|---|---|---|
| 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
Job states
| Total | Submitted | Started | Processing | Outputting | Finished | Notused | Aborted | Stalled | Jobscript error | Outputting failed | None processed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 33 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 15 | 0 | 0 | 17 |
Stats of processed input files and of uploaded output files are available for download as CSV or JSON (up to 10000 files included).
Jobscript
#!/bin/bash
:<<'EOF'
To use this jobscript to process files from the dataset semi_ana_v10_09_00d00_pdvd,
use this command to create the workflow:
justin simple-workflow \
--mql \
"files from dune:all where core.file_type=detector and core.run_type=hd-protodune and core.data_tier=raw limit 10" \
--jobscript submit_local_fcl.jobscript --rss-mb 4000 --max-distance 30 --scope usertests \
--output-pattern "scint_*.root:$FNALURL/$USERF" \
--env INPUT_TAR_DIR_LOCAL="$INPUT_TAR_DIR_LOCAL" --env NUM_EVENTS=1
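(Note: the values in this example are from the template and differ from this stage's
actual settings shown above: RSS 2000 MiB rather than 4000, max distance 100.0
rather than 30, and NUM_EVENTS=100 rather than 1.)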
The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER
EOF
# fcl file and DUNE software version/qualifier to be used
FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR/to_grid_0/light_generator_pdvd_0.fcl}
#INPUT_STEERING_FILE=${INPUT_STEERING_FILE:-$INPUT_TAR_DIR_LOCAL/to_grid_0/myLightSourceSteering_0.txt}
DUNE_VERSION=${DUNE_VERSION:-v10_09_00d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
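# (Optional sanity check, assuming a standard UPS environment: "ups list -aK+ dunesw"
#  lists the dunesw versions/qualifiers available after sourcing setup_dune.sh.)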
# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
events_option="-n $NUM_EVENTS"
fi
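# With NUM_EVENTS=100 (this stage's setting above), events_option expands to "-n 100";
# if NUM_EVENTS is unset, the option is empty and lar processes events as the fcl/input dictates.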
# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
# Timestamp for logging; the outFile/$pfn-based naming below is left commented out
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
# Ffname=`echo $pfn | awk -F/ '{print $NF}'`
# fname=`echo $Ffname | awk -F. '{print $1}'`
#outFile=${now}_reco_${now}.root
campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"
jobdir=$(pwd)
(
  # Do the scary preload stuff in a subshell!
  export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
  echo "$LD_PRELOAD"
  cp "$INPUT_TAR_DIR/to_grid_0/light_generator_pdvd_0.fcl" /tmp/
  cp "$INPUT_TAR_DIR/to_grid_0/myLightSourceSteering_0.txt" /tmp/
  ls /tmp
  chmod a+r /tmp/myLightSourceSteering_0.txt
  chmod a+r /tmp/light_generator_pdvd_0.fcl
  cd /tmp/
  #lar -c /tmp/light_generator_pdvd_0.fcl $events_option -o $outFile > ${now}_reco_${now}.log 2>&1
  lar -c /tmp/light_generator_pdvd_0.fcl $events_option > scint_0.log 2>&1
  larExit=$?
  # Copy the log and any lar outputs back to the job directory so that the log
  # tail and tarball below, and the scint_*.root output pattern, can find them
  cp scint_0.log "$jobdir"/
  cp scint_*.root "$jobdir"/ 2>/dev/null
  rm /tmp/myLightSourceSteering_0.txt
  rm /tmp/light_generator_pdvd_0.fcl
  # Exit the subshell with lar's exit code rather than the last rm's
  exit $larExit
)
# The subshell exits with lar's exit code; capture it before running anything else
larExit=$?
echo "lar exit code $larExit"
echo '=== Start last 75 lines of lar log file ==='
tail -75 scint_0.log
echo '=== End last 75 lines of lar log file ==='
echo $now
if [ $larExit -eq 0 ] ; then
  # Success !
  echo "This job is done with output files scint_*.root created at $now"
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi
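# (A non-zero exit from this script should surface as the "Jobscript error" state in
#  the job-states table above; the zero/non-zero choice here follows lar's exit code.)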
# Create compressed tar file with all log files
tar zcf "$(echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g')" *.log
exit $jobscriptExit