Merged
21 changes: 9 additions & 12 deletions README.md
@@ -1,6 +1,6 @@

DUNE Computing Training 2024 Update
========================================
DUNE Computing Tutorial Basics for DUNE - Revised 2025 edition
==============================================================

This repository holds the source code of the webpage that is rendered [here]({{ site.baseurl }}/index.html).

@@ -9,32 +9,29 @@ This training module is part of an initiative of the [DUNE Computing Consortium]

When:

2024 revisions to online version
2025 revisions to online version

Live versions are delivered 1-2 times per year, but this document can also be worked through on your own.

Learn the basics of DUNE computing: storage spaces, data management, LArSoft, grid job submission
Learn the basics of DUNE computing: storage spaces and data management.

### Live sessions

New! Lectures will be recorded.
There will be hands-on, facilitated with a Q&A on a live doc.
New! There will be quizzes and special “expert in the room” sessions to answer questions from beginners and not-so-beginners about their code.
Lectures will be recorded and embedded in associated training episodes.

How to do these tutorials:
There will be hands-on sessions, facilitated with a Q&A on a live doc.

Participants must have a valid FNAL or CERN account.
Participants must have a valid FNAL or CERN account to work through the examples provided.


Participants must have a valid FNAL or CERN account. The Indico site is [https://indico.fnal.gov/event/59762/][indico-event].
An Indico site for this event will be announced.

Apply immediately if you do not yet have accounts at either lab (info).

Questions?

Contact the organizers at: dune-computing-training@fnal.gov

New slack channel: #computing_training_basics
Slack channel: #computing_training_basics

## Contributing

12 changes: 8 additions & 4 deletions _episodes/03-data-management.md
@@ -109,11 +109,15 @@ and when searching for specific types of data
- `core.data_stream` (physics, calibration, cosmics)
- `core.runs[any]=<runnumber>`

You probably also want to know about
For processed data you also need to know about

- `core.application.version` (version of code run)
- `dune.config_file` (configuration file for the reconstruction)
- `dune.config_file` (configuration file for the reconstruction/simulation)
- `dune_mc.gen_fcl_filename` (configuration for the initial simulation physics)
- `dune.output_status` (This should be 'confirmed' for processed files. If it is not, the file likely never got stored.)
- `core.data_tier` (what kind of output is it?)
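To see which of these fields a given file actually carries, you can dump its metadata directly. The sketch below assumes the `metacat file show` subcommand with a metadata flag `-m`, and the DID (namespace:name) is purely hypothetical; substitute one returned by a real query:

~~~
# Print the metadata dictionary of a single file.
# The DID below is hypothetical; use one returned by a metacat query.
metacat file show -m hd-protodune:np04hd_raw_run027296_0000_df0.hdf5
~~~
{: .language-bash}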



### Example of doing a metacat search

@@ -227,7 +231,7 @@ and v09_91_02d01).
> **If you are doing real analysis, please use the [official datasets](#Official_Datasets) which experts have defined**
>
> If no official dataset exists, you need to require additional fields like:
> `core.application.version=v09_91_02d01` and `dune.config_file=standard_reco_stage2_calibration_protodunehd_keepup.fcl` to make certain you are not looking at 2 versions of the same file.
> `dune.output_status=confirmed` and `core.application.version=v09_91_02d01` and `dune.config_file=standard_reco_stage2_calibration_protodunehd_keepup.fcl` to make certain that the job which created the file actually wrote its output back to storage, and that you are not looking at two versions of the same file.
{: .callout}
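In practice, a fully pinned-down query looks something like the sketch below. This is illustrative only: combine it with the dataset or run constraints from the examples above, and check the exact MQL quoting and `limit` syntax against the metacat documentation.

~~~
# Require all three pinning fields at once; limit keeps the output small.
metacat query "files where dune.output_status='confirmed' \
    and core.application.version='v09_91_02d01' \
    and dune.config_file='standard_reco_stage2_calibration_protodunehd_keepup.fcl' \
    limit 10"
~~~
{: .language-bash}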


@@ -345,7 +349,7 @@ You can also do keyword/value queries like the ones above using the Other tab on

### Find out how much data there is in a dataset

Do a query using the `-s` or `--summary` option
Do a query of a dataset using the `-s` or `--summary` option

~~~
metacat query -s "files from fardet-vd:fardet-vd__full-reconstructed__v09_81_00d02__reco2_dunevd10kt_anu_1x8x6_3view_30deg_geov3__prodgenie_anu_numu2nue_nue2nutau_dunevd10kt_1x8x6_3view_30deg__out1__v2_official"
2 changes: 1 addition & 1 deletion _extras/pnfs2xrootd.md
@@ -21,4 +21,4 @@ done

echo
~~~
{: ..language-bash}
{: .language-bash}
21 changes: 14 additions & 7 deletions _includes/al9_setup_2025a.md
@@ -1,17 +1,24 @@
~~~
# find a spack environment and set it up
# setup spack (pre spack 1.0 version)
# setup spack

# this is for spack v1.1
echo "setup-prototype.sh"
. /cvmfs/dune.opensciencegrid.org/spack/setup-env.sh
spack env activate dune-prototype
echo "Activated dune-prototype"

source /cvmfs/dune.opensciencegrid.org/spack/setup-env.sh
echo "Activate dune-workflow"
spack env activate dune-workflow
echo "load GCC and CMAKE so don't use system"
echo "GCC"
spack load gcc@12.5.0 arch=linux-almalinux9-x86_64_v2
echo "CMAKE"
spack load cmake

echo "PY-PIP"
spack load py-pip@23.1.2%gcc@11.4.1 arch=linux-almalinux9-x86_64_v3

echo "load GCC and CMAKE so don't use system"
echo "GCC"
spack load gcc@12.5.0 arch=linux-almalinux9-x86_64_v2
echo "CMAKE"
spack load cmake
~~~
{: .language-bash}
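After activating the environment and loading the packages, it is worth a quick sanity check that the shell now resolves `gcc` and `cmake` from the Spack tree rather than from the system. This check is generic shell; nothing DUNE-specific is assumed:

~~~
# Report where gcc and cmake resolve from; after "spack load" these
# paths should point into the Spack install tree, not /usr/bin.
for tool in gcc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    printf '%s -> %s\n' "$tool" "$(command -v "$tool")"
  else
    printf '%s not found in PATH\n' "$tool"
  fi
done
~~~
{: .language-bash}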

1 change: 0 additions & 1 deletion setup.md
Expand Up @@ -710,7 +710,6 @@ You should then be able to proceed with much of the tutorial thanks to the wonde
Set up the DUNE software

~~~
export UPS_OVERRIDE="-H Linux64bit+3.10-2.17" # makes certain you get the right UPS
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
~~~
{: .language-bash}