Commit c53b624

Merge pull request #5 from omlins/initial

Extend readme

2 parents cafe26b + 7871d34

File tree

7 files changed: +309 -108 lines

README.md

Lines changed: 198 additions & 5 deletions
# JUHPC: a community software development project for everyone - including end users
JUHPC is an attempt to convert the numerous efforts at different HPC sites to define a suitable Julia HPC setup into a community software development project... *which suits everyone*. The objective is to gather the experience of the Julia HPC community and transform it into a portable, automated Julia HPC setup, enhanced and maintained jointly by the Julia HPC community. JUHPC can be used stand-alone (by end users) or as part of a recipe for automated software stack generation (by HPC sites), e.g., for the generation of modules or [uenvs](https://eth-cscs.github.io/uenv/) (used on the ALPS supercomputer at the Swiss National Supercomputing Centre, see [here](https://confluence.cscs.ch/display/KB/UENV+user+environments)).

An important lesson learned by the Julia HPC community about providing Julia at HPC sites is not to preinstall any packages site-wide. JUHPC pushes this insight one step further and does not preinstall Julia either. Instead, juliaup is leveraged: the installation of juliaup, Julia and packages is preconfigured so that it happens automatically when the end user first calls `juliaup`.

Concretely, JUHPC creates an HPC setup for juliaup, Julia and some HPC key packages (MPI.jl, CUDA.jl, HDF5.jl, ADIOS2.jl, ...), including
- preferences for HPC key packages that require system libraries;
- a wrapper for juliaup that installs juliaup (and the latest Julia) automatically in a predefined location (e.g., scratch) when the end user calls `juliaup` for the first time;
- an activation script that sets environment variables for juliaup, Julia and the HPC key packages;
- optional execution of a site-specific post-installation Julia script, using the project where the preferences were set (e.g., to modify preferences or to create a uenv view equivalent to the activation script).

HPC sites can install the HPC setup into a folder accessible to all users (which can, e.g., also be part of a uenv). HPC end users can install the HPC setup into any folder to their liking that is accessible from the compute nodes; it is then enough to source the `activate` script in this folder in order to activate the HPC setup.
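
For an end user, this boils down to a few shell commands, sketched here with a purely illustrative installation path:

```bash
# Hypothetical path: use the path communicated by your site, or the one you
# passed to juhpc as JUHPC_SETUP_INSTALLDIR.
JUHPC_SETUP_INSTALLDIR="$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup"

# Source the activation script to make juliaup, Julia and the HPC key
# packages' environment variables available in the current shell.
. "$JUHPC_SETUP_INSTALLDIR/activate"

# The first call to the juliaup wrapper installs juliaup and the latest Julia.
juliaup
```
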
# It's as simple as that
# 1. Export environment variables for the installation of some HPC key packages

## CUDA
- `JUHPC_CUDA_HOME`: Activates the HPC setup for CUDA and is used for CUDA.jl runtime discovery (set as CUDA_HOME in the activate script).
- `JUHPC_CUDA_RUNTIME_VERSION`: Used to set CUDA.jl preferences (fixes the runtime version, enabling precompilation on login nodes).

## AMDGPU
- `JUHPC_ROCM_HOME`: Activates the HPC setup for AMDGPU and is used for AMDGPU.jl runtime discovery (set as ROCM_PATH in the activate script).

## MPI
- `JUHPC_MPI_HOME`: Activates the HPC setup for MPI and is used to set MPI.jl preferences. Incompatible with `JUHPC_MPI_VENDOR`.
- `JUHPC_MPI_VENDOR`: Activates the HPC setup for MPI and is used to set MPI.jl preferences (currently, only "cray" is valid, see [here](https://juliaparallel.org/MPI.jl/stable/configuration/#Notes-about-vendor-provided-MPI-backends)). Incompatible with `JUHPC_MPI_HOME`.
- `JUHPC_MPI_EXEC`: Used to set MPI.jl preferences (exec command definition). Arguments are space-separated, e.g., "srun -C gpu".

## HDF5
- `JUHPC_HDF5_HOME`: Activates the HPC setup for HDF5 and is used to set HDF5.jl preferences.

## ADIOS2
- `JUHPC_ADIOS2_HOME`: Activates the HPC setup for ADIOS2 and is used to set ADIOS2.jl preferences.
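
Putting it together, a minimal configuration could look like the following sketch; every value here is an illustrative assumption (complete site examples are given further below):

```bash
# Illustrative values only: obtain the real ones from your site's modules or
# package manager. Leaving a variable unset simply omits the corresponding package.
export JUHPC_CUDA_HOME="$CUDA_HOME"            # e.g., from a cudatoolkit module
export JUHPC_CUDA_RUNTIME_VERSION="12.2"       # hypothetical runtime version
export JUHPC_MPI_VENDOR="cray"                 # either this...
#export JUHPC_MPI_HOME="/opt/mpich"            # ...or this (hypothetical path), never both
export JUHPC_MPI_EXEC="srun -C gpu"
export JUHPC_HDF5_HOME="$HDF5_DIR"             # e.g., from an HDF5 module
```
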
# 2. Call JUHPC
The `juhpc` bash script is called as follows:
```bash
juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR [$JUHPC_POST_INSTALL]
```
That is, it takes the following arguments:
- `JUHPC_SETUP_INSTALLDIR`: the folder to install the HPC setup into, e.g., `"$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup"`.
- `JULIAUP_INSTALLDIR`: the folder into which juliaup and Julia will automatically be installed the first time the end user calls `juliaup`. *User environment variables must be escaped* so that they are expanded not during the HPC setup installation but later, when the end user uses the setup, e.g., `"\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup"` (see the sketch after the notes below).
- `JUHPC_POST_INSTALL` (optional): a site-specific post-installation Julia script, executed using the project where the preferences were set (e.g., to modify preferences or to create a uenv view equivalent to the activation script).

> [!NOTE]
> The above examples assume that `$SCRATCH/../julia` is a wipe-out-protected folder on scratch.

> [!NOTE]
> Separate installation per HOSTNAME is required if different hosts with different architectures share the file system used for installation (e.g., daint and eiger on ALPS).
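
The effect of the escaping in `JULIAUP_INSTALLDIR` can be checked directly in the shell; the path below is again only an example:

```bash
# With escaping, JUHPC receives the literal string and writes it into the
# setup; the variables are expanded only later, in the end user's shell:
JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup"
echo "$JULIAUP_INSTALLDIR"    # prints: $SCRATCH/../julia/$USER/${HOSTNAME%%-*}/juliaup

# Without escaping, the variables would be expanded immediately, baking the
# installing user's own paths into the setup:
echo "$SCRATCH/../julia/$USER/${HOSTNAME%%-*}/juliaup"
```
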
# Examples of HPC setup installations on the ALPS supercomputer (CSCS)
Examples of HPC setup installations are found in the folder `configs`; two of them are featured in the following.

## Example 1: using Cray Programming Environment
```bash
# Load required modules (including correct CPU and GPU target modules)
module load cray
module switch PrgEnv-cray PrgEnv-gnu
module load cudatoolkit craype-accel-nvidia90
module load cray-hdf5-parallel
module list

# Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, HDF5.jl and ADIOS2.jl)
export JUHPC_CUDA_HOME=$CUDA_HOME
export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION
export JUHPC_MPI_VENDOR="cray"
export JUHPC_MPI_EXEC="srun -C gpu"
export JUHPC_HDF5_HOME=$HDF5_DIR

# Call JUHPC
git clone https://github.com/JuliaParallel/JUHPC
JUHPC=./JUHPC/src/juhpc
JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup
JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup"
bash -l $JUHPC $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR
```

## Example 2: using UENV
```bash
# UENV specific environment variables
export ENV_MOUNT={{ env.mount }} # export ENV_MOUNT=/user-environment
export ENV_META=$ENV_MOUNT/meta
export ENV_EXTRA=$ENV_META/extra
export ENV_JSON=$ENV_META/env.json

# Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, HDF5.jl and ADIOS2.jl)
export JUHPC_CUDA_HOME=$(spack -C $ENV_MOUNT/config location -i cuda)
export JUHPC_CUDA_RUNTIME_VERSION=$(spack --color=never -C $ENV_MOUNT/config find cuda | \
    perl -ne 'print $1 if /cuda@([\d.]+)/')
export JUHPC_MPI_HOME=$(spack -C $ENV_MOUNT/config location -i cray-mpich)
export JUHPC_MPI_EXEC="srun -C gpu"
export JUHPC_HDF5_HOME=$(spack -C $ENV_MOUNT/config location -i hdf5)
export JUHPC_ADIOS2_HOME=$(spack -C $ENV_MOUNT/config location -i adios2)

# Call JUHPC
JUHPC_DIR=$ENV_EXTRA/JUHPC
git clone https://github.com/JuliaParallel/JUHPC $JUHPC_DIR
JUHPC=$JUHPC_DIR/src/juhpc
JUHPC_SETUP_INSTALLDIR=$ENV_MOUNT/juhpc_setup
JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup"
JUHPC_POST_INSTALL=$ENV_EXTRA/uenv_view.jl
bash -l $JUHPC $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR $JUHPC_POST_INSTALL
```

# Test of HPC setup installations
## Test of example 1
```bash
#!/bin/bash

# Variable set in craype_config
JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup

# Load required modules (including correct CPU and GPU target modules)
module load cray
module switch PrgEnv-cray PrgEnv-gnu
module load cudatoolkit craype-accel-nvidia90
module load cray-hdf5-parallel
module list

# Activate the HPC setup environment variables
. $JUHPC_SETUP_INSTALLDIR/activate

# Call juliaup to install juliaup and the latest Julia on scratch
juliaup

# Call juliaup again to see its options
juliaup

# Call julia Pkg
julia -e 'using Pkg; Pkg.status()'

# Add CUDA.jl
julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()'

# Add MPI.jl
julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'

# Add HDF5.jl
julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'

# Test CUDA-aware MPI
cd ~/cudaaware
MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl
```
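
The script `cudaaware.jl` used above is not included in this page; a minimal stand-in for such a test could look as follows, assuming CUDA.jl and MPI.jl are installed as above, a CUDA-aware system MPI, and site-specific `srun` flags:

```bash
# Hypothetical minimal CUDA-aware MPI check: rank 0 sends a GPU array directly
# to rank 1, which only works if MPI can access the CuArray's device memory.
MPICH_GPU_SUPPORT_ENABLED=1 srun -C gpu -N2 -n2 julia -e '
    using MPI, CUDA
    MPI.Init()
    comm = MPI.COMM_WORLD
    if MPI.Comm_rank(comm) == 0
        MPI.Send(CUDA.ones(Float64, 1024), comm; dest=1)
    else
        buf = CUDA.zeros(Float64, 1024)
        MPI.Recv!(buf, comm; source=0)
        println("CUDA-aware receive OK: ", all(Array(buf) .== 1.0))
    end
'
```
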
## Test of example 2
```bash
#!/bin/bash

# Start the uenv with the view equivalent to the activation script
uenv start --view=julia julia

# Call juliaup to install juliaup and the latest Julia on scratch
juliaup

# Call juliaup again to see its options
juliaup

# Call julia Pkg
julia -e 'using Pkg; Pkg.status()'

# Add CUDA.jl
julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()'

# Add MPI.jl
julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'

# Add HDF5.jl
julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'

# Test CUDA-aware MPI
cd ~/cudaaware
MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl
```
# Contributors
The initial version of JUHPC was contributed by Samuel Omlin, Swiss National Supercomputing Centre, ETH Zurich (@omlins). The following people have provided valuable contributions over the years to the effort of defining a suitable Julia HPC setup (this list is based on the one found [here](https://github.com/hlrs-tasc/julia-on-hpc-systems); please add missing people or let us know):

- Carsten Bauer (@carstenbauer)
- Alexander Bills (@abillscmu)
- Johannes Blaschke (@jblaschke)
- Valentin Churavy (@vchuravy)
- Steffen Fürst (@s-fuerst)
- Mosè Giordano (@giordano)
- C. Brenhin Keller (@brenhinkeller)
- Mirek Kratochvíl (@exaexa)
- Pedro Ojeda (@pojeda)
- Samuel Omlin (@omlins)
- Ludovic Räss (@luraess)
- Erik Schnetter (@eschnett)
- Michael Schlottke-Lakemper (@sloede)
- Dinindu Senanayake (@DininduSenanayake)
- Kjartan Thor Wikfeldt (@wikfeldt)

configs/cscs/alps/gh200/craype_config

Lines changed: 7 additions & 38 deletions
```bash
# Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, HDF5.jl and ADIOS2.jl)
export JUHPC_CUDA_HOME=$CUDA_HOME
export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION
export JUHPC_MPI_VENDOR="cray"
export JUHPC_MPI_EXEC="srun -C gpu"
export JUHPC_HDF5_HOME=$HDF5_DIR


# Call JUHPC
git clone https://github.com/omlins/JUHPC
JUHPC=./JUHPC/src/juhpc
JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup
JULIAUP_INSTALLDIR="\$SCRATCH/\${HOSTNAME%%-*}/juliaup"
bash -l $JUHPC $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR
```
Lines changed: 36 additions & 0 deletions
```bash
#!/bin/bash

# Variable set in craype_config
JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup # HPC setup installation environment variables must be expanded during installation.

# Load required modules (including correct CPU and GPU target modules)
module load cray
module switch PrgEnv-cray PrgEnv-gnu
module load cudatoolkit craype-accel-nvidia90
module load cray-hdf5-parallel
module list

# Activate the HPC setup environment variables
. $JUHPC_SETUP_INSTALLDIR/activate

# Call juliaup to install juliaup and the latest Julia on scratch
juliaup

# Call juliaup again to see its options
juliaup

# Call julia Pkg
julia -e 'using Pkg; Pkg.status()'

# Add CUDA.jl
julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()'

# Add MPI.jl
julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'

# Add HDF5.jl
julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'

# Test CUDA-aware MPI
cd ~/cudaaware
MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl
```

configs/cscs/alps/mc/craype_config

Lines changed: 28 additions & 0 deletions
```bash
#!/bin/bash

# Author: Samuel Omlin, CSCS (omlins)
#
# Description: Definition of site-specific variables and call of JUHPC.
# Site:        ALPS:eiger, Swiss National Supercomputing Centre (CSCS)
# Base:        craype


# Load required modules (including correct CPU and GPU target modules)
module load cray
module switch PrgEnv-cray PrgEnv-gnu
module load cray-hdf5-parallel
module list


# Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, HDF5.jl and ADIOS2.jl)
export JUHPC_MPI_VENDOR="cray"
export JUHPC_MPI_EXEC="srun -C mc"
export JUHPC_HDF5_HOME=$HDF5_DIR


# Call JUHPC
git clone https://github.com/omlins/JUHPC
JUHPC=./JUHPC/src/juhpc
JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup
JULIAUP_INSTALLDIR="\$SCRATCH/\${HOSTNAME%%-*}/juliaup"
bash -l $JUHPC $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR
```
Lines changed: 28 additions & 0 deletions
```bash
#!/bin/bash

# Variable set in craype_config
JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup # HPC setup installation environment variables must be expanded during installation.

# Load required modules (including correct CPU and GPU target modules)
module load cray
module switch PrgEnv-cray PrgEnv-gnu
module load cray-hdf5-parallel
module list

# Activate the HPC setup environment variables
. $JUHPC_SETUP_INSTALLDIR/activate

# Call juliaup to install juliaup and the latest Julia on scratch
juliaup

# Call juliaup again to see its options
juliaup

# Call julia Pkg
julia -e 'using Pkg; Pkg.status()'

# Add MPI.jl
julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'

# Add HDF5.jl
julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'
```
