Update openmpi on Nautilus to 4.1.6 #1014

2 changes: 1 addition & 1 deletion .github/workflows/ubuntu-rnd-x86_64.yaml
@@ -188,7 +188,7 @@ jobs:
ls -l /home/ubuntu/spack-stack/CI/unified-env/${TODAY}/modulefiles/Core

module use /home/ubuntu/spack-stack/CI/unified-env/${TODAY}/modulefiles/Core
- module load stack-intel/2022.1.0
+ module load stack-intel/2021.6.0
module load stack-intel-oneapi-mpi/2021.6.0
module load stack-python/3.10.13
module available
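For anyone reproducing this CI step locally, a quick way to confirm which compiler and MPI metamodules a freshly built unified-env actually provides before pinning them in the workflow (a hedged sketch; the path is taken from the diff above, and module spider assumes an Lmod-based module system):

module use /home/ubuntu/spack-stack/CI/unified-env/${TODAY}/modulefiles/Core
module avail stack-intel stack-intel-oneapi-mpi
module spider stack-intel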
10 changes: 6 additions & 4 deletions .gitmodules
@@ -1,9 +1,11 @@
[submodule "spack"]
path = spack
- #url = https://github.com/spack/spack
- #branch = develop
- url = https://github.com/jcsda/spack
- branch = jcsda_emc_spack_stack
+ ##url = https://github.com/spack/spack
+ ##branch = develop
+ #url = https://github.com/jcsda/spack
+ #branch = jcsda_emc_spack_stack
+ url = https://github.com/rhoneyager-tomorrow/spack-1
+ branch = bugfix/hdf5_fpe
[submodule "doc/CMakeModules"]
path = doc/CMakeModules
url = https://github.com/noaa-emc/cmakemodules
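Not part of the diff, but worth noting for anyone checking out this branch: when a submodule's url or branch changes in .gitmodules, an existing clone has to resync before the new spack fork is picked up (standard git commands, shown only as a convenience):

git submodule sync --recursive
git submodule update --init --recursive
git -C spack remote -v    # should now point at rhoneyager-tomorrow/spack-1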
9 changes: 4 additions & 5 deletions configs/sites/nautilus/packages.yaml
@@ -4,8 +4,7 @@ packages:
providers:
# For now need to enable one or the other;
# see https://github.com/JCSDA/spack-stack/issues/659
- mpi:: [[email protected]]
- #mpi:: [[email protected]]
+ mpi:: [[email protected]]
blas:: [intel-oneapi-mkl]
fftw-api:: [intel-oneapi-mkl]
lapack:: [intel-oneapi-mkl]
@@ -25,10 +24,10 @@
# prefix: /p/app/compilers/intel/oneapi
openmpi:
externals:
- - spec: [email protected]%[email protected]~cuda~cxx~cxx_exceptions~java~memchecker+pmi~static~wrapper-rpath fabrics=ucx schedulers=slurm
-   prefix: /p/app/penguin/openmpi/4.1.5rc2/intel
+ - spec: [email protected]%[email protected]~cuda~cxx~cxx_exceptions~java~memchecker+pmi~static~wrapper-rpath fabrics=ucx schedulers=slurm
+   prefix: /p/app/penguin/openmpi/4.1.6/intel-classic-2022.0.2
  modules:
- - penguin/openmpi/4.1.5rc2/intel
+ - penguin/openmpi/4.1.6/intel-classic-2022.0.2
- slurm
- spec: [email protected]%[email protected]~cuda~cxx~cxx_exceptions~java~memchecker+pmi~static~wrapper-rpath fabrics=ucx schedulers=slurm
prefix: /p/app/penguin/openmpi/4.1.4/aoc
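A sanity check that can be useful after editing a site packages.yaml (a sketch, not part of this PR; the environment path is illustrative): confirm that the concretizer picks up the new external instead of trying to build OpenMPI from source.

spack env activate envs/unified-env    # illustrative environment directory
spack config get packages | grep -A 6 'openmpi:'
spack spec -I [email protected]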
6 changes: 3 additions & 3 deletions doc/source/PreConfiguredSites.rst
@@ -312,7 +312,7 @@ With Intel, the following is required for building new spack environments and fo

module load slurm
module load intel/compiler/2022.0.2
- module load penguin/openmpi/4.1.5rc2/intel
+ module load penguin/openmpi/4.1.6/intel-classic-2022.0.2

module use /p/app/projects/NEPTUNE/spack-stack/modulefiles
module load ecflow/5.8.4
@@ -321,9 +321,9 @@ For ``spack-stack-1.6.0`` with Intel, proceed with loading the following modules

.. code-block:: console

- module use /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core
+ module use /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.6.0/envs/ue-openmpi416/install/modulefiles/Core
module load stack-intel/2021.5.0
- module load stack-openmpi/4.1.5rc2
+ module load stack-openmpi/4.1.6
module load stack-python/3.10.13

With AMD clang/flang (aocc), the following is required for building new spack environments and for using spack to build and run software.
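Once the modules from the updated Intel section of the documentation are loaded, a quick runtime check (plain OpenMPI commands, not specific to this PR) should report the new version:

module load penguin/openmpi/4.1.6/intel-classic-2022.0.2
mpirun --version     # expect: mpirun (Open MPI) 4.1.6
mpicc --showme       # should show the wrapper pointing at the intel-classic 2022.0.2 install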