Failed installation test #3512
Unanswered
Nickfu0911 asked this question in Firedrake support
-
The errors you see when executing Firedrake code in parallel on a Mac are related to PETSc issue #1569. Serial Firedrake simulations work, but parallel execution is currently problematic on this platform. If you need to run codes in parallel, a workaround is to use a Firedrake Docker image; otherwise, your installation should handle serial codes perfectly.
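For anyone who needs the workaround, here is a minimal sketch of the Docker route, assuming the firedrakeproject/firedrake image from Docker Hub (see the Firedrake install docs for the authoritative instructions):

docker pull firedrakeproject/firedrake
docker run -it firedrakeproject/firedrake
# Inside the container you may need to activate the bundled virtualenv first,
# e.g. source firedrake/bin/activate -- the exact path is an assumption.
# Parallel runs should then work, e.g. (myscript.py is a placeholder):
mpiexec -n 2 python myscript.py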
-
Hello,
I successfully installed Firedrake and tried to run the installation tests, but they failed.
The terminal shows:
======================================================================================== test session starts =========================================================================================
platform darwin -- Python 3.11.9, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/nf419/firedrake/src/firedrake
configfile: setup.cfg
plugins: anyio-4.3.0, mpi-0.1, xdist-3.5.0, nbval-0.11.0
collected 3517 items / 3490 deselected / 1 skipped / 27 selected
tests/regression/test_dg_advection.py .F.F [ 14%]
tests/regression/test_poisson_strong_bcs.py ................F [ 77%]
tests/regression/test_poisson_strong_bcs_nitsche.py .... [ 92%]
tests/regression/test_stokes_mini.py .. [100%]
============================================================================================== FAILURES ==============================================================================================
___________________________________________________________________________ test_dg_advection_icosahedral_sphere_parallel ____________________________________________________________________________
args = (), kwargs = {}
../pytest-mpi/pytest_mpi.py:192:
input = None, capture_output = False, timeout = None, check = True, popenargs = (['mpiexec', '-n', '1', '-genv', '_PYTEST_MPI_CHILD_PROCESS', '1', ...],), kwargs = {}
process = <Popen: returncode: 59 args: ['mpiexec', '-n', '1', '-genv', '_PYTEST_MPI_CH...>, stdout = None, stderr = None, retcode = 59
E subprocess.CalledProcessError: Command '['mpiexec', '-n', '1', '-genv', '_PYTEST_MPI_CHILD_PROCESS', '1', 'python', '-m', 'pytest', '--runxfail', '-s', '-q', '/Users/nf419/firedrake/src/firedrake/tests/regression/test_dg_advection.py::test_dg_advection_icosahedral_sphere_parallel', ':', '-n', '2', 'python', '-m', 'pytest', '--runxfail', '-s', '-q', '/Users/nf419/firedrake/src/firedrake/tests/regression/test_dg_advection.py::test_dg_advection_icosahedral_sphere_parallel', '--tb=no', '--no-summary', '--no-header', '--disable-warnings', '--show-capture=no']' returned non-zero exit status 59.
/opt/homebrew/Cellar/[email protected]/3.11.9/Frameworks/Python.framework/Versions/3.11/lib/python3.11/subprocess.py:571: CalledProcessError
---------------------------------------------------------------------------------------- Captured stderr call ----------------------------------------------------------------------------------------
firedrake:WARNING OMP_NUM_THREADS is not set or is set to a value greater than 1, we suggest setting OMP_NUM_THREADS=1 to improve performance
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
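A note on the firedrake:WARNING line in the captured stderr: it is likely unrelated to the segfault, and can be silenced by setting the variable Firedrake suggests before running the tests:

export OMP_NUM_THREADS=1   # value suggested by the Firedrake warning above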
I get very similar error messages for the other failing tests.
May I ask what the main problem is here?
I also tried running PETSc's make check, and it gives:
Running PETSc check examples to verify correct installation
Using PETSC_DIR=/Users/nf419/firedrake/src/petsc and PETSC_ARCH=default
C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/tutorials/ex19 run successfully with HYPRE
3c3
< 0 KSP Residual norm 0.235858
5,9c5,52
< 1 SNES Function norm 6.81968e-05
< 0 KSP Residual norm 2.30906e-05
< 1 KSP Residual norm < 1.e-11
< 2 SNES Function norm < 1.e-11
< Number of SNES iterations = 2
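For reference, a sketch of one way to check that the serial part of the installation is fine is to deselect the parallel tests and run the rest. This assumes the failing tests carry the pytest marker "parallel", as used by pytest-mpi; treat the marker name as an assumption:

cd /Users/nf419/firedrake/src/firedrake
# run the regression tests serially, skipping anything marked parallel
python -m pytest tests/regression/ -m "not parallel"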