Field 3D operator out of bounds possible issue with vorticity. #269

Open

Theosaurus2 opened this issue Oct 30, 2024 · 0 comments

Hi,

I was trying to set up a 1D simulation with evolving density, pressure and momentum for Ar, Ar+ and e, with a simple sheath boundary, collisions, and vorticity to evolve the potential.
I've attached my BOUT.inp file below.
BOUT.txt

I used -DCHECK=3 to get the errors below:

  1. Whilst running ctest for the vorticity test I got the error:

Run started at : Tue Oct 29 14:57:06 2024
Option restart = false (default)
Option append = false (default)
Option dump_on_restart = 1 (default)

Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.


mpirun noticed that process rank 0 with PID 0 on node sheath exited on signal 11 (Segmentation fault).
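
For context, my understanding (an assumption on my part, not copied from the BOUT++ source) is that the per-index checks are only compiled in at higher CHECK levels, which would explain why this build only shows a segfault while the -DCHECK=3 build throws proper errors. A minimal standalone sketch of that idea:

```cpp
// Minimal sketch of a CHECK-gated assertion (my own illustration, not BOUT++ code).
// At CHECK >= 3 the condition is compiled in and aborts with a message when it
// fails; at lower levels the macro compiles to nothing and the condition is
// never checked.
#include <cassert>

#ifndef CHECK
#define CHECK 3
#endif

#if CHECK >= 3
#define ASSERT3(condition) assert(condition)
#else
#define ASSERT3(condition) ((void)0)
#endif

int main() {
  int x = -1, nx = 1;         // an x index one cell outside a direction with a single point
  ASSERT3(x >= 0 && x < nx);  // checked (and failing) only in a CHECK >= 3 build
  return 0;
}
```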

As for using -DCHECK=3, I have two separate errors that might help (a short standalone sketch of what I think these two checks are testing follows the second log below):

  1. Whilst running ctest to test the build I got an error in the vorticity test, as shown below:

Error encountered: Assertion failed in /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/region.hxx, line 288: std::abs(dy) < ny
====== Exception path ======
[bt] #13 ./hermes-3() [0x574e9e]
_start at ??:?
[bt] #12 /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3) [0x7fcae0581083]
__libc_start_main at /build/glibc-LcI20x/glibc-2.31/csu/../csu/libc-start.c:342
[bt] #11 ./hermes-3() [0x578c3d]
main at /scratch/tc1447/hermes2/hermes-3/hermes-3.cxx:375 (discriminator 15)
[bt] #10 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN6Solver5solveEid+0x783) [0x7fcae847c715]
Solver::solve(int, double) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/solver/solver.cxx:548 (discriminator 7)
[bt] #9 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN6Solver7run_rhsEdb+0x319) [0x7fcae848237b]
Solver::run_rhs(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/solver/solver.cxx:1398
[bt] #8 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN12PhysicsModel6runRHSEdb+0x3d) [0x7fcae83fe24d]
PhysicsModel::runRHS(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/physics/physicsmodel.cxx:114
[bt] #7 ./hermes-3(_ZN12PhysicsModel3rhsEdb+0x36) [0x5972d0]
PhysicsModel::rhs(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/physicsmodel.hxx:271
[bt] #6 ./hermes-3() [0x577823]
Hermes::rhs(double) at /scratch/tc1447/hermes2/hermes-3/hermes-3.cxx:265
[bt] #5 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/libhermes-3-lib.so(_ZN18ComponentScheduler9transformER7Options+0xeb) [0x7fcae8a8d315]
ComponentScheduler::transform(Options&) at /scratch/tc1447/hermes2/hermes-3/src/component_scheduler.cxx:57 (discriminator 3)
[bt] #4 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/libhermes-3-lib.so(_ZN9Vorticity7finallyERK7Options+0xd1c) [0x7fcae8bd955c]
Vorticity::finally(Options const&) at /scratch/tc1447/hermes2/hermes-3/src/vorticity.cxx:702 (discriminator 3)
[bt] #3 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN2FV15Div_a_Grad_perpERK7Field3DS2_+0x818) [0x7fcae81fb9c7]
FV::Div_a_Grad_perp(Field3D const&, Field3D const&) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/mesh/fv_ops.cxx:111 (discriminator 3)
[bt] #2 ./hermes-3(_ZNK11SpecificIndIL8IND_TYPE0EE2ypEi+0x5a) [0x68a8b8]
SpecificInd<(IND_TYPE)0>::yp(int) const at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/region.hxx:288 (discriminator 3)
[bt] #1 ./hermes-3(_ZN13BoutExceptionC2IA42_cJA75_ciA18_cEEERKT_DpRKT0_+0x10a) [0x68ae74]
BoutException::BoutException<char [42], char [75], int, char [18]>(char const (&) [42], char const (&) [75], int const&, char const (&) [18]) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/boutexception.hxx:23 (discriminator 4)
====== Back trace ======
-> virtual void Vorticity::finally(const Options&) on line 636 of '/scratch/tc1447/hermes2/hermes-3/src/vorticity.cxx'

====== Exception thrown ======
Assertion failed in /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/region.hxx, line 288: std::abs(dy) < ny


MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

0% tests passed, 1 tests failed out of 1

Total Test time (real) = 43.79 sec

The following tests FAILED:
6 - vorticity (Failed)
Errors while running CTest

  2. Whilst running with my BOUT.inp file I got the following:

Error encountered: Field3D: (-1, 2, 0) operator out of bounds (1, 260, 1)
====== Exception path ======
[bt] #12 ./hermes-3() [0x574e9e]
_start at ??:?
[bt] #11 /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3) [0x7f5a11dda083]
__libc_start_main at /build/glibc-LcI20x/glibc-2.31/csu/../csu/libc-start.c:342
[bt] #10 ./hermes-3() [0x578c3d]
main at /scratch/tc1447/hermes2/hermes-3/hermes-3.cxx:375 (discriminator 15)
[bt] #9 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN6Solver5solveEid+0x783) [0x7f5a19cd5715]
Solver::solve(int, double) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/solver/solver.cxx:548 (discriminator 7)
[bt] #8 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN6Solver7run_rhsEdb+0x319) [0x7f5a19cdb37b]
Solver::run_rhs(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/solver/solver.cxx:1398
[bt] #7 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/external/BOUT-dev/lib/libbout++.so.5.1.0(_ZN12PhysicsModel6runRHSEdb+0x3d) [0x7f5a19c5724d]
PhysicsModel::runRHS(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/src/physics/physicsmodel.cxx:114
[bt] #6 ./hermes-3(_ZN12PhysicsModel3rhsEdb+0x36) [0x5972d0]
PhysicsModel::rhs(double, bool) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/physicsmodel.hxx:271
[bt] #5 ./hermes-3() [0x577823]
Hermes::rhs(double) at /scratch/tc1447/hermes2/hermes-3/hermes-3.cxx:265
[bt] #4 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/libhermes-3-lib.so(_ZN18ComponentScheduler9transformER7Options+0x6c) [0x7f5a1a2e6296]
ComponentScheduler::transform(Options&) at /scratch/tc1447/hermes2/hermes-3/src/component_scheduler.cxx:53 (discriminator 3)
[bt] #3 /scratch/tc1447/hermes2/hermes-3/build-petsc-debug/libhermes-3-lib.so(_ZN9Vorticity9transformER7Options+0x12e2) [0x7f5a1a42e3da]
Vorticity::transform(Options&) at /scratch/tc1447/hermes2/hermes-3/src/vorticity.cxx:381 (discriminator 1)
[bt] #2 ./hermes-3(_ZN7Field3DclEiii+0xf7) [0x596221]
Field3D::operator()(int, int, int) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/field3d.hxx:364
[bt] #1 ./hermes-3(_ZN13BoutExceptionC1IA70_cJiiiiiiEEERKT_DpRKT0_+0x157) [0x5a73b1]
BoutException::BoutException<char [70], int, int, int, int, int, int>(char const (&) [70], int const&, int const&, int const&, int const&, int const&, int const&) at /scratch/tc1447/hermes2/hermes-3/external/BOUT-dev/include/bout/boutexception.hxx:23 (discriminator 4)
====== Back trace ======
-> virtual void Vorticity::transform(Options&) on line 222 of '/scratch/tc1447/hermes2/hermes-3/src/vorticity.cxx'

====== Exception thrown ======
Field3D: (-1, 2, 0) operator out of bounds (1, 260, 1)


MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
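
As mentioned above, here is a small standalone C++ sketch (my own code, not the BOUT++ or Hermes-3 source) of what I believe the two CHECK=3 failures are testing, using the numbers from the error messages; the shape (1, 260, 1) is from my run, and the names in_bounds/yp_offset_ok are just for illustration:

```cpp
// Standalone illustration of the two failing checks reported above.
#include <cstdlib>
#include <iostream>

// 1) Rough equivalent of the bounds check in Field3D::operator()(x, y, z):
//    every index must lie in [0,nx) x [0,ny) x [0,nz), so asking for x = -1 on
//    a mesh whose local shape is (1, 260, 1) produces
//    "Field3D: (-1, 2, 0) operator out of bounds (1, 260, 1)".
bool in_bounds(int x, int y, int z, int nx, int ny, int nz) {
  return x >= 0 && x < nx && y >= 0 && y < ny && z >= 0 && z < nz;
}

// 2) Rough equivalent of the assertion in region.hxx yp(dy): shifting an index
//    by dy points in y only makes sense when |dy| < ny, so it presumably fails
//    because the field being indexed has only one point in y.
bool yp_offset_ok(int dy, int ny) { return std::abs(dy) < ny; }

int main() {
  std::cout << std::boolalpha
            << in_bounds(-1, 2, 0, 1, 260, 1) << "\n"  // false -> the out-of-bounds error
            << yp_offset_ok(1, 1) << "\n";             // false -> the std::abs(dy) < ny assertion
  return 0;
}
```

If that reading is right, both failures look like accesses one cell outside a direction that only has a single point on this mesh, but I may well be misinterpreting them.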

I'm not sure if I've set something up wrong or if this is a bug.
