FTE in feelpp_test_integration_relatedmesh #498

Closed
prudhomm opened this issue Jan 4, 2015 · 1 comment

prudhomm commented Jan 4, 2015

 ./feelpp_test_integration_relatedmesh 
Running 2 test cases...
[read msh from file] Time : 0.030082s
energyElementA 0.36 [0.36]
energyElementB 0.36 [0.36]
energyElementC 1 [1]
energyElementD 1 [1]
energyFacesStandart 2.4 [2.4]
energyFacesNonStandartA 2.4 [2.4]
energyFacesNonStandartB 2.4 [2.4]
energyFacesNonStandartC 2.4 [2.4]
energyFacesNonStandartD 9.6 [9.6]
energyFacesNonStandartE 2.4 [2.4]
[read msh from file] Time : 0.0114882s
energyElementA 0.36 [0.36]
energyElementB 0.36 [0.36]
energyElementC 1 [1]
energyElementD 1 [1]
energyFacesStandart 2.4 [2.4]
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Argument out of range!
[0]PETSC ERROR: Inserting a new nonzero at (1854,190) in the matrix!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./feelpp_test_integration_relatedmesh on a linux-gnu-c-opt named irma-atlas by prudhomm Sun Jan  4 09:33:22 2015
[0]PETSC ERROR: Libraries linked from /build/petsc-lccVo9/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib
[0]PETSC ERROR: Configure run at Wed Aug 27 14:19:17 2014
[0]PETSC ERROR: Configure options --with-shared-libraries --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1 --with-blacs-include=/usr/include --with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]" --with-scalapack=1 --with-scalapack-include=/usr/include --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1 --with-mumps-include=/usr/include --with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]" --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-cholmod=1 --with-cholmod-include=/usr/include/suitesparse --with-cholmod-lib=/usr/lib/libcholmod.so --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="[/usr/lib/libptesmumps.so,/usr/lib/libptscotch.so,/usr/lib/libptscotcherr.so]" --with-fftw=1 --with-fftw-include=/usr/include --with-fftw-lib="[/usr/lib/x86_64-linux-gnu/libfftw3.so,/usr/lib/x86_64-linux-gnu/libfftw3_mpi.so]" --with-hdf5=1 --with-hdf5-dir=/usr/lib/x86_64-linux-gnu/hdf5/openmpi --CXX_LINKER_FLAGS=-Wl,--no-as-needed
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatSetValues_SeqAIJ() line 352 in src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: MatSetValues() line 1106 in src/mat/interface/matrix.c
[0]PETSC ERROR: addMatrix() line 829 in "unknowndirectory/"/home/u2/prudhomm/Devel/FEEL/feelpp.git/feel/feelalg/matrixpetsc.cpp
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

prudhomm commented Jan 4, 2015

Note that it fails only when run sequentially; it runs fine in parallel.
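
For context, the PETSc message "Argument out of range ... Inserting a new nonzero at (1854,190) in the matrix" is what MatSetValues_SeqAIJ raises when an entry is inserted at a location that is not part of the matrix's existing nonzero pattern while MAT_NEW_NONZERO_LOCATION_ERR is set, which Feel++'s MatrixPetsc appears to do for matrices built from a precomputed sparsity graph. The sketch below is a minimal standalone reproduction of that error class only, not of the Feel++ test: the matrix size, pattern, and indices are purely illustrative.

    /* repro_sketch.c -- hypothetical standalone example, not the Feel++ test itself */
    #include <petscmat.h>

    int main(int argc, char **argv)
    {
        PetscErrorCode ierr;
        Mat            A;
        PetscScalar    v = 1.0;
        PetscInt       row, col;

        ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

        /* 10x10 sequential AIJ matrix, preallocated with 1 nonzero per row. */
        ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 10, 10, 1, NULL, &A);CHKERRQ(ierr);

        /* Define the sparsity pattern (diagonal only) and assemble it. */
        for (row = 0; row < 10; ++row) {
            col  = row;
            ierr = MatSetValues(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);
        }
        ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
        ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

        /* From now on, treat any entry outside the assembled pattern as an error. */
        ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_TRUE);CHKERRQ(ierr);

        /* (0,5) is not in the pattern: MatSetValues aborts with
           "Argument out of range ... Inserting a new nonzero at (0,5) in the matrix". */
        row  = 0; col = 5;
        ierr = MatSetValues(A, 1, &row, 1, &col, &v, ADD_VALUES);CHKERRQ(ierr);

        ierr = MatDestroy(&A);CHKERRQ(ierr);
        ierr = PetscFinalize();
        return ierr;
    }

If that is indeed what happens in the test, the graph used to preallocate the matrix for the related-mesh case would be missing the (1854,190) coupling that the sequential assembly produces, which would also be consistent with the failure appearing only in sequential runs.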
