"make check" fails on all the mpi tests

Here is one example, with the shared objects listed at the bottom. I don't think this has ever worked right; I just didn't need it for initial testing.
Thanks.

cat advection-TS-2d-PETSc.edp.log
'/home/ubuntu/dev/freefem/install2/ff-petsc/r/bin/mpiexec' -np 4 ../../src/mpi/FreeFem++-mpi -nw './advection-TS-2d-PETSc.edp'
try initfile : freefem++.pref

load path :
../../plugin/mpi/

../../plugin/seq/
(.)

include path :

../../idp/
(.)
try initfile : freefem++.pref

load path :
../../plugin/mpi/

../../plugin/seq/
(.)

include path :

../../idp/
(.)
try initfile : freefem++.pref

load path :
../../plugin/mpi/

../../plugin/seq/
(.)

include path :

../../idp/
(.)
try initfile : freefem++.pref

load path :
../../plugin/mpi/

../../plugin/seq/
(.)

include path :

../../idp/
(.)
initparallele rank 0 on 4
initparallele rank 1 on 4
ARGV 1 -nw initparallele rank 2 on 4
ARGV 1 -nw
ARGV 2 ./advection-TS-2d-PETSc.edp

ARGV 2 ./advection-TS-2d-PETSc.edp
fn: initparallele rank 3 on fn: ./advection-TS-2d-PETSc.edp
./advection-TS-2d-PETSc.edp
ARGV 1 -nw
ARGV 2 ./advection-TS-2d-PETSc.edp
fn: ./advection-TS-2d-PETSc.edp4
ARGV 1 -nw

ARGV 2 ./advection-TS-2d-PETSc.edp
fn: ./advection-TS-2d-PETSc.edp-- FreeFem++ v4.12 (Sun Jan 22 17:51:12 EST 2023 - git no git)

file : ./advection-TS-2d-PETSc.edp verbosity= 5
Load: Load: Load: Load: lg_fem lg_mesh glumesh2D glumesh2D glumesh2D glumesh2D lg_mesh3 eigenvalue parallelempi
PreEnv load :funcTemplate

PreEnv load :funcTemplate

PreEnv load :funcTemplate

PreEnv load :funcTemplate
(load: dlopen ../../plugin/mpi/funcTemplate.so 0x25cacd0)PreEnv load :myfunction
PreEnv load :myfunction
PreEnv load :myfunction
PreEnv load :myfunction
(load: dlopen ../../plugin/mpi/myfunction.so 0x25cc460)PreEnv load :MUMPS_seq
PreEnv load :MUMPS_seq
PreEnv load :MUMPS_seq
PreEnv load :MUMPS_seq
init MUMPS_SEQ: MPI_Init
init MUMPS_SEQ: MPI_Init
init MUMPS_SEQ: MPI_Init
init MUMPS_SEQ: MPI_Init
Fatal error in internal_Init: Other MPI error, error stack:
internal_Init(59): MPI_Init(argc=0x7ffce8e1ae0c, argv=0x7ffce8e1ae10) failed
internal_Init(39): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in internal_Init: Other MPI error, error stack:
internal_Init(59): MPI_Init(argc=0x7fff97490a6c, argv=0x7fff97490a70) failed
internal_Init(39): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in internal_Init: Other MPI error, error stack:
internal_Init(59): MPI_Init(argc=0x7ffe9b7d4a6c, argv=0x7ffe9b7d4a70) failed
internal_Init(39): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in internal_Init: Other MPI error, error stack:
internal_Init(59): MPI_Init(argc=0x7fff34e525bc, argv=0x7fff34e525c0) failed
internal_Init(39): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
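
For comparison: that error is just MPICH refusing a second MPI_Init in the same process, which a two-line C program reproduces. A minimal repro, assuming MPICH semantics (the file name double_init.c is mine, for illustration):

#include <mpi.h>

// double_init.c: calling MPI_Init twice triggers the same
// "Cannot call MPI_INIT or MPI_INIT_THREAD more than once" abort in MPICH.
int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    MPI_Init(&argc, &argv);  // second call: forbidden by the MPI standard
    MPI_Finalize();          // never reached; MPICH aborts on the line above
    return 0;
}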
marchywka@happy:/home/ubuntu/dev/freefem/FreeFem-sources-master/examples/hpddm$ ldd /home/ubuntu/dev/freefem/install2/ff-petsc/r/bin/mpiexec
linux-vdso.so.1 => (0x00007ffe804b0000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f198bf2d000)
libudev.so.1 => /lib/x86_64-linux-gnu/libudev.so.1 (0x00007f198c411000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f198bd10000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f198b946000)
/lib64/ld-linux-x86-64.so.2 (0x00007f198c236000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f198b73e000)
marchywka@happy:/home/ubuntu/dev/freefem/FreeFem-sources-master/examples/hpddm$ ldd ../../src/mpi/FreeFem++-mpi
linux-vdso.so.1 => (0x00007fffe954d000)
libumfpack.so.5 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libumfpack.so.5 (0x00007fb17ed28000)
libcholmod.so.3 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libcholmod.so.3 (0x00007fb17ea1d000)
liblapack.so.3 => /usr/lib/liblapack.so.3 (0x00007fb17e225000)
libblas.so.3 => /usr/lib/libblas.so.3 (0x00007fb17dfe7000)
libmpi.so.12 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libmpi.so.12 (0x00007fb17d8dd000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fb17d6d9000)
libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007fb17d3ae000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fb17d0a5000)
libhdf5_serial.so.10 => /usr/lib/x86_64-linux-gnu/libhdf5_serial.so.10 (0x00007fb17cc07000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fb17c9ea000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fb17c668000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fb17c452000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fb17c088000)
libamd.so.2 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libamd.so.2 (0x00007fb17be7e000)
libsuitesparseconfig.so.5 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libsuitesparseconfig.so.5 (0x00007fb17bc7b000)
libcolamd.so.2 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libcolamd.so.2 (0x00007fb17ba73000)
libccolamd.so.2 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libccolamd.so.2 (0x00007fb17b866000)
libcamd.so.2 => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libcamd.so.2 (0x00007fb17b65a000)
libmetis.so => /home/ubuntu/dev/freefem/install2/ff-petsc/r/lib/libmetis.so (0x00007fb17b3e8000)
libatlas.so.3 => /usr/lib/libatlas.so.3 (0x00007fb17ae4a000)
libudev.so.1 => /lib/x86_64-linux-gnu/libudev.so.1 (0x00007fb17f1c4000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fb17ac42000)
/lib64/ld-linux-x86-64.so.2 (0x00007fb17eff0000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fb17aa03000)
libsz.so.2 => /usr/lib/x86_64-linux-gnu/libsz.so.2 (0x00007fb17a800000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007fb17a5e6000)
libaec.so.0 => /usr/lib/x86_64-linux-gnu/libaec.so.0 (0x00007fb17a3de000)

Can you run a simple Hello World in plain C (e.g., the MPI Hello World example from mpitutorial.com) with /home/ubuntu/dev/freefem/install2/ff-petsc/r/bin/mpicc and /home/ubuntu/dev/freefem/install2/ff-petsc/r/bin/mpiexec?

That much seems to work; see below. The problem appears to be that the plugins with the _seq suffix also initialize MPI, which suggests a build setting is wrong. However, turning off MPI with --with-mpi-no seems to disable PETSc even after a reconfigure.
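
The standard defensive idiom for code that may be loaded into a process where MPI is already running is to test MPI_Initialized before calling MPI_Init. A minimal C sketch of that guard (whether MUMPS_seq could actually be patched this way, as opposed to fixing the build configuration, is an assumption on my part):

#include <mpi.h>
#include <stdio.h>

// guarded_init.c: initialize/finalize MPI only if no one else already has.
int main(int argc, char** argv) {
    int initialized = 0;
    MPI_Initialized(&initialized);  // legal to call before MPI_Init
    if (!initialized)
        MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("rank %d: MPI %s initialized on entry\n",
           rank, initialized ? "was already" : "was not yet");

    int finalized = 0;
    MPI_Finalized(&finalized);      // mirror check before MPI_Finalize
    if (!finalized)
        MPI_Finalize();
    return 0;
}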

echo $ffdir
/home/ubuntu/dev/freefem/install2
marchywka@happy:/home/documents/cpp/proj/freefem$ $ffdir/../install2/ff-petsc/r/bin/mpicc -I $ffdir/ff-petsc/r/include hw.c -o hw.out -L$ffdir/ff-petsc/r/lib/ -lmpi
marchywka@happy:/home/documents/cpp/proj/freefem$ $ffdir/../install2/ff-petsc/r/bin/mpirun -np 2 ./hw.out
Hello world from processor happy, rank 0 out of 2 processors
Hello world from processor happy, rank 1 out of 2 processors

cat hw.c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv) {
    // Initialize the MPI environment
    MPI_Init(NULL, NULL);

    // Get the number of processes
    int world_size;
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    // Get the rank of the process
    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    // Get the name of the processor
    char processor_name[MPI_MAX_PROCESSOR_NAME];
    int name_len;
    MPI_Get_processor_name(processor_name, &name_len);

    // Print off a hello world message
    printf("Hello world from processor %s, rank %d out of %d processors\n",
           processor_name, world_rank, world_size);

    // Finalize the MPI environment.
    MPI_Finalize();

    return 0;
}

Well, I copied one of the examples into my working directory and it looks like it runs now. The "make check" failed on the last run and I have not gone back to look at it. I guess I'm not sure when to use ff-mpirun versus FreeFem++-mpi; I had used plain mpirun to run FreeFem++ and it did not like that, but the run below works AFAICT, so I should be able to figure it out now. Thanks.

 ff-mpirun -np 4  Helmholtz_circle_Neumann.edp  | tail -n 20 
 NbTreeBox = 367 Nb Vertices = 1000
 NbTreeBoxSearch 0 NbVerticesSearch 0 NbSearch 0 ratio: 0
 SizeOf QuadTree32600

times: compile 0.017938s, execution 4.97868s,  mpirank:0
 ######## We forget of deleting   220 Nb pointer,   0Bytes  ,  mpirank 0, memory leak =21728
 CodeAlloc : nb ptr  4914,  size :605192 mpirank: 0
Ok: Normal End
~GTree  the tree 
 NbTreeBox = 367 Nb Vertices = 1000
 NbTreeBoxSearch 0 NbVerticesSearch 0 NbSearch 0 ratio: 0
 SizeOf QuadTree32600

times: compile 0.014614s, execution 4.98351s,  mpirank:1
 ######## We forget of deleting   8 Nb pointer,   0Bytes  ,  mpirank 1, memory leak =2816
 CodeAlloc : nb ptr  4914,  size :605192 mpirank: 1
FreeFem++-mpi finalize correctly .
FreeFem++-mpi finalize correctly .
FreeFem++-mpi finalize correctly .
FreeFem++-mpi finalize correctly .
marchywka@happy:/home/documents/cpp/proj/freefem/play$