Trouble installing FreeFEM with full plugin support on HPC cluster

Dear FreeFEM users,

I’m struggling with installing FreeFEM on the HPC facilities of my department.

I tried several approaches that almost completed successfully, but all of them share the same issue: the build finishes, but many plugins are missing (e.g., mmg, mshmet, ipopt, tetgen, …).

As a first attempt, I compiled directly from the sources using the commands suggested in this issue:

spack load gcc@12.1.0 # to load the correct compilation toolchain

git clone https://github.com/FreeFem/FreeFem-sources.git
cd FreeFem-sources
git checkout develop
autoreconf -vif
mkdir _prefix
./configure --enable-download --enable-optim --prefix=$(pwd)/_prefix
./3rdparty/getall -a
cd 3rdparty/ff-petsc
make petsc-slepc
cd -
./reconfigure
make

I also tried following the instructions in this tutorial, but the result was the same: the core of FreeFEM builds fine, but most of the optional plugins are missing.

I noticed that FreeFEM versions up to 4.14 are available via Spack. Using that approach, I was able to get at least the mshmet plugin working, by modifying its package.py and adding the --enable-download-mshmet flag in the appropriate place.

Thank you very much for your help — I’m happy to provide more details on the different builds I attempted if needed.

Best regards,
Giacomo

You need to send config.log; we have no crystal ball to guess what is going on on your machine.

Sure! Here is the config.log file from the FreeFem-sources directory: config.log (417.6 KB)

I’ve compiled it with the following commands:

spack load gcc@12.1.0 # to load the correct compilation toolchain

git clone https://github.com/FreeFem/FreeFem-sources.git
cd FreeFem-sources
git checkout develop
autoreconf -vif
mkdir _prefix
./configure --enable-download --enable-optim --prefix=$(pwd)/_prefix
./3rdparty/getall -a
cd 3rdparty/ff-petsc
make petsc-slepc
cd -
./reconfigure
make

Thanks again,

Giacomo

If you are using an HPC machine, there is very little reason not to use the MPI implementation that was installed by the vendor. Why are you compiling MPICH yourself instead of using the implementation from your system?

Thank you for your reply!

You’re right, our HPC system already provides a working MPI installation, and I understand that using the system’s MPI is generally the recommended approach.

I didn’t explicitly use the --download-mpich option, but I realize now that the internal ff-petsc build procedure (triggered via make petsc-slepc) automatically downloads and compiles MPICH along with other libraries from source.

What would be the recommended way to recompile FreeFEM and its PETSc plugin while linking to the existing MPI implementation available on the system, instead of building MPICH from scratch?

> automatically downloads and compiles MPICH along with other libraries from source.

That only happens if FreeFEM is not able to detect MPI in the first place. So what you should do first is make sure that you provide the proper flags to FreeFEM ./configure so that it detects your system MPI. What I would highly suggest, though, is to compile PETSc outside of FreeFEM and not rely on make petsc-slepc as explained in the tutorial that you linked previously. It appears you tried that, but again, without PETSc configure.log and FreeFEM config.log, I cannot say why things are not working.
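As an illustration, a minimal sketch of such a configure invocation (the spack package name is an assumption for this particular cluster; MPICC/MPICXX are the same variables used later in this thread, and MPIRUN is hedged as the usual way to point FreeFEM at the launcher):

```shell
# Sketch: point FreeFEM's configure at the system MPI wrappers instead of
# letting the ff-petsc step download MPICH. Package names are assumptions.
spack load openmpi   # or: module load openmpi, depending on the cluster
./configure --enable-download --enable-optim \
  --prefix=$(pwd)/_prefix \
  MPICC=$(which mpicc) \
  MPICXX=$(which mpicxx) \
  MPIRUN=$(which mpirun)
```

If configure then reports MPI as detected, the ff-petsc build should no longer fall back to compiling MPICH from source.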

Thanks @prj for the clarification!
I’ll retry the installation following the approach you suggested and let you know how it goes.

Have a good weekend

As suggested, I compiled PETSc externally (not using make petsc-slepc) and made sure MPI was correctly loaded before configuring FreeFEM:

./configure \
  --with-mpi-dir=$MPI_DIR \
  --with-blas-lib="${OPENBLAS_DIR}/lib/libopenblas.so" \
  --with-blas-include="${OPENBLAS_DIR}/include" \
  --with-lapack-lib="${OPENBLAS_DIR}/lib/libopenblas.so" \
  --with-fortran-bindings=0 \
  --with-scalar-type=real \
  --with-debugging=0 \
  --download-mumps \
  --download-parmetis \
  --download-metis \
  --download-hypre \
  --download-superlu \
  --download-slepc \
  --download-hpddm \
  --download-ptscotch \
  --download-suitesparse \
  --download-scalapack \
  --download-tetgen \
  --download-mmg

where the variables OPENBLAS_DIR and MPI_DIR were obtained beforehand from Spack.

The PETSc configuration and compilation seem to have completed successfully, as you can see in the configure.log (I’ve zipped the file due to the 8MB limit).

FreeFEM also compiles correctly, and the parallel version (ff-mpirun) runs fine; however, it’s still missing several plugins. This is the script for the configure step:

./configure \
  MPICC=$(which mpicc) \
  MPICXX=$(which mpicxx) \
  --without-hdf5 \
  --with-petsc=${PETSC_VAR}/lib

From the config.log, it seems that these packages are not detected. Do I need to manually provide their paths (e.g., --with-mshmet=...) during configuration, or is there another recommended way to include them?

config.log (417.1 KB)
configure.zip (481.6 KB)

It seems FreeFEM is properly detecting everything that you compiled/installed via PETSc. The rest can be compiled directly in the 3rdparty folder (via make), and then you should just ./reconfigure and it should detect the newly compiled packages. Send the updated config.log if not.

Do you mean like this?

./configure \
  MPICC=$(which mpicc) \
  MPICXX=$(which mpicxx) \
  --without-hdf5 \
  --with-petsc=${PETSC_VAR}/lib

cd ${FF_DIR}/3rdparty
make
cd ${FF_DIR}
./reconfigure
make

If that is correct, running make in the 3rdparty directory seems to do nothing; it reports:

Making all in blas
make[1]: Entering directory '/u/speroni/FreeFem-sources/3rdparty/blas'
make  all-am
make[2]: Entering directory '/u/speroni/FreeFem-sources/3rdparty/blas'
make[2]: Nothing to be done for 'all-am'.
make[2]: Leaving directory '/u/speroni/FreeFem-sources/3rdparty/blas'
make[1]: Leaving directory '/u/speroni/FreeFem-sources/3rdparty/blas'
Making all in arpack
make[1]: Entering directory '/u/speroni/FreeFem-sources/3rdparty/arpack'
make[1]: Nothing to be done for 'all'.
make[1]: Leaving directory '/u/speroni/FreeFem-sources/3rdparty/arpack'
Making all in umfpack
make[1]: Entering directory '/u/speroni/FreeFem-sources/3rdparty/umfpack'
make[1]: Nothing to be done for 'all'.
make[1]: Leaving directory '/u/speroni/FreeFem-sources/3rdparty/umfpack'
make[1]: Entering directory '/u/speroni/FreeFem-sources/3rdparty'
make[1]: Nothing to be done for 'all-am'.
make[1]: Leaving directory '/u/speroni/FreeFem-sources/3rdparty'

I’ve noticed that, for example, if I move into the mmg folder and type make, it seems to compile correctly. But I hope there is a more automatic way to do that.

I am sorry for bothering you, but am I missing something?

Why would you go into the mmg folder since it’s already compiled by PETSc?

You are right, I was just experimenting. But the problem is still there.

What problem? What is the package you want which is not installed?

> What problem?

The main issue I’m facing is that none of the external packages appear to be loadable.

I tried recompiling everything from scratch, and also including the ./reconfigure step after building the content in the 3rdparty folder, following this procedure:

cd ${FF_DIR}/3rdparty
make
cd ${FF_DIR}
./reconfigure
make

Then, I tested several packages by writing simple hello-world scripts that load the corresponding plugin. The result is always the same. For instance, for mshmet, I get:

Load error: mshmet
	 fail: 
 dlerror : ./mshmet.so: cannot open shared object file: No such file or directory
list prefix: '' './' list suffix: '' , '.so' 
  current line = 2
Load error : mshmet
	line number :2, mshmet
error Load error : mshmet
	line number :2, mshmet

This error appears for all external plugins I tried (including mmg3d, tetgen, ff-Ipopt, etc.). I also tried running FreeFEM in parallel with ff-mpirun and loading PETSc, but the result doesn’t change.

I’ve attached the config.log file with the latest ./reconfigure for reference.
config.log (418.0 KB)
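Before adjusting search paths, it may help to check whether the plugin shared objects were built at all. A sketch of such a check (plugin/seq and plugin/mpi are assumed to be the usual build locations in the FreeFem-sources tree; FF_DIR is the checkout directory):

```shell
# Sketch: look for the missing plugin .so files in the build tree.
# FF_DIR is assumed to point at the FreeFem-sources checkout.
FF_DIR="${FF_DIR:-$HOME/FreeFem-sources}"
found=$(find "${FF_DIR}/plugin" -name '*.so' 2>/dev/null \
        | grep -E 'mshmet|mmg|tetgen' || true)
if [ -n "$found" ]; then
  echo "$found"
else
  echo "no matching plugins built under ${FF_DIR}/plugin"
fi
```

If nothing shows up, the plugins were never compiled, and no amount of search-path tweaking will make load succeed.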

> What is the package you want which is not installed?

I can try to provide a complete list if needed, but I’ve always worked with the “full” version of FreeFEM++, where most external packages were available by default. I’ll go through everything and make a list if it helps.

Thanks again for the support!

I don’t understand: here (Trouble installing FreeFEM with full plugin support on HPC cluster - #8 by GiacomoSperoni97) you said that ff-mpirun was running OK. It is running, but as soon as you load a plugin, it fails?

Yes, it runs fine. I get the load error when I load the external package, just as in the serial case.
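For reference, the failure can be reproduced with a one-line script (a sketch; the guard only avoids an error on machines without FreeFem++ on the PATH):

```shell
# Sketch: minimal reproduction of the plugin load failure.
printf 'load "mshmet"\n' > test-load.edp
if command -v FreeFem++ >/dev/null 2>&1; then
  FreeFem++ test-load.edp
else
  echo "FreeFem++ not on PATH"
fi
```

On this machine it prints the same "Load error: mshmet" dlerror shown above.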

What is the result of cd examples/hpddm && make check?

This is the result:

PASS: withPartitioning.edp
PASS: buildRecursive.edp
PASS: PartitionCreate.edp
PASS: DmeshReconstruct.edp
PASS: convect.edp
PASS: diffusion-substructuring-2d.edp
PASS: diffusion-substructuring-withPartitioning-2d.edp
PASS: elasticity-2d.edp
PASS: elasticity-substructuring-2d.edp
PASS: elasticity-block.edp
PASS: heat-2d.edp
PASS: heat-io-2d.edp
PASS: heat-3d.edp
PASS: helmholtz-2d.edp
PASS: helmholtz-mg-2d.edp
PASS: iterative.edp
PASS: maxwell-3d.edp
PASS: heat-torus-3d-surf.edp
SKIP: diffusion-2d.edp
SKIP: diffusion-mg-2d.edp
SKIP: diffusion-3d.edp
SKIP: diffusion-simple-3d.edp
SKIP: diffusion-periodic-2d.edp
SKIP: elasticity-3d.edp
SKIP: elasticity-simple-3d.edp
SKIP: stokes-2d.edp
SKIP: stokes-3d.edp
SKIP: stokes-io-3d.edp
PASS: DmeshRedistribute_wo_PETSc.edp
PASS: bratu-2d-PETSc.edp
PASS: diffusion-2d-PETSc.edp
PASS: diffusion-cartesian-2d-PETSc.edp
PASS: diffusion-3d-PETSc.edp
PASS: diffusion-periodic-2d-PETSc.edp
PASS: diffusion-periodic-balanced-2d-PETSc.edp
PASS: elasticity-2d-PETSc.edp
PASS: elasticity-3d-PETSc.edp
PASS: elasticity-SNES-3d-PETSc.edp
PASS: heat-2d-PETSc.edp
PASS: laplace-lagrange-PETSc.edp
PASS: natural-convection-fieldsplit-2d-PETSc.edp
PASS: neo-Hookean-2d-PETSc.edp
PASS: newton-2d-PETSc.edp
PASS: newton-adaptmesh-2d-PETSc.edp
PASS: newton-vi-2d-PETSc.edp
PASS: newton-vi-adaptmesh-2d-PETSc.edp
PASS: block-PETSc.edp
PASS: laplace-RT-2d-PETSc.edp
PASS: laplace-RT-3d-PETSc.edp
PASS: stokes-2d-PETSc.edp
PASS: stokes-fieldsplit-2d-PETSc.edp
PASS: stokes-block-2d-PETSc.edp
PASS: MatLoad-PETSc.edp
PASS: stokes-3d-PETSc.edp
PASS: transpose-solve-PETSc.edp
PASS: diffusion-hpddm-2d-PETSc.edp
PASS: diffusion-hpddm-3d-PETSc.edp
PASS: bratu-hpddm-2d-PETSc.edp
PASS: vi-2d-PETSc.edp
PASS: orego-TS-PETSc.edp
PASS: heat-TS-2d-PETSc.edp
PASS: heat-TS-RHS-2d-PETSc.edp
PASS: advection-TS-2d-PETSc.edp
PASS: minimal-surface-Tao-2d-PETSc.edp
PASS: maxwell-2d-PETSc.edp
PASS: maxwell-3d-PETSc.edp
PASS: diffusion-mg-2d-PETSc.edp
PASS: diffusion-mg-3d-PETSc.edp
PASS: Dmesh-Save-Load.edp
PASS: navier-stokes-2d-PETSc.edp
PASS: oseen-2d-PETSc.edp
PASS: DMPlex-PETSc.edp
PASS: PtAP-2d-PETSc.edp
PASS: restriction-2d-PETSc.edp
PASS: function-PETSc.edp
PASS: bilaplace-2d-PETSc.edp
PASS: toy-Tao-PETSc.edp
PASS: elasticity-block-hpddm-2d-PETSc.edp
PASS: stokes-block-hpddm-2d-PETSc.edp
PASS: stokes-fieldsplit-3d-PETSc.edp
PASS: DmeshRedistribute_w_PETSc.edp
PASS: Schur-complement-PETSc.edp
PASS: transfer.edp
SKIP: laplace-adapt-3d-PETSc.edp
SKIP: stokes-adapt-3d-PETSc.edp
PASS: distributed-parmmg.edp
SKIP: laplace-adapt-dist-3d-PETSc.edp
PASS: laplace-2d-SLEPc.edp
PASS: laplace-spherical-harmonics-2d-SLEPc.edp
PASS: laplace-torus-2d-SLEPc.edp
PASS: schrodinger-harmonic-oscillator-1d-SLEPc.edp
PASS: schrodinger-square-well-1d-SLEPc.edp
PASS: schrodinger-axial-well-2d-SLEPc.edp
PASS: schrodinger-harmonic-oscillator-2d-SLEPc.edp
PASS: laplace-beltrami-3d-surf-SLEPc.edp
PASS: laplace-beltrami-3d-line-SLEPc.edp
PASS: stokes-2d-SLEPc.edp
PASS: mf-2d-SLEPc.edp
SKIP: diffusion-2d-PETSc-complex.edp
SKIP: helmholtz-2d-PETSc-complex.edp
SKIP: helmholtz-mg-2d-PETSc-complex.edp
SKIP: maxwell-mg-3d-PETSc-complex.edp
SKIP: laplace-2d-SLEPc-complex.edp
SKIP: navier-stokes-2d-SLEPc-complex.edp
SKIP: helmholtz-2d-SLEPc-complex.edp
SKIP: nonlinear-2d-SLEPc-complex.edp
SKIP: blasius-stability-1d-SLEPc-complex.edp
SKIP: helmholtz-3d-surf-PETSc-complex.edp
SKIP: helmholtz-3d-line-PETSc-complex.edp
SKIP: helmholtz-coupled-2d-PETSc-complex.edp
SKIP: helmholtz-dense-3d-line-PETSc-complex.edp
SKIP: maxwell-3d-surf-PETSc-complex.edp
============================================================================
Testsuite summary for FreeFEM 4.15
============================================================================
# TOTAL: 112
# PASS:  85
# SKIP:  27
# XFAIL: 0
# FAIL:  0
# XPASS: 0
# ERROR: 0
============================================================================

It skipped the ones that would have produced an error (PETSc seems to be loadable).

So the installation is functional. You did not go through the full tutorial; in particular, you missed the last crucial step, which is mandatory if you use FreeFEM outside of its directories.

You mean these steps?

> export PATH=${PATH}:${FF_DIR}/src/mpi
> export PATH=${PATH}:${FF_DIR}/src/nw
> set up ~/.freefem++.pref or define FF_INCLUDEPATH and FF_LOADPATH
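Put together, that last step could be scripted roughly as follows (a sketch: FF_DIR, the plugin/idp subdirectory names, and the `;` separator are assumptions based on the usual FreeFem-sources layout, so check them against your tree):

```shell
# Sketch: make FreeFEM and its plugins findable outside the build tree.
# FF_DIR is assumed to point at the FreeFem-sources checkout.
FF_DIR="${FF_DIR:-$HOME/FreeFem-sources}"
export PATH="${PATH}:${FF_DIR}/src/mpi:${FF_DIR}/src/nw"
# Plugin (*.so) and include (.idp) search paths; the directory names
# here are assumptions based on the usual source layout.
export FF_LOADPATH="${FF_DIR}/plugin/mpi;${FF_DIR}/plugin/seq;."
export FF_INCLUDEPATH="${FF_DIR}/idp"
```

Alternatively, the same directories can be recorded once in ~/.freefem++.pref instead of exporting the variables in every shell.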