FreeFem - PETSc compilation error - libblas.a/liblapack.a cannot be used


I’m trying to compile FreeFem on a Cray-based HPC system, without any luck so far. I have compiled FreeFem multiple times before using this guide, but I have no experience doing it on Cray-based systems.

The problem I’m running into is that PETSc can’t use libblas.a/liblapack.a no matter what I do. The main configure line I’m trying to run:

./configure --with-mpi-dir=/opt/cray/pe/mpich/8.1.27/ofi/cray/16.0/ --download-mumps  --download-parmetis --download-metis --download-hypre --download-superlu --download-slepc --download-hpddm --download-ptscotch --download-suitesparse --download-scalapack --download-tetgen --download-mmg --download-parmmg --with-fortran-bindings=no --with-scalar-type=real --with-debugging=no --download-bison --download-make --with-packages-download-dir=${PETSCBUILDETC_DIR}

These libraries are not present on the HPC system I’m working on, so at first I tried adding the --download-fblaslapack option to the configure line, but got an error at the end stating that the library cannot be used (configure.log attached).

I looked for similar errors and found that the Fortran compiler used for the libraries might differ from the one PETSc uses. So I tried compiling BLAS and LAPACK myself (both the Netlib and OpenBLAS packages), adding the CMake option -DCMAKE_Fortran_COMPILER=/opt/cray/pe/mpich/8.1.27/ofi/cray/16.0/bin/mpif90 so that the libraries would be built with the same Fortran compiler as PETSc. I then added the configure options --with-blas-lib=/home/p_mvgkm/p_mvgopt02/FreeFem/lapack/lib64/libblas.a (also tried --with-blas-lib=/home/p_mvgkm/p_mvgopt02/FreeFem/blas/lib/libblas.a) and --with-lapack-lib=/home/p_mvgkm/p_mvgopt02/FreeFem/lapack/lib64/liblapack.a, but got a similar error stating that the libraries cannot be used (configure.log attached).
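For reference, the manual LAPACK build described above can be sketched roughly like this (a sketch only; the install prefix is the path from this post, and the source/build directory layout is an assumption — adjust to your setup):

```shell
# Hedged sketch: build reference (netlib) LAPACK with the same Fortran
# compiler that PETSc will use, so the resulting archives are compatible.
# Paths follow the ones mentioned in this post; adjust to your system.
mkdir -p lapack-build && cd lapack-build
cmake ../lapack \
  -DCMAKE_Fortran_COMPILER=/opt/cray/pe/mpich/8.1.27/ofi/cray/16.0/bin/mpif90 \
  -DCMAKE_INSTALL_PREFIX=/home/p_mvgkm/p_mvgopt02/FreeFem/lapack
make && make install
```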

I also had to remove --download-parmmg from the configure line, because it makes the configure process freeze: it gives no error, it just hangs (configure.log attached).

Has anybody run into a similar issue, or does anyone have ideas for a solution?

Another question I would like to ask: on Cray-based systems there are no mpirun or mpiexec commands; instead, running things with srun does the same job. Will there be something I have to change to use srun, or does linking against the cray-mpich compiler solve this?
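To make the question concrete, here is how a parallel launch would look on a Slurm/Cray machine versus a plain MPI installation (a sketch; the flags are standard Slurm/MPI launcher options, but check your site’s documentation):

```shell
# Plain MPI launch (not available on this Cray system):
# mpirun -np 4 FreeFem++-mpi example.edp

# Slurm equivalent using srun:
srun -n 4 FreeFem++-mpi example.edp
```

PETSc’s configure also accepts a --with-mpiexec option (e.g. --with-mpiexec=srun) so that its own test runs use the right launcher; check ./configure --help to confirm the exact spelling on your PETSc version.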

Any help would be appreciated, thank you in advance!

Could you try --download-f2cblaslapack instead of --download-fblaslapack?

Thank you for your reply! I tried changing that option and configure got further, but stopped with an error while building ScaLAPACK (configure.log).
On a side note, I had also seen this option, but assumed it only works with --with-fc=0, as suggested on the PETSc site; when I ran configure that way it stopped immediately, stating that ScaLAPACK needs a Fortran compiler. This new issue might be related to that.

It appears something is not right with your Fortran compiler, or at least it is making PETSc’s configure fail, so I would just reconfigure with --with-fc=0 --download-superlu_dist and without --download-scalapack --download-mumps.

I can configure PETSc with the options you suggested, thank you!

I will try talking to the HPC support team to see if they can do something about the Fortran compiler, and will report back if the root of the issue is found.

I managed to configure and make the real build of PETSc; however, I ran into an issue while compiling the complex build, with the error:

/home/p_mvgkm/p_mvgopt02/FreeFem/petsc/include/petscmath.h:429:45: error: unknown type name '__complex128'
PETSC_EXTERN MPI_Datatype MPIU___COMPLEX128 MPIU___COMPLEX128_ATTR_TAG;

Configure with the additional COPTFLAGS -DPETSC_SKIP_REAL___FLOAT128 and let me know if this fixes the issue, please.
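On the configure line, that suggestion would look roughly like this (a sketch; the -O2 level and the CXXOPTFLAGS duplicate are assumptions on my part, and the trailing options are the ones from the original post, abbreviated):

```shell
# Hedged sketch: define PETSC_SKIP_REAL___FLOAT128 through the compiler
# optimization flags at configure time. Adding it to CXXOPTFLAGS as well
# is an assumption, in case C++ sources include petscmath.h too.
./configure \
  COPTFLAGS="-O2 -DPETSC_SKIP_REAL___FLOAT128" \
  CXXOPTFLAGS="-O2 -DPETSC_SKIP_REAL___FLOAT128" \
  --with-scalar-type=complex   # ...remaining options as in the original line
```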

I tried configuring the complex build again with the added option, but got an error while running the make command.
I also tried reconfiguring the real build with the same option, but it could not be made with this option either.

Hm, could you try to manually remove #define PETSC_HAVE_REAL___FLOAT128 1 from petscconf.h and give it one last shot?
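One way to script that edit (a sketch; in a real build the file lives at $PETSC_DIR/$PETSC_ARCH/include/petscconf.h, which you should verify — here the deletion is demonstrated on a scratch copy):

```shell
# Demonstrate the edit on a scratch copy of petscconf.h.
# In a real build you would target $PETSC_DIR/$PETSC_ARCH/include/petscconf.h.
conf=/tmp/petscconf.h
printf '#define PETSC_HAVE_REAL___FLOAT128 1\n#define PETSC_USE_COMPLEX 1\n' > "$conf"

# Delete the offending define line.
sed -i '/#define PETSC_HAVE_REAL___FLOAT128 1/d' "$conf"

cat "$conf"   # prints: #define PETSC_USE_COMPLEX 1
```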

This seems to have solved my issues: I could compile the complex build after deleting the line you suggested, and could also configure and make FreeFem. Thank you for your help!

I managed to compile the original configuration with the Fortran compiler, so if anybody runs into this issue in the future: the fix was using the --with-cc, --with-cxx and --with-fc options and giving them the absolute paths of the cc, CC and ftn wrappers, since on Cray systems these are already MPI-ready compiler wrappers. Deleting that line from petscconf.h was still required.
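The working approach can be sketched like this (a sketch; `which` resolves the Cray wrappers to the absolute paths mentioned above, and the trailing options stand in for the full list from the original configure line):

```shell
# Hedged sketch: point PETSc at the Cray compiler wrappers (cc/CC/ftn),
# which already wrap MPI, instead of using --with-mpi-dir.
./configure \
  --with-cc="$(which cc)" \
  --with-cxx="$(which CC)" \
  --with-fc="$(which ftn)" \
  --with-scalar-type=real --with-debugging=no \
  --download-mumps --download-scalapack   # ...and the other --download options as before
```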