Poisson equation with Lagrange multiplier using PETSc

Dear FreeFem++ users,

I would like to solve a Poisson problem on a square, under the constraint that the solution is zero on a disk contained inside the square, using PETSc. To enforce the constraint I use a Lagrange multiplier defined on a submesh that covers the disk. This leads to a saddle-point problem, and the finite element matrix of the system has the form

M = [[A, B],[B',0]]

where A is a square symmetric positive-definite matrix, B is a rectangular matrix, and B' denotes the transpose of B.

It works well with classical FreeFem++ matrices (see [1]), and I would like to do the same with PETSc matrices in order to use preconditioned iterative solvers (by the way, if there is a way to do that with the classical FreeFem++ matrices, I am also interested).
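
For readers without the attachments, here is a rough sketch of this construction with classical FreeFem++ matrices (the mesh sizes, the disk radius, and all names are placeholders of mine, not taken from [1]):

mesh Th = square(40, 40, [2*x - 1, 2*y - 1]);        // square (-1,1)^2
mesh ThD = trunc(Th, x^2 + y^2 < 0.3^2, label=10);   // submesh covering the disk
fespace Vh(Th, P1);                                  // primal unknown u
fespace Lh(ThD, P1);                                 // Lagrange multiplier
varf vA(u, v) = int2d(Th)(dx(u)*dx(v) + dy(u)*dy(v)) + on(1, 2, 3, 4, u=0);
varf vB(l, v) = int2d(ThD)(l*v);                     // coupling, integrated on the submesh
varf vRHS(unused, v) = int2d(Th)(1.0*v) + on(1, 2, 3, 4, unused=0);
matrix A = vA(Vh, Vh);                               // SPD Poisson block
matrix B = vB(Lh, Vh);                               // Vh.ndof rows, Lh.ndof columns
matrix M = [[A, B], [B', 0]];                        // saddle-point block matrix
real[int] bu = vRHS(0, Vh);
real[int] rhs(Vh.ndof + Lh.ndof);
rhs = 0;                                             // the multiplier part of the RHS stays 0
rhs(0:Vh.ndof - 1) = bu;
set(M, solver=sparsesolver);
real[int] sol = M^-1 * rhs;
Vh u;
u[] = sol(0:Vh.ndof - 1);                            // u is (weakly) zero on the disk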

The file [2] is an attempt to solve this problem with PETSc.
I encounter an issue with the LU solver in PETSc and get the following error message:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 1681
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-nw value: laplace-lagrange-block-petsc.edp
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, Nov 28, 2022 
[0]PETSC ERROR: /usr/local/bin/FreeFem++-mpi on a  named prancingpony by fabien Fri Feb 24 20:39:08 2023
[0]PETSC ERROR: Configure options --prefix=/usr/local/ff-petsc/r --with-debugging=0 COPTFLAGS="-O3 -mtune=generic" CXXOPTFLAGS="-O3 -mtune=generic" FOPTFLAGS="-O3 -mtune=generic" --with-cxx-dialect=11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-cudac=0 --download-slepc --download-hpddm --download-slepc-commit=d9df183c341775ffc91510ec6827e13db44a25cd --with-cc=/usr/bin/mpicc --with-cxx=/usr/bin/mpic++ --with-fc=/usr/bin/mpif90 --with-scalar-type=real --with-blaslapack-include= --with-blaslapack-lib="-llapack -lblas" --download-metis --download-ptscotch --download-hypre --download-parmetis --download-superlu --download-suitesparse --download-tetgen --download-mmg --download-parmmg --download-slepc-configure-arguments=--download-arpack --download-scalapack --download-mumps PETSC_ARCH=fr
[0]PETSC ERROR: #1 MatLUFactorSymbolic_SeqAIJ() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/mat/impls/aij/seq/aijfact.c:300
[0]PETSC ERROR: #2 MatLUFactorSymbolic() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/mat/interface/matrix.c:3151
[0]PETSC ERROR: #3 PCSetUp_LU() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/pc/impls/factor/lu/lu.c:87
[0]PETSC ERROR: #4 PCSetUp() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #5 KSPSetUp() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:406
[0]PETSC ERROR: #6 KSPSolve_Private() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:825
[0]PETSC ERROR: #7 KSPSolve() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:1071
 --- system solved with PETSc (in 1.071712e-02)
 0x55f2f83f2b70 VTK_FILE 2
double free or corruption (!prev)
[prancingpony:74869] *** Process received signal ***
[prancingpony:74869] Signal: Aborted (6)
[prancingpony:74869] Signal code:  (-6)
[prancingpony:74869] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x14420)[0x7f10a32f5420]
[prancingpony:74869] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xcb)[0x7f10a2f3300b]
[prancingpony:74869] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x12b)[0x7f10a2f12859]
[prancingpony:74869] [ 3] /lib/x86_64-linux-gnu/libc.so.6(+0x8d26e)[0x7f10a2f7d26e]
[prancingpony:74869] [ 4] /lib/x86_64-linux-gnu/libc.so.6(+0x952fc)[0x7f10a2f852fc]
[prancingpony:74869] [ 5] /lib/x86_64-linux-gnu/libc.so.6(+0x96fac)[0x7f10a2f86fac]
[prancingpony:74869] [ 6] /usr/local/bin/FreeFem++-mpi(_ZdaPv+0xd)[0x55f2f67168bd]
[prancingpony:74869] [ 7] /usr/local/bin/FreeFem++-mpi(_Z7DestroyI2KNIdEE19AnyTypeWithOutCheckPvRKS2_+0x7b)[0x55f2f655311b]
[prancingpony:74869] [ 8] /usr/local/bin/FreeFem++-mpi(_ZNK10E_F0_Func1clEPv+0x43)[0x55f2f63977b3]
[prancingpony:74869] [ 9] /usr/local/bin/FreeFem++-mpi(_ZNK12vectorOfInst4evalEPvi+0x7d)[0x55f2f64b128d]
[prancingpony:74869] [10] /usr/local/bin/FreeFem++-mpi(_ZNK10ListOfInstclEPv+0x5bd)[0x55f2f64ac15d]
[prancingpony:74869] [11] /usr/local/bin/FreeFem++-mpi(_Z7lgparsev+0x5177)[0x55f2f6395117]
[prancingpony:74869] [12] /usr/local/bin/FreeFem++-mpi(_Z7Compilev+0x46)[0x55f2f6396836]
[prancingpony:74869] [13] /usr/local/bin/FreeFem++-mpi(_Z6mainffiPPc+0x50a)[0x55f2f639701a]
[prancingpony:74869] [14] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f10a2f14083]
[prancingpony:74869] [15] /usr/local/bin/FreeFem++-mpi(_start+0x2e)[0x55f2f6382efe]
[prancingpony:74869] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node prancingpony exited on signal 6 (Aborted).

My questions are the following:

  • How can I solve this constrained problem using PETSc matrices with a direct solver and obtain the same result as in [1]?
  • What kind of iterative solver would you recommend for this particular problem?

For reference, I used the following FreeFem++ examples to write the file [2]:

[1] laplace-lagrange-block-ff.edp (1.7 KB)
[2] laplace-lagrange-block-petsc.edp (2.1 KB)

For starters, change the mesh in line 75 from createMat(Th,D,P1) to createMat(multmesh,D,P1), and your script runs to completion. I did not check that it matches the result from [1], but I don’t see why it wouldn’t.

Since block 0 is just the Poisson equation, I think you should be able to use -pc_fieldsplit_type schur with a simple CG preconditioner on block 0.

You are right, I missed that. Thanks!

However, I still have an issue. If I run with

ff-mpirun -np 1 laplace-lagrange-block-petsc.edp

I get the following error on output:

'/usr/bin/mpiexec' -np 1 /usr/local/bin/FreeFem++-mpi -nw 'laplace-lagrange-block-petsc.edp' -ns
-- FreeFem++ v4.12 (Tue Dec  6 19:14:11 CET 2022 - git v4.12)
   file : laplace-lagrange-block-petsc.edp
 Load: lg_fem lg_mesh lg_mesh3 eigenvalue parallelempi 
 load: init metis (v  5 )
 sizestack + 1024 =18360  ( 17336 )

  -- Square mesh : nb vertices  =1681 ,  nb triangles = 3200 ,  nb boundary edges 160
  Plot::  Sorry no ps version for this type of plot 55
  -- FESpace: Nb of Nodes 52 Nb of DoF 52
  Plot::  Sorry no ps version for this type of plot 55
 --- partition of unity built (in 1.651500e-05)
 --- global numbering created (in 7.629000e-06)
 --- global CSR created (in 1.326870e-04)
 --- partition of unity built (in 1.542800e-05)
 --- global numbering created (in 1.515000e-06)
 --- global CSR created (in 3.556300e-05)
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 1681
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-ns (no value)
[0]PETSC ERROR: Option left: name:-nw value: laplace-lagrange-block-petsc.edp
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, Nov 28, 2022 
[0]PETSC ERROR: /usr/local/bin/FreeFem++-mpi on a  named prancingpony by fabien Fri Feb 24 22:16:55 2023
[0]PETSC ERROR: Configure options --prefix=/usr/local/ff-petsc/r --with-debugging=0 COPTFLAGS="-O3 -mtune=generic" CXXOPTFLAGS="-O3 -mtune=generic" FOPTFLAGS="-O3 -mtune=generic" --with-cxx-dialect=11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-cudac=0 --download-slepc --download-hpddm --download-slepc-commit=d9df183c341775ffc91510ec6827e13db44a25cd --with-cc=/usr/bin/mpicc --with-cxx=/usr/bin/mpic++ --with-fc=/usr/bin/mpif90 --with-scalar-type=real --with-blaslapack-include= --with-blaslapack-lib="-llapack -lblas" --download-metis --download-ptscotch --download-hypre --download-parmetis --download-superlu --download-suitesparse --download-tetgen --download-mmg --download-parmmg --download-slepc-configure-arguments=--download-arpack --download-scalapack --download-mumps PETSC_ARCH=fr
[0]PETSC ERROR: #1 MatLUFactorSymbolic_SeqAIJ() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/mat/impls/aij/seq/aijfact.c:300
[0]PETSC ERROR: #2 MatLUFactorSymbolic() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/mat/interface/matrix.c:3151
[0]PETSC ERROR: #3 PCSetUp_LU() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/pc/impls/factor/lu/lu.c:87
[0]PETSC ERROR: #4 PCSetUp() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #5 KSPSetUp() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:406
[0]PETSC ERROR: #6 KSPSolve_Private() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:825
[0]PETSC ERROR: #7 KSPSolve() at /home/fh/ff++/3rdparty/ff-petsc/petsc-3.18.2/src/ksp/ksp/interface/itfunc.c:1071
 --- system solved with PETSc (in 1.415698e-03)
 0x558e13cb27a0 VTK_FILE 2
times: compile 1.485900e-01s, execution 2.605500e-02s,  mpirank:0
 ######## We forget of deleting   0 Nb pointer,   0Bytes  ,  mpirank 0, memory leak =214592
 CodeAlloc : nb ptr  9634,  size :737608 mpirank: 0
Ok: Normal End

Then, if I run the script with 2 processes:

ff-mpirun -np 2 laplace-lagrange-block-petsc.edp

it runs to completion, but the sol vector is full of inf values.

Do you have the same result?

I am seeing the same error.

I would guess the problem has something to do with the fact that you haven’t partitioned your mesh using buildDmesh (roughly the idiom sketched below). I can get a converged result (although it takes much longer than I would expect) if I use:
set(M,sparams=" -ksp_type fgmres -ksp_monitor -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_detect_saddle_point -fieldsplit_0_ksp_type cg");
But not if I use set(M,sparams="-pc_type lu");
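
The buildDmesh idiom I have in mind is roughly the following (a minimal sketch for a scalar P1 problem; the mesh and Mat names are placeholders, not the ones from [2]):

load "PETSc"              // FreeFem interface to PETSc
macro dimension()2// EOM
include "macro_ddm.idp"   // provides buildDmesh and createMat
mesh Th = square(40, 40); // global mesh (placeholder)
buildDmesh(Th)            // overlapping decomposition of Th across the MPI processes
Mat A;
createMat(Th, A, P1)      // distributed Mat carrying the numbering of that decomposition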

Maybe someone else can offer better, more specific, guidance.

Why does it fail with 1 process but not with 2 processes?

On the one hand, by default on a single process, PETSc uses its own LU factorization, which can be quite inefficient for large or difficult problems without proper tuning. In particular, it needs additional options to handle matrices with zeros on the diagonal, which is exactly what the error message says (it could hardly be clearer, in my opinion). On the other hand, this implementation does not work on more than one process, so with 2 processes PETSc automatically switches to MUMPS, which can, in theory, handle zeros on the diagonal. To stick with MUMPS even on a single process, use -pc_factor_mat_solver_type mumps.
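
In the script, that could look like this (a sketch, assuming the PETSc Mat is called M as in [2]; the same option can just as well be passed on the command line, as in the run shown further down):

set(M, sparams = "-pc_type lu -pc_factor_mat_solver_type mumps");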

Why don’t I get the proper solution?

If you run the code with -info, you’ll see in the log:

 On return from DMUMPS, INFOG(1)=              -9
 On return from DMUMPS, INFOG(2)=            7662
[0] <mat> MatFactorNumeric_MUMPS(): MUMPS in numerical factorization phase: INFOG(1)=-9, INFO(2)=7662, problem with workarray

If you look at the MUMPS manual, you’ll see this can be fixed by adding the option -mat_mumps_icntl_14 100 (or some other large value). Now, you’ll get:

$  ff-mpirun -n 1 laplace-lagrange-block-petsc.edp -pc_factor_mat_solver_type mumps -mat_mumps_icntl_14 200 -ksp_monitor_true_residual -v 0
  0 KSP preconditioned resid norm 4.775939688955e+01 true resid norm 2.568843773633e-02 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 7.865352110167e-10 true resid norm 1.619158105473e-11 ||r(i)||/||b|| 6.303061798043e-10

Thank you both for these helpful explanations! That solved my problem.
