SNES with multigrid for the Jacobian

Hello Everyone,

I am trying to set up a simple PETSc multigrid method for inverting the Jacobian in SNES. However, even after examining the available examples, I am struggling to figure this out. I am attaching an MWE, which is a modification of an example FreeFEM script.

Any help is appreciated!
navier-stokes-2d-PETSc_prec_MWE.edp (3.9 KB)

What is the problem?

Well, the code I attached produces the following error, and I have no idea why. My guess is that the options I am trying to pass to the matrices are not registered.

[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: PCASMSetLocalSubdomains() should be called before calling PCSetUp().
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-nw value: navier-stokes-2d-PETSc_prec_MWE.edp source: command line
[0]PETSC ERROR:   Option left: name:-v value: 0 source: command line
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: PETSc Development Git Revision: v3.22.3-387-g1406e4c8e14 Git Date: 2025-02-18 22:08:51 +0000
[0]PETSC ERROR: /home/andrasz/freefem/FreeFem-sources/src/mpi/FreeFem++-mpi with 4 MPI process(es) and PETSC_ARCH arch-FreeFem on cnre-ws-11 by andrasz Wed Feb 26 10:06:41 2025
[0]PETSC ERROR: Configure options: --download-mumps --download-parmetis --download-metis --download-hypre --download-superlu --download-slepc --download-hpddm --download-ptscotch --download-suitesparse --download-scalapack --download-tetgen --with-fortran-bindings=no --with-scalar-type=real --with-debugging=no
[0]PETSC ERROR: #1 PCASMSetLocalSubdomains_ASM() at /home/andrasz/freefem/petsc/src/ksp/pc/impls/asm/asm.c:738
[0]PETSC ERROR: #2 PCASMSetLocalSubdomains() at /home/andrasz/freefem/petsc/src/ksp/pc/impls/asm/asm.c:946
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
Abort(59) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

I tried to reverse engineer the solution from examples (1 and 2), but I could not figure it out. I do not understand how options are passed to the KSP object that is present within SNES (or EPS), and this is what I hope someone can help me with.
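If I understand the linked examples correctly, the pattern is roughly the following (a schematic fragment, not my full script; funcJ, funcRes and x stand for the Jacobian callback, the residual callback and the initial guess, as in the Navier-Stokes example), but I am not sure this is really how the options reach the inner KSP:

// Schematic fragment only: KSP/PC options seem to be attached to the Mat that is
// handed to SNESSolve, while SNES-level options go through SNESSolve's own sparams.
set(J[0], sparams = "-ksp_type fgmres -ksp_monitor -ksp_converged_reason -pc_type mg");
SNESSolve(J[0], funcJ, funcRes, x, sparams = "-snes_monitor -snes_rtol 1e-6");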

Why are you specifying the O parameters in your set() calls?

No idea - I saw this in the Maxwell multigrid example. If I do not specify it, I get the following error:

  0 SNES Function norm 1.183215956620e+01
  0 SNES Function norm 1.183215956620e+01
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
Abort(59) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

Does your code run without multigrid? Preconditioning should be your last concern.

Yes, if I specify set(J[0], sparams = " -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps ");, the code runs the same way as the original one: FreeFem-sources/examples/hpddm/navier-stokes-2d-PETSc.edp at develop · FreeFem/FreeFem-sources · GitHub

OK, the next question is then, why do you want to use PCMG here?

I have heard that an LU factorization of the operator assembled with a lower discretization order can be used as a preconditioner for an iterative solver. It is like geometric multigrid, but instead of h-coarsening (coarser mesh with the same discretization order) it uses p-coarsening (same mesh, lower discretization order). I want to try it out, and PETSc's PCMG seems the natural way to do it (I also tried applying the P^T A^{-1} P product as a preconditioner myself, but it did not work).
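Concretely, the solver I have in mind looks something like this (just a sketch of the PETSc options; attaching the low-order operator and the prolongation P[0] to the PCMG is done elsewhere in the attached script, and that wiring is exactly the part I am unsure about):

// Sketch: two-level PCMG, coarse level = low-order operator solved exactly with MUMPS,
// fine level = a few smoother iterations, FGMRES on the outside.
set(J[0], sparams = "-ksp_type fgmres -ksp_monitor "
    + "-pc_type mg -pc_mg_levels 2 "
    + "-mg_coarse_ksp_type preonly -mg_coarse_pc_type lu -mg_coarse_pc_factor_mat_solver_type mumps "
    + "-mg_levels_ksp_type richardson -mg_levels_pc_type sor");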

Usually, you want to refine as you decrease the discretization order (that’s called low-order-refined, LOR). Have you checked that your restriction operator is correct (matrix P[0])?
You should put the set() outside of the func, but you'll still get the same result in the end: your preconditioner is ill-posed (so GMRES doesn't converge, and neither does SNES).
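A quick sanity check of the prolongation is to apply it to a function that both spaces represent exactly, something along these lines (a minimal standalone sketch; Vh/VhCoarse are placeholders for whatever spaces your script uses):

// Standalone check of a p-type prolongation (P1 -> P2 on the same mesh).
mesh Th = square(20, 20);
fespace VhCoarse(Th, P1);      // low-order space (coarse level)
fespace Vh(Th, P2);            // high-order space (fine level)
matrix P = interpolate(Vh, VhCoarse); // maps VhCoarse coefficients to Vh, i.e. the role of P[0]

VhCoarse uc = 1 + 2*x - 3*y;   // affine function, exact in both spaces
Vh uref = 1 + 2*x - 3*y;       // direct interpolation on the fine space
Vh uf;
uf[] = P*uc[];                 // prolongation of the coarse coefficients

real[int] diff = uf[] - uref[];
cout << "prolongation error (should be ~0): " << diff.linfty << endl;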

Thanks for the answer. There was indeed a bug in my code (a wrong coarse element space, which was ill-conditioned).

However, what I find odd is that even if I use the exact same commands as in the FreeFem-sources/examples/hpddm/helmholtz-mg-2d-PETSc-complex.edp at develop · FreeFem/FreeFem-sources · GitHub example, I do not get any output with KSP Residual norm, although -ksp_view forces the KSP object to be displayed. The KSP object seems to have the options I specified, yet it is as if nothing happens. I attach the output of the code.
ns.log (8.7 KB)

Run with -ksp_converged_reason. Before tackling the nonlinear solver, I would start by looking at something simpler, like Stokes or Oseen.
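Note that -ksp_view only prints the solver configuration; the "KSP Residual norm" lines only appear with -ksp_monitor, and only if the KSP actually iterates. A sketch of the diagnostics, reusing the J[0] name from your snippet above:

// -ksp_monitor prints the residual history, -ksp_converged_reason prints why the solve stopped.
set(J[0], sparams = "-ksp_monitor -ksp_converged_reason");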


Using -ksp_converged_reason, I got the following error:
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to SUBPC_ERROR

I will try out the preconditioner on a simpler similar system as you suggested.