Iterative solution of the Navier-Stokes equation

Dear FreeFem users!

I am interested in solving (a slightly modified form of) the Navier-Stokes equations, and I would like to test whether LU factorization or iterative solvers perform better. I have run into a couple of issues that I could not figure out. Both can be reproduced with a slightly modified version of the PETSc stationary Navier-Stokes example.

  1. Changing line 56 to
    set(J, sparams = "-ksp_monitor -ksp_monitor_true_residual -ksp_type fgmres");
    results in a divergent iteration with the following message:
    0 KSP unpreconditioned resid norm 1.183215956620e+01 true resid norm -nan ||r(i)||/||b|| -nan
    Although this method is far from optimal and I will not use it, it annoys me that I cannot figure out the reason for this. Any ideas?
  2. Using the modified augmented Lagrangian (mAL) preconditioner, when the problem is solved inside a for loop, the first pass through the loop executes correctly. However, when SNESSolve is called on the second pass, the program throws an error. I attach a slightly modified version of the Navier-Stokes example (a kind of MWE) that reproduces this error. Any ideas on how this issue can be resolved?
    navier-stokes-2d-PETSc-iter-mAL-MWE.edp (4.6 KB)

Thanks for the help in advance
Andras

  1. By default, PETSc will use a block Jacobi method with an ILU(0) subdomain solver. This fails because the pressure block is identically zero. Just add a small penalization term in your Jacobian varf, e.g., 1e-8 * p * q, to make your sparams work (see the sketch after this list)
  2. Use the ChangeSchur function to update the array of Schur complements, and do not reset the PCFIELDSPLIT fields every time you go through the loop
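
For the first point, here is a minimal, self-contained sketch of what the penalized Jacobian varf could look like. The mesh, viscosity, boundary labels, and identifier names below are placeholders, not the exact ones from the example, and the sign and magnitude of the penalization may need adjusting to your convention:

    load "PETSc"
    macro dimension()2// EOM
    include "macro_ddm.idp"

    macro grad(u) [dx(u), dy(u)] // EOM
    macro div(u1, u2) (dx(u1) + dy(u2)) // EOM
    macro Pk() [P2, P2, P1] // EOM

    real nu = 1.0 / 400.0;         // placeholder viscosity
    mesh Th = square(40, 40);      // placeholder mesh and boundary labels
    buildDmesh(Th)
    fespace Wh(Th, Pk);            // Taylor-Hood pair
    Wh [u1, u2, p];                // current Newton iterate, frozen in the Jacobian

    // Jacobian of the steady Navier-Stokes residual plus a tiny pressure
    // penalization: without the 1e-8 * dp * q term, the (p, p) block is
    // identically zero and ILU(0) inside the default block Jacobi
    // preconditioner breaks down
    varf vJ([du1, du2, dp], [v1, v2, q])
        = int2d(Th)(
              nu * (grad(du1)' * grad(v1) + grad(du2)' * grad(v2))
            + (du1 * dx(u1) + du2 * dy(u1) + u1 * dx(du1) + u2 * dy(du1)) * v1
            + (du1 * dx(u2) + du2 * dy(u2) + u1 * dx(du2) + u2 * dy(du2)) * v2
            - dp * div(v1, v2)
            - div(du1, du2) * q
            - 1e-8 * dp * q        // the penalization term (adjust sign/scale as needed)
          )
        + on(1, 2, 3, 4, du1 = 0, du2 = 0); // placeholder boundary conditions

    Mat J;
    createMat(Th, J, Pk)
    J = vJ(Wh, Wh, tgv = -1);
    // with a nonzero (p, p) block, the modified line 56 from point 1 now runs
    set(J, sparams = "-ksp_monitor -ksp_monitor_true_residual -ksp_type fgmres");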

Thanks for the quick reply!

I tried to figure out how the ChangeSchur function works based on the PETSc-code.hpp file. My attempt is the following:

    if(i == 0) {
        // first pass: attach the fieldsplit data and the Schur preconditioners
        set(J, sparams = params, fields = vX[], names = names, schurPreconditioner = S, schurList = listX[]);
    } else {
        // subsequent passes: only update the array of Schur complements
        ChangeSchur(J, S, listX[]);
    }

However, this yields the following error message:

[0]PETSC ERROR: #1 PCCompositeGetPC() at /builds/workspace/FreeFEM-sources-deployDEB-withPETSc/3rdparty/ff-petsc/petsc-3.15.0/src/ksp/pc/impls/composite/composite.c:589
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see PETSc: Documentation: FAQ
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: #2 User provided function() at unknown file:0
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.

I would be glad if you could help me figure out what causes this issue. I am running FreeFem++ version 4.9, installed from the .deb package, on Ubuntu 18.04.5 LTS (bionic).

Thanks, this is fixed in commit FreeFem/FreeFem-sources@7cd3bd6 ("No PCCompositeGetPC with a single Schur Pmat.") on GitHub. If you are sticking to precompiled binaries, I'd just advise not updating the Schur complement preconditioner for i > 0; that should still converge reasonably well (I think). A possible loop skeleton is sketched below.
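
A minimal sketch of that workaround, assuming the same loop structure as in your MWE (nbIter, funcJ, funcF, x, and paramsSNES stand in for your own iteration count, Jacobian routine, residual routine, solution vector, and SNES options; the rest reuses the names from your snippet):

    for(int i = 0; i < nbIter; ++i) {
        // ... reassemble J, the right-hand side, and the Schur complement
        // array S here, exactly as in the MWE ...
        if(i == 0) {
            // attach the fieldsplit data and the Schur preconditioners once
            set(J, sparams = params, fields = vX[], names = names, schurPreconditioner = S, schurList = listX[]);
        }
        // no ChangeSchur call for i > 0: the Schur complement preconditioner
        // from the first pass is simply reused by the subsequent solves
        SNESSolve(J, funcJ, funcF, x, sparams = paramsSNES);
    }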


Thank you very much for the fix!