High memory usage in vector space problems using PETSc

Hi,

During the development of simple Stokes and Navier-Stokes solvers using the PETSc interface in FF, I noticed that the suggested combination of a GMRES KSP type and an LU preconditioner (with MUMPS) uses a tremendous amount of memory even on problems that are not particularly large.
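For context, the configuration I mean is selected with generic PETSc options along these lines (a minimal sketch, not the exact code of the example; `A` is a distributed Mat built with createMat, `rhs` the assembled right-hand side, and `u` the solution function):

```
// generic PETSc options for GMRES with an exact LU factorization via MUMPS
set(A, sparams = "-ksp_type gmres -pc_type lu -pc_factor_mat_solver_type mumps");
u[] = A^-1 * rhs;   // solve with the options set above
```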

This can be seen with the stokes-3d-PETSc.edp example. If I run it with the -global 45 flag, the mesh has ~350,000 elements, i.e., ~1,600,000 degrees of freedom, yet (if I’m reading the numbers right) it needs more than 50 GB of memory to solve.
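For reference, I launch the example roughly like this (the process count is just an illustration):

```
ff-mpirun -np 4 stokes-3d-PETSc.edp -global 45
```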

Is this a bug? If not, can you suggest another KSP/preconditioner combination for the Stokes equations that requires less memory?

I don’t think it’s a bug; the memory scaling of exact factorizations is usually terrible, because the fill-in of a sparse LU factorization grows much faster than the problem size, especially in 3D. For Stokes and Navier–Stokes, your best bet is a fieldsplit preconditioner; see this example, this tutorial, and Section 2.3.5 of the PETSc manual.
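As a rough sketch of what such a configuration looks like in FF (assumptions on my part: a Taylor-Hood space `Wh`, a Mat `A` built on it, and purely illustrative solver settings; the actual values in the tutorial and example may differ):

```
fespace Wh(Th, [P2, P2, P2, P1]);     // velocity (P2^3) + pressure (P1)
Wh [fX, fY, fZ, fP] = [1, 1, 1, 2];   // tag velocity dofs with 1, pressure dofs with 2
string[int] names(2);
names[0] = "velocity";                // prefix for -fieldsplit_velocity_* options
names[1] = "pressure";                // prefix for -fieldsplit_pressure_* options
set(A, sparams = "-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur "
               + "-pc_fieldsplit_schur_fact_type upper -pc_fieldsplit_schur_precondition selfp "
               + "-fieldsplit_velocity_pc_type gamg -fieldsplit_pressure_pc_type jacobi",
    fields = fX[], names = names);
u[] = A^-1 * rhs;
```

The point is that the velocity block is preconditioned with a scalable method and the pressure block through an approximate Schur complement, so nothing ever factorizes the full saddle-point matrix.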

Thanks @prj for your help. I had already tried the fieldsplit preconditioner via the provided example, but it did not perform well. The configuration in your tutorial works much better in my experience, and I’m going to use it in my solver.