From FreeFEM matrix to PETSc Mat: double the memory usage?

In a parallel study using PETSc, the examples in the directory examples/hpddm usually first assemble a FreeFEM matrix like

matrix Aloc = vA(Vh, Vh);

and then assign the values of “Aloc” to a PETSc Mat like

Mat A = Aloc;

When the problem size is very large, this means twice the memory usage.
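For context, here is a minimal sketch of what this two-step pattern roughly looks like in such a script, assuming a plain 2D Laplacian and the “buildDmesh”/“createMat” macros from macro_ddm.idp (names from recent FreeFEM releases, so adapt them to your version):

load "PETSc"
macro dimension()2// EOM
include "macro_ddm.idp"

mesh Th = square(40, 40);
buildDmesh(Th)                  // distribute the mesh among the processes
fespace Vh(Th, P1);
Mat A;
{
    macro def(u)u// EOM         // scalar unknown
    macro init(u)u// EOM
    createMat(Th, A, P1)        // set up the parallel numbering of A
}
varf vA(u, v) = int2d(Th)(dx(u)*dx(v) + dy(u)*dy(v)) + on(1, u = 0);
matrix Aloc = vA(Vh, Vh);       // step 1: local FreeFEM matrix
A = Aloc;                       // step 2: its values are copied into the PETSc Mat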

One alternative is to assemble the PETSc Mat directly, like

Mat A = vA(Vh, Vh);

This works when all the Dirichlet boundary conditions can be enforced using the “on” operator, but it fails when pointwise Dirichlet conditions (constraints on individual degrees of freedom) are present.
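As an illustration of the problematic case, here is a hypothetical sketch where the unknown is pinned at a single degree of freedom (for instance to fix the constant of a pure Neumann problem), something “on” cannot express; the DOF index k and the penalty value are placeholders, and the penalization is applied to the already assembled FreeFEM matrix:

load "PETSc"
mesh Th = square(10, 10);
fespace Vh(Th, P1);
varf vA(u, v) = int2d(Th)(dx(u)*dx(v) + dy(u)*dy(v));   // pure Neumann problem, no on() at all
matrix Aloc = vA(Vh, Vh);
int k = 0;                      // hypothetical DOF to pin
real[int] d(Vh.ndof);
d = Aloc.diag;                  // extract the diagonal
d[k] = 1e30;                    // large penalty, same idea as tgv
Aloc.diag = d;                  // write it back (the k-th RHS entry must be set consistently)
Mat A = Aloc;                   // only now can the PETSc Mat be filled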

So, when dealing with pointwise Dirichlet conditions, the two steps above (first a FreeFEM matrix, then a PETSc Mat) seem unavoidable.

Another way might be to first assemble the PETSc Mat without the pointwise Dirichlet conditions, and then use a PETSc function such as “MatZeroRows” to enforce them. But this does not seem to be supported by the current FreeFem++ code.

Does anyone have another solution that avoids doubling the memory usage?

The cost of storing a sparse matrix is usually orders of magnitude lower than that of storing a preconditioner (unless you are solving something trivial with a Jacobi preconditioner). That being said, you can free the memory used by the FreeFEM matrix by simply doing:
{ matrix empty(0, 0); Aloc = empty; }

Then, there is no additional cost.
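To put this in context, here is a minimal sketch (assuming a plain 2D Laplacian on a single process for simplicity; the solver options are only an example) where the FreeFEM matrix is released as soon as its values have been copied into the Mat:

load "PETSc"
mesh Th = square(40, 40);
fespace Vh(Th, P1);
varf vA(u, v) = int2d(Th)(dx(u)*dx(v) + dy(u)*dy(v)) + int2d(Th)(v) + on(1, u = 0);
matrix Aloc = vA(Vh, Vh);               // first copy of the values
Mat A = Aloc;                           // second copy, inside the PETSc Mat
{ matrix empty(0, 0); Aloc = empty; }   // the FreeFEM copy is released here
set(A, sparams = "-ksp_type cg -pc_type gamg");
real[int] b = vA(0, Vh);
Vh u;
u[] = A^-1 * b;                         // from now on, only the Mat and its preconditioner use memory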