Global solution with PETSc

For verification purposes I would like to compare a solution computed in parallel with the sequentially computed one.

As a basis for a minimal working example I took an example from Pierre Jolivet’s tutorial and tried to reduce the distributed local solutions to a global one. I tried to accomplish this the same way it is done in the maxwell-3d-PETSc.edp example:

load "PETSc"
load "medit"

macro dimension()2// EOM
include "macro_ddm.idp"

real c = 6.25;
int N  = 80;

mesh M;
{
    border a(t=0,1) { x=t;   y=0;   label=1; };
    border b(t=0,1) { x=1;   y=t;   label=1; };
    border c(t=0,1) { x=1-t; y=1;   label=1; };
    border d(t=0,1) { x=0;   y=1-t; label=1; };
    M = buildmesh(a(N)+b(N)+c(N)+d(N));
}

mesh ThPlt = M;

load "Element_P3"
func Pk = P3;
Mat J;

int[int] n2o;
macro ThN2O()n2o// EOM

createMat(M, J, Pk)

fespace Vh(M, Pk);
Vh u;
func BC = cos(pi*x)*cos(pi*y);
varf vInit(w, v) = on(1, w = BC);
varf vJ(w, v) = int2d(M)(dx(w)*dx(v) + dy(w)*dy(v) - c*exp(u)*w*v) + on(1, w = 0);
varf vRes(w, v) = int2d(M)(dx(u)*dx(v) + dy(u)*dy(v) - c*exp(u)*v) + on(1, w = u);
int it = 1;
real tol = 1.0e-6;
u[] = vInit(0, Vh, tgv = -1);
real[int] b = u[];
for(int i = 1; i < it; ++i) {
    real[int] v = vRes(0, Vh, tgv = -1);
    v -= b;
    real res = sqrt(J(v, v));
    if(mpirank == 0)
        cout << "Norm of the residual = " << res << endl;
    if(res < tol)
        break;
    matrix Loc = vJ(Vh, Vh, tgv = -1);
    J = Loc;
    real[int] inc = J^-1 * v;
    u[] -= inc;
    plotD(M, u, cmm = "Solution at iteration #" + i);
}
plotD(M, u, cmm = "Final solution");



if(!NoGraphicWindow) {
    real[int] sol;
    changeNumbering(J, u[], sol);
    changeNumbering(J, u[], sol, inverse = true);
    fespace WhPlt(ThPlt, Pk);
    WhPlt plt;
    WhPlt reduce;
    cout << mpirank << " " << u[].n << " " << sol.n << " " << Vh.ndof << " " << WhPlt.ndof << " " <<endl;
    int[int] rest = restrict(Vh, WhPlt, n2o);
    for[i, v : rest] plt[][v] = sol[i];
    mpiReduce(plt[], reduce[], processor(0, mpiCommWorld), mpiSUM);
    if(mpirank == 0)
        medit("Global solution", ThPlt, reduce); // To keep this example short I kept medit here instead of saving the solution or calculating the difference to a reference solution
}

But the restrict call (I think) results in a segfault:
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range

I would appreciate it if someone would point out my mistake.

You made two small mistakes that explain the segfault. Here are the fixes:

  1. macro ThN2O()n2o// EOM should read macro MN2O()n2o// EOM, because your mesh is named M, not Th as in maxwell-3d-PETSc.edp
  2. changeNumbering(J, u[], sol); goes from the FreeFEM numbering to the PETSc numbering, and changeNumbering(J, u[], sol, inverse = true); goes back from the PETSc numbering to the FreeFEM numbering, so after the second call the values you want are in u[], not in sol. The line for[i, v : rest] plt[][v] = sol[i]; is therefore wrong and should read for[i, v : rest] plt[][v] = u[][i]; (a sketch of the corrected block follows this list)
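
Putting the two fixes together, the end of your script would look roughly like this. This is only a sketch of your own posted code with the two lines above changed and a few comments added; I trimmed the debugging cout and the comment in the medit line:

// fix 1: the macro name must match the mesh name, which is M
int[int] n2o;
macro MN2O()n2o// EOM

createMat(M, J, Pk)

// ... Newton loop unchanged ...

if(!NoGraphicWindow) {
    real[int] sol;
    changeNumbering(J, u[], sol);                 // FreeFEM -> PETSc numbering
    changeNumbering(J, u[], sol, inverse = true); // PETSc -> FreeFEM numbering, written back into u[]
    fespace WhPlt(ThPlt, Pk);
    WhPlt plt;
    WhPlt reduce;
    int[int] rest = restrict(Vh, WhPlt, n2o);
    for[i, v : rest] plt[][v] = u[][i];           // fix 2: copy the FreeFEM-numbered values, not sol
    mpiReduce(plt[], reduce[], processor(0, mpiCommWorld), mpiSUM);
    if(mpirank == 0)
        medit("Global solution", ThPlt, reduce);
}

Running the script with, e.g., ff-mpirun -np 4 should then display the reduced global solution on rank 0.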

Let me know if I can be of any further assistance.