Error with Nested matrix using PETSc with Local Communicator

Hello all,

I am using PETSc with a local communicator. Below is a code example (a random problem, just to illustrate the issue):

verbosity=0;
mpiComm commGlobal(mpiCommWorld,0,0);

load "PETSc"
load "gmsh";

macro dimension() 2 // EOM (2 for 2D, 3 for 3D)
macro ThKComm() commLocal // local communicator for the mesh ThK
include "macro_ddm.idp"

for (int k=0; k<4; k++) //loop on the coarse element
{
bool elemtest =(mpirank==k%mpisize);
 if (elemtest)
 {
  mpiComm commLocal(mpiCommWorld, mpirank, 0); // MPI_Comm_split: definition of the local communicator
  mesh ThK=square(10+k,10+k);

 //********************************************************************************
 func PkX=P2;
 func PkM=P1;
 func PkX2 = [PkX, PkX];
 //**********************************************************************************
 fespace VhK(ThK, PkX2); //FE space to solve the local problem
 fespace PhK(ThK, PkM);

 //Variational formulation  //random problem
 varf vA([u,v],[uu, vv])=int2d(ThK)((dx(u)*dx(uu)+dy(u)*dy(uu)+dx(v)*dx(vv)+dy(v)*dy(vv))) + on(1,2,3,u=0,v=0) + on(4, u=1, v=0);
 varf vB([p], [uu,vv])= int2d(ThK)(-p*(dx(uu)+dy(vv)));

 //Creation of the corresponding FreeFEM matrices
 matrix A=vA(VhK,VhK);
 matrix B=vB(PhK,VhK);

//Creation of the corresponding matrices in PETSc
Mat Ap, Cp;
{
  buildDmesh(ThK)
  {
    macro def(u)[u, u #B] // don't touch
    macro init(u)[u, u] // don't touch
    createMat(ThK, Ap, PkX2)
  }
  createMat(ThK, Cp, PkM)
}
Mat Bp(Ap, Cp, B); // distributed Mat holding the entries of B, with the row numbering of Ap and the column numbering of Cp
Ap = A; // assign the FreeFEM matrix A to the distributed Mat Ap

Mat N = [[Ap,  Bp],
         [Bp', 0]]; // creation of a nested matrix (MatNest)


// RHS vector
real[int] vel(VhK.ndof);
vel = 1;

real[int] press(PhK.ndof);
press=0;

real[int] rhsPETSc;
real[int] velPETSc;
real[int] pressPETSc;

ChangeNumbering(Ap, vel, velPETSc);
ChangeNumbering(Cp, press, pressPETSc);
rhsPETSc = [velPETSc, pressPETSc];

//real[int] xx1 = Ap^-1 * velPETSc;  // this operation works
real[int] xx3 = N^-1 * rhsPETSc;     // this operation does not work

cout << "element " << k << " treated " << " with processor " << mpirank <<  endl;

}
}

I run this code with

mpiexec -np 4 FreeFem++-mpi mycode.edp

The operation real[int] xx1 = Ap^-1 * velPETSc; works perfectly.
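For reference, this is roughly how the PETSc-numbered solution of that block can then be mapped back to the FreeFEM numbering (just a sketch; solU and solV are placeholder names I introduce here):

// Sketch only: solve on the velocity block and map the PETSc-numbered
// solution back to the FreeFEM numbering of VhK (solU/solV are placeholders)
real[int] xx1 = Ap^-1 * velPETSc;
VhK [solU, solV];
ChangeNumbering(Ap, solU[], xx1, inverse = true, exchange = true);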

However, when I do the same with the nested matrix N (real[int] xx3 = N^-1 * rhsPETSc;), a dimension mismatch occurs. For example, the error on one processor is:

Submatrix dimension (144,1058) incompatible with nest block (630,1058)

I don’t know where the size 630 comes from. It seems that something is being replicated 4 times.
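To see what sizes actually enter the nest, the PETSc-numbered block sizes can be printed on each rank (a quick diagnostic sketch, nothing more):

// Diagnostic sketch: local sizes of the two blocks in the PETSc numbering
cout << "rank " << mpirank
     << ": velocity block = " << velPETSc.n
     << ", pressure block = " << pressPETSc.n
     << ", assembled RHS = "  << rhsPETSc.n << endl;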

Could someone help me build the nested matrix N correctly?

Thank you in advance,

Best regards,

Loïc,

Your script is correct, but there was no support for MatNest living on a local communicator. I have fixed this in commit FreeFem/FreeFem-sources@63cb3d3 ("Adds support for MatNest on subcommunicators"). Thank you for reporting this.
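With that commit, your line real[int] xx3 = N^-1 * rhsPETSc; should work as-is. If you later want more control over the saddle-point solve, you can attach standard PETSc fieldsplit options to the nested matrix, along these lines (just a sketch, the options are only an illustration):

// Illustration only: standard PETSc fieldsplit options attached to the nest
set(N, sparams = "-ksp_type fgmres -ksp_monitor "
               + "-pc_type fieldsplit -pc_fieldsplit_type schur "
               + "-pc_fieldsplit_schur_fact_type full "
               + "-pc_fieldsplit_schur_precondition selfp");
real[int] xx3 = N^-1 * rhsPETSc;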

Hello @prj

Thank you for your help,
I will try this and let you know,

Best regards,

Loïc,

Thanks @prj ,

My code is now working.

Best regards,

Loïc,