Is a pointwise Dirichlet constraint using setBC unsuitable for a PETSc parallel environment?

Dear FreeFem++ community and developers:
The setBC function for enforcing Dirichlet boundary conditions is very useful in the solid mechanics field, as shown in examples/3d/Elasticity-simple-support-BC.edp.
I have tested several tutorials using this feature, and it works well in sequential runs.
But when I move to a parallel version, in particular using macro_ddm.idp and PETSc, it is sometimes unstable.
The error is "Assertion fail: (IA[k]-fortran>=0 ) && (IA[k]-fortran<=nn), from line 1009 of HashMatrix.cpp", which suggests that an index falls outside its valid range.

A typical template code is as follows:

load "PETSc"
macro dimension() 3 //
include "macro_ddm.idp"

func Pk3 = [P2, P2, P2];
macro def(u) [u, u#B, u#C] //
macro init(u) [u, u, u] //

// Mesh
mesh3 Th = readmesh3("./mesh/mesh.msh");

// Domain decomposition
Mat A;
DmeshCreate(Th);
MatCreate(Th, A, Pk3);

// FE space
fespace Vh(Th, Pk3);
Vh [u1, u2, u3];

// Variational formulation (epsV, C, rho, gravity, and the label nDr0 are defined elsewhere)
varf vA([u1, u2, u3], [v1, v2, v3])
    = int3d(Th)(epsV(u1, u2, u3)' * C * epsV(v1, v2, v3));

varf vb([u1, u2, u3], [v1, v2, v3])
    = int3d(Th)(rho * (gravity' * [v1, v2, v3]));

varf vbp([u1, u2, u3], [v1, v2, v3])
    = on(nDr0, u1 = 1, u2 = 1, u3 = 1);

// Matrix form
matrix Aloc = vA(Vh, Vh);
real[int] b = vb(0, Vh);

// Dirichlet boundary condition
real[int] bp = vbp(0, Vh, tgv = 1);

// Enforce Dirichlet boundary condition
setBC(Aloc, bp, -2);
b = bp ? 0 : b;

// PETSc matrix
A = Aloc;

// Solve problem
set(A, sparams = "-pc_type hypre");
u1[] = A^-1 * b;  // note: the solve writes into the dof array u1[], not the FE function u1

The error usually occurs at the line "setBC(Aloc, bp, -2);" or "A = Aloc;".

Could it be that, in the local problem on MPI process i, a dof index in the array bp exceeds the bounds of the dof index set of the local mesh?

Here, I did not use the "on" operator to enforce the Dirichlet boundary condition because, for a pointwise constraint, "on" fails and only setBC works.
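For illustration, here is a minimal sketch of how such a pointwise constraint can be marked by hand and passed to setBC. The point (x0, y0, z0) and the interleaved component ordering of [P2, P2, P2] dofs are assumptions, and in a distributed run each rank must build its own bp with bp.n equal to the local Vh.ndof, with only the rank(s) whose local mesh actually contains the point flagging the entries:

// Hypothetical sketch: pin all three displacement components at the dof
// nearest to a given point; all names and values here are illustrative.
real x0 = 0.0, y0 = 0.0, z0 = 0.0;
fespace Qh(Th, P2);                 // scalar counterpart of one component of Pk3
Qh d2 = (x - x0)^2 + (y - y0)^2 + (z - z0)^2;
int kmin = 0;
for (int k = 1; k < Qh.ndof; ++k) {
    if (d2[][k] < d2[][kmin]) kmin = k;
}
real[int] bp(Vh.ndof);              // one marker per local dof, so bp.n == Vh.ndof
bp = 0;
for (int c = 0; c < 3; ++c) {
    bp[3 * kmin + c] = 1;           // assumes interleaved component ordering of vector dofs
}
setBC(Aloc, bp, -2);
b = bp ? 0 : b;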

Your code is not copy/paste-able, so I can't help. There is no issue with setBC() and PETSc, so the problem is in your code.

For good measure, here is a slight modification of examples/hpddm/elasticity-3d-PETSc.edp (FreeFem/FreeFem-sources, develop branch on GitHub), which shows that what you want to do works:

//  run with MPI:  ff-mpirun -np 4 script.edp
// NBPROC 4

load "PETSc"                        // PETSc plugin
macro dimension()3// EOM            // 2D or 3D
include "macro_ddm.idp"             // additional DDM functions

macro def(i)[i, i#B, i#C]// EOM     // vector field definition
macro init(i)[i, i, i]// EOM        // vector field initialization
real Sqrt = sqrt(2.0);
macro epsilon(u)[dx(u), dy(u#B), dz(u#C), (dz(u#B) + dy(u#C)) / Sqrt, (dz(u) + dx(u#C)) / Sqrt, (dy(u) + dx(u#B)) / Sqrt]// EOM
macro div(u)(dx(u) + dy(u#B) + dz(u#C))// EOM
func Pk = [P1, P1, P1]; // finite element space


int[int] LL = [2,3, 2,1, 2,2];
mesh3 Th = cube(10 * getARGV("-global", 5), getARGV("-global", 5), getARGV("-global", 5), [10 * x, y, z], label = LL);
Mat A;
macro ThRefinementFactor()getARGV("-split", 1)//
MatCreate(Th, A, Pk);

real f = -9000.0;
real strain = 100.0;
real Young = 1.0e8;
real poisson = 0.45;
real tmp = 1.0 + poisson;
real mu = Young  / (2.0 * tmp);
real lambda = Young * poisson / (tmp * (1.0 - 2.0 * poisson));
varf vPb(def(u), def(v)) = int3d(Th)(lambda * div(u) * div(v) + 2.0 * mu * (epsilon(u)' * epsilon(v))) + int3d(Th)(f * vC); // '
varf vbp([u1, u2, u3], [v1, v2, v3]) = on(1, u1 = 1, u2 = 1, u3 = 1);
fespace Wh(Th, Pk);                 // local finite element space
real[int] bp = vbp(0, Wh, tgv = 1);
matrix Loc = vPb(Wh, Wh);
real[int] rhs = vPb(0, Wh);
setBC(Loc, bp, -2);
rhs = bp ? 0 : rhs;
Wh<real> def(u);                    // local solution

A = Loc;

macro def1(u)u// EOM

Wh<real> def(Rb)[6];
[Rb[0], RbB[0], RbC[0]] = [1, 0, 0];
[Rb[1], RbB[1], RbC[1]] = [0, 1, 0];
[Rb[2], RbB[2], RbC[2]] = [0, 0, 1];
[Rb[3], RbB[3], RbC[3]] = [y, -x, 0];
[Rb[4], RbB[4], RbC[4]] = [-z, 0, x];
[Rb[5], RbB[5], RbC[5]] = [0, z, -y];
set(A, sparams = "-pc_type gamg -ksp_type gmres -ksp_max_it 200 -pc_gamg_threshold 0.01", nearnullspace = Rb);
u[] = A^-1 * rhs;
plotMPI(Th, u, P1, def1, real, cmm = "Global solution");
real alpha = 1.0;
mesh3 ThMoved = movemesh3(Th, transfo = [x + alpha * u, y + alpha * uB, z + alpha * uC]);
u[] = mpirank;
plotMPI(ThMoved, u, P1, def1, real, cmm = "Global moved solution");

Dear Prof. Jolivet,

Thanks for your kind reply. I will check my code against the example elasticity-3d-PETSc.edp you provided.
If the problem persists, I will post a minimal working example as soon as possible.

Best regards.

Dear Prof. Jolivet,

I checked my code and read the tutorial you provided above. setBC works well when I run a stationary study, that is, when the PETSc solve "u[] = A^-1 * rhs" needs to be performed only once.

But when I run a transient study using the backward Euler algorithm, the problem still appears intermittently. A frequent failure mode is that the simulation runs fine for the first several time steps and then fails at a seemingly random time step with the error:
Assertion fail : ((IA(k)-fortran>=0 ) && (IA(k)-fortran<=nn ) ). err code 6, mpirank 16

I tried changing the PETSc preconditioner from hypre to gamg; this sometimes helps, but the error still occurs occasionally. Sometimes it is triggered by the "setBC(Aloc, bp, -2)" line, and sometimes by the "A = Aloc" line.

Is it because the transient study requires many repeated matrix and vector operations, and these operations, especially the conversion from a FreeFem matrix to a PETSc Mat, are not robust enough?
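For reference, here is a minimal self-contained sketch of the "assemble once, solve many times" pattern in question, reduced to a scalar heat equation purely for illustration (all names, labels, and parameters here are made up; it assumes the matrix is constant in time, so setBC and the FreeFem-to-PETSc transfer happen only once, outside the time loop):

load "PETSc"
macro dimension() 3 //
include "macro_ddm.idp"

macro def(u) u //
macro init(u) u //
func Pk = P1;

// Distributed mesh and PETSc matrix, set up once
mesh3 Th = cube(10, 10, 10);
Mat A;
DmeshCreate(Th);
MatCreate(Th, A, Pk);
fespace Vh(Th, Pk);

real dt = 0.01;
Vh u = 0, uOld = 0;

// Backward Euler: (M/dt + K) u^{n+1} = (M/dt) u^n, homogeneous Dirichlet on label 1
varf vA(w, v) = int3d(Th)(w * v / dt + dx(w) * dx(v) + dy(w) * dy(v) + dz(w) * dz(v));
varf vRHS(w, v) = int3d(Th)(uOld * v / dt);  // evaluated with the current uOld at each assembly
varf vBC(w, v) = on(1, w = 1);

matrix Aloc = vA(Vh, Vh);
real[int] bp = vBC(0, Vh, tgv = 1);
setBC(Aloc, bp, -2);                  // done once: the matrix does not change between steps
A = Aloc;                             // done once: single FreeFem -> PETSc transfer
set(A, sparams = "-pc_type gamg");

for (int n = 0; n < 10; ++n) {
    uOld = u;
    real[int] rhs = vRHS(0, Vh);      // only the right-hand side is rebuilt each step
    rhs = bp ? 0 : rhs;
    u[] = A^-1 * rhs;
}

If the matrix does change between steps, the full sequence (reassemble Aloc, rebuild bp, setBC, then A = Aloc) would instead have to be repeated consistently at every step.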

No, there is no stability issue; again, the problem is in your code. Share more details, because it is not possible to help you otherwise.

Dear Prof. Jolivet,

Thank you for your reply. I have sent a minimal working example to your email; please check it.

Best regards.

This problem was partially solved after updating FreeFem++ to version 4.15.