How to set AttachCoarseOperator?

Hello, I am studying how to set up the two-level GenEO algorithm by calling hpddm directly, and the code elasticity.edp in ~/examples/hpddm is closely related to my purpose. My question is about the setting of AttachCoarseOperator; the relevant code is lines 81-84 of this edp file:

varf vPbNoPen(def(u), def(v)) = int3d(Th)(lambda * div(u) * div(v) + 2.0 * mu * (epsilon(u)' * epsilon(v))) + on(1, u = 0.0, uB = 0.0, uC = 0.0);
matrix noPen = vPbNoPen(Wh, Wh, sym = 1);
if(deflation == "geneo")
    AttachCoarseOperator(mpiCommWorld, A, A = noPen/*, threshold = 2. * h[].max / diam,*/, ret = ret);

The problem vPbNoPen defined here is different from the one on line 54, and the term int3d(Th)(f * vC) is not included here. Why does it still work? And where can I find the detailed usage of AttachCoarseOperator?

Thanks

The linear form is irrelevant for building the coarse space. Are you sure that you know what you are looking for?
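To illustrate the point: when FreeFEM assembles a varf as a matrix, only its bilinear part is used, so adding or removing the linear term int3d(Th)(f * vC) does not change the matrix handed to AttachCoarseOperator. A minimal sketch, reusing the macros, mesh, and space already defined in elasticity.edp (vPbWithRhs is just an illustrative name):

// the linear term is ignored when the varf is assembled as a matrix
varf vPbWithRhs(def(u), def(v)) = int3d(Th)(lambda * div(u) * div(v) + 2.0 * mu * (epsilon(u)' * epsilon(v))) + int3d(Th)(f * vC) + on(1, u = 0.0, uB = 0.0, uC = 0.0);
matrix withRhs = vPbWithRhs(Wh, Wh, sym = 1); // same entries as noPen
real[int] rhs = vPbWithRhs(0, Wh);            // the term int3d(Th)(f * vC) only shows up here

Only the assembled matrix matters for building the coarse space; the right-hand side never enters.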

My purpose is to solve a thermal stress problem of large size (~10^9 dof) with the two-level DDM algorithm (GenEO), but calling it via ffddm is very memory-consuming. It seems that calling hpddm directly is a better choice. According to the code elasticity.edp in ~/examples/hpddm, the setting of AttachCoarseOperator is a key step.

I’m not sure why ffddm would be more memory demanding (maybe you are keeping the global mesh?). Anyway, if you want performance, you should just use PETSc with -pc_type hpddm, it’s the most performance-oriented implementation, and there are examples in the same folder.

I’ll add that for a thermal stress problem, the odds of AMG working and greatly outperforming GenEO are close to 100%.
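For reference, a sketch of what that could look like with the distributed Mat A from the PETSc examples. GAMG for elasticity benefits from being given the rigid-body modes as a near-null space (as in elasticity-3d-PETSc.edp); this assumes Wh is the vector displacement space and uses the nearnullspace argument of set:

// GAMG with the six rigid-body modes of 3D elasticity as near-null space
Wh[int] def(Rb)(6);
[Rb[0], RbB[0], RbC[0]] = [1, 0, 0];
[Rb[1], RbB[1], RbC[1]] = [0, 1, 0];
[Rb[2], RbB[2], RbC[2]] = [0, 0, 1];
[Rb[3], RbB[3], RbC[3]] = [y, -x, 0];
[Rb[4], RbB[4], RbC[4]] = [-z, 0, x];
[Rb[5], RbB[5], RbC[5]] = [0, z, -y];
set(A, sparams = "-pc_type gamg -ksp_type gmres -ksp_rtol 1e-6 -ksp_monitor", nearnullspace = Rb);
// or BoomerAMG from hypre, which needs no extra setup:
// set(A, sparams = "-pc_type hypre -pc_hypre_type boomeramg -ksp_type gmres -ksp_rtol 1e-6");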

I also tested using PETSc with -pc_type hpddm, but it takes much longer than PETSc with hypre/AMG (20 min vs. 5.5 min for a test problem running on my desktop PC). The settings read:

set(A, sparams = "-pc_type hpddm " +
"-pc_hpddm_levels_1_pc_type asm " +
"-pc_hpddm_levels_1_eps_nev 10 -pc_hpddm_levels_1_eps_type arpack " +
"-pc_hpddm_levels_1_sub_pc_type cholesky " +
"-pc_hpddm_levels_1_st_ksp_type gmres -pc_hpddm_levels_1_st_share_sub_ksp " +
"-pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps " +
"-pc_hpddm_levels_1_st_pc_factor_mat_solver_type mumps " +
"-pc_hpddm_coarse_p 2 -pc_hpddm_coarse_correction balanced " +
"-pc_hpddm_coarse_pc_type cholesky -pc_hpddm_coarse_pc_factor_mat_solver_type mumps " +
"-pc_hpddm_coarse_ksp_reuse_preconditioner " +
"-pc_hpddm_define_subdomains true -pc_hpddm_has_neumann true " +
"-ksp_max_it 100 -ksp_rtol 1.e-6 -ksp_type gmres -ksp_monitor -ksp_converged_reason ");

But I am not sure about these settings. Are there any errors, or possible improvements?

The code using PETSc/hypre has been moved to my lab's HPC cluster and is run by loading the container file freefem_latest.sif (version 4.12) via Singularity. The key command is:

singularity exec -e ~/freefem_latest.sif bash -c "/usr/freefem/ff-petsc/r/lib/petsc/bin/petscmpiexec -np 40 /usr/freefem/bin/FreeFem++-mpi -nw xxx.edp"

But a severe error occurs:

MatCreate The Identifier MatCreate does not exist

Error line number 51:

// line 50-51
Mat A;
MatCreate(Th, A, Pk);

Are there any errors in the command?

Again, if either GAMG or BoomerAMG works, it will be hard to beat. How many iterations do they require to converge?

About 90 iterations.

Are there any errors in the command?

There is no error, you are simply using an extremely outdated FreeFEM installation.
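If upgrading the image is not an option, a possible workaround is the older macro name that 4.12's macro_ddm.idp ships, createMat, which newer releases expose as MatCreate. A minimal sketch under that assumption, with a placeholder mesh and a P1 vector field:

load "PETSc"
load "msh3"
macro dimension()3// EOM, must be defined before including macro_ddm.idp
include "macro_ddm.idp"
macro def(i)[i, i#B, i#C]// EOM, vector unknown for 3D elasticity
macro init(i)[i, i, i]// EOM
func Pk = [P1, P1, P1];
mesh3 Th = cube(10, 10, 10);      // placeholder mesh
Mat A;
createMat(Th, A, Pk)              // older name of the construction macro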

Hi prj, let me go back to the beginning of this topic. The code calling hpddm directly runs normally on the HPC cluster (via Singularity, FreeFEM 4.12), but it converges very slowly (relative error of 0.1 after 200 iterations). I suspect that the coarse space is not built correctly. That is why I asked about the setting of AttachCoarseOperator, and especially about the meaning of vPbNoPen.
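In particular, I notice that in the quoted call the threshold argument is commented out, so the size of the GenEO coarse space must be controlled elsewhere, apparently through hpddm's runtime options. A sketch of the two ways I see to enlarge it, reusing the variables of elasticity.edp (I am assuming -hpddm_geneo_nu is the runtime option for the number of local deflation vectors):

// 1) activate the eigenvalue threshold that is commented out in the example
AttachCoarseOperator(mpiCommWorld, A, A = noPen, threshold = 2. * h[].max / diam, ret = ret);
// 2) or keep the call unchanged and request more local eigenvectors at runtime, e.g.
//    FreeFem++-mpi elasticity.edp -hpddm_geneo_nu 30

Is this the intended way to control the coarse space, or am I missing something else?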