Memory allocation problem in MPI computation using PETSc

Unfortunately I do not have any alternative other than using large meshes for the solution of such a problem…

The good news is that using ParMETIS as the partitioner lets the execution get past the partitioning step and reach the solution of the algebraic system. I’ll let you know whether the problem is actually solved as well!

I should have known better: in your first post, the error “***Failed to allocate memory for adjncy.” comes from METIS. You could simply switch back to your initial code (and still use -Dpartitioner=parmetis); that should also make the problem go away.
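
For reference, a minimal sketch of the kind of setup being discussed, assuming a recent FreeFEM with the PETSc interface; the mesh, element order, launch command, and process count are placeholders, and the partitioner is selected through the -Dpartitioner=parmetis flag mentioned above:

```
// Launch with, for instance (command and process count are illustrative):
//   ff-mpirun -np 4 script.edp -Dpartitioner=parmetis -v 0
load "PETSc"                 // PETSc interface for FreeFEM
load "msh3"                  // for cube()
macro dimension()3// EOM     // macro_ddm.idp needs the space dimension
include "macro_ddm.idp"      // distributed-mesh macros (buildDmesh, createMat, ...)

mesh3 Th = cube(40, 40, 40); // placeholder mesh, replace with your own large mesh
buildDmesh(Th)               // partitioning handled by ParMETIS via -Dpartitioner=parmetis
```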

Sorry @prj for the delay in my answer, but I was waiting for my simulation to start on the cluster I’m working on.

You are right: using ParMETIS in the original script as well, the METIS error does not appear. However, once the script reaches the solve line, sol[] = AA^-1 * bb, a PETSc error appears, as reported below:

Do you think that changing the pc_type, maybe to hypre, may help solve this issue? Is it straightforward to switch from -pc_type lu to -pc_type hypre? I see that on this line of maxwell-3d-PETSc.edp (FreeFem-sources/maxwell-3d-PETSc.edp at master · FreeFem/FreeFem-sources · GitHub) additional parameters are passed, for instance coordinates and gradient. What are they?
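
For concreteness, a hedged sketch of the setup being asked about: the current direct-solver configuration with MUMPS, assembled on the distributed mesh from the sketch above. The names (Vh, vPb, AA, bb) and the bilinear form are placeholders, not the actual script:

```
// A scalar placeholder problem on the distributed mesh Th from the previous sketch.
fespace Vh(Th, P1);
varf vPb(u, v) = int3d(Th)(dx(u)*dx(v) + dy(u)*dy(v) + dz(u)*dz(v))
               + int3d(Th)(v) + on(1, u = 0);

Mat AA;
createMat(Th, AA, P1)        // attach the domain-decomposition data to the PETSc Mat
AA = vPb(Vh, Vh);            // assemble the distributed matrix
real[int] bb = vPb(0, Vh);   // assemble the right-hand side

set(AA, sparams = "-pc_type lu -pc_factor_mat_solver_type mumps"); // current direct solve
Vh sol;
sol[] = AA^-1 * bb;          // the solve line where the PETSc error shows up
```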

Is there somewhere a summary of all the possible choices of -pc_type and -pc_factor_mat_solver_type that can be used in FreeFEM?

Thanks again for your support!

Maybe changing the PC will fix things, but I cannot say for sure. The additional parameters are for using AMS: AMS — hypre 2.26.0 documentation. The list of PETSc preconditioners is available at KSP: Linear System Solvers — PETSc 3.18.0 documentation.
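
As an illustration (not a guaranteed fix), switching the preconditioner only means changing the option string passed to set(); whether hypre converges depends on the problem. The AMS-specific coordinates/gradient arguments from maxwell-3d-PETSc.edp are only relevant for edge-element (Maxwell-type) problems:

```
// Purely algebraic switch: hypre (BoomerAMG by default) with an iterative Krylov method.
set(AA, sparams = "-pc_type hypre -ksp_type gmres -ksp_monitor -ksp_converged_reason");
sol[] = AA^-1 * bb;

// For AMS (Nedelec edge elements), maxwell-3d-PETSc.edp additionally passes the vertex
// coordinates and a discrete gradient operator, roughly:
//   set(AA, sparams = "-pc_type hypre -pc_hypre_type ams", coordinates = coords, gradient = G);
// where coords and G are built in that example; they are not needed for a nodal (Lagrange) problem.
```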