Now it looks like I can load the MPI code OK even without mpirun.

Curious now, though, whether there is an incompatibility between the MPI that comes with
the FF PETSc and OpenMPI. It looks like my original concern was sensible: FF was
passing a list of strings without a count and without the null terminator that
the MPI implementation expected. Not sure if there are other differences,
or if that too was spurious, but it may still be worth tracking down at some point.

I haven’t come back to the “make check” failures, but since this now works in my development
setting, they are lower priority. Since there is no obvious way to be sure the MPI code is
being included when you are a few levels deep in includes, some kind
of better failure mode would be helpful.

ff-mpirun -np 4 Helmholtz_circle_Neumann.edp -wg

I’m assuming for the most part it is OK now. I still have to find out what “-wg” does, as
it seems to be required to get the plots to pop up.

I did note in the manual something about a real wave vector requirement.
Since I was thinking about lossy media, what is the origin of this, or is
complex OK? Still digging through the examples (cobra right now),
thinking about a simple capacitor with plates connected by an “inductor”
lol. Even with ideal materials it may still have radiation losses :slight_smile: I guess if the cobra cavity
geometry was tractable, this may be too. Should be more fun now if
the computer stuff works :slight_smile:

If you type FreeFem++ into the command line, you see that -wg means “with graphics”.
For MPI graphics, I recommend using VTK output, or the following repo with a global mesh output: GitHub - samplemaker/freefem_matlab_octave_plot: Examples how to plot FreeFem++ simulation results from within matlab and octave