My simulation is adapted from the 3D magnetostatics example, except that I am using a much more complicated mesh. The line causing the out-of-memory error is:
Ax = Laplacian^-1 * LaplacianBoundary;
I assume this is caused by inverting (in practice, factorizing) the Laplacian matrix. I checked the dimensions of the matrix using Laplacian.n and Laplacian.m: it is about 300,000 x 300,000. When the dimensions are around 150,000 x 150,000 I don't get the out-of-memory error.
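For reference, this is the check I ran (just printing the dimensions of the assembled matrix; `Laplacian` is my matrix variable):

```freefem
// Print the dimensions of the assembled Laplacian matrix
cout << "Laplacian: " << Laplacian.n << " x " << Laplacian.m << endl;
```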
What confuses me is that, when I check memory usage in Task Manager, used memory doesn't even approach the maximum. Is there any workaround for this?
I was trying to implement the switch by following the documentation,
but I hit a load error at the very first step, on the line:
load "MUMPS_FreeFem"
Loading PETSc worked just fine if that’s relevant.
Does this mean something is missing in my installation?
I assume that once I'm able to load it, switching from UMFPACK to MUMPS would be as simple as calling defaulttoMUMPS(); before inverting the matrix?
At least, if my understanding of the parallelization examples is correct.
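Concretely, this is a sketch of what I have in mind (assuming the plugin loads, and that defaulttoMUMPS() makes MUMPS the default direct solver as in the examples):

```freefem
load "MUMPS_FreeFem"   // this is the line that currently fails to load for me
defaulttoMUMPS();      // switch the default sparse direct solver from UMFPACK to MUMPS
Ax = Laplacian^-1 * LaplacianBoundary;  // same solve as before, now via MUMPS
```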
If you switch to the 64-bit version of UMFPACK (load "UMFPACK64"), you should be able to solve your system. In the long run, you'll benefit from following prj's advice and embracing PETSc.
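A minimal sketch of that change, reusing the variable names from your post (the 64-bit build uses 64-bit integers internally, which, presumably, avoids the index-size failure that a 300,000 x 300,000 factorization can hit even when RAM is not exhausted):

```freefem
load "UMFPACK64"   // 64-bit integer build of UMFPACK

// ... build the mesh, fespace, and assemble Laplacian and LaplacianBoundary as before ...

set(Laplacian, solver = sparsesolver);  // sparsesolver now resolves to the loaded plugin
Ax = Laplacian^-1 * LaplacianBoundary;  // the factorization runs with 64-bit indices
```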