# Differing Eigenfunctions in Single Core and PETSc

Hello! I am using PETSc/SLEPc to compute an eigenvalue problem. Comparing the results of the parallel code run on a single process with those of a sequential code, I get the same eigenvalues and matrices, but the eigenfunctions differ. I’d like to export them later on for further computations.

Since the generated matrices are identical in Single.edp and in Parallel.edp when Parallel.edp runs on a single process, I expected the same for the eigenvectors. Do I misunderstand the format of the eigenvectors in SLEPc, or is there some scaling or something else that I am missing? I want them to satisfy the equation A*u = lambda*B*u.

I’m glad to hear any input, thank you!

Parallel.edp (2.5 KB)
Single.edp (1.7 KB)

``````
$ ff-mpirun -n 1 Parallel.edp -v 0
'/opt/homebrew/bin/mpiexec' -n 1 /Volumes/Data/repositories/FreeFem-sources-opt/src/mpi/FreeFem++-mpi -nw 'Parallel.edp' -v 0
++ WARNING: SLEPc-complex has been superseded by PETSc-complex
current line = 31 mpirank 0 / 1
Exec error :  Error points  border points to close < diameter*1e-7
-- number :1
Exec error :  Error points  border points to close < diameter*1e-7
-- number :1
err code 8 ,  mpirank 0
``````

How do you run it?

I’m sorry! `ff-mpirun -n 1 Parallel.edp -nev 5 -mq 3 -ref 1` should work.

I’ve checked with

``````
$ ff-mpirun -n 4 Parallel.edp -v 0 -nev 5 -mq 3 -ref 1 -eps_view_values -eps_view_mat0 ascii:A.m:ascii_matlab -eps_view_mat1 ascii:B.m:ascii_matlab
Eigenvalues =
0.07781+0.96891i
0.11059+0.96892i
0.16528+0.96894i
0.24194+0.96898i
0.34067+0.96902i
``````

In MATLAB, I do `eigs(Mat_0xc4000001_0,Mat_0xc4000001_1,5,0.0)` and get:

``````
ans =

0.0778 + 0.9689i
0.1106 + 0.9689i
0.1653 + 0.9689i
0.2419 + 0.9690i
0.3407 + 0.9690i
``````

So the eigenvalues match. There is no real reason for the eigenvectors to match exactly: each eigenvector is only defined up to a nonzero scalar factor, and what ultimately matters is the space spanned by the eigenvectors, not each individual vector.
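To make the point above concrete, here is a small NumPy sketch (my own toy construction, not from the thread): rescaling a generalized eigenvector by an arbitrary complex factor still satisfies A*u = lambda*B*u, which is why two correct solvers can return visibly different vectors for the same eigenvalue.

```python
import numpy as np

# Toy dense generalized eigenvalue problem A u = lambda B u.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Solve via B^{-1} A (fine for a small dense toy problem).
lam, V = np.linalg.eig(np.linalg.solve(B, A))
u = V[:, 0]
u_other = (0.3 - 0.7j) * u  # the "same" eigenvector, rescaled as another solver might return it

# Both vectors satisfy A u = lambda B u to numerical precision.
res = np.linalg.norm(A @ u - lam[0] * (B @ u))
res_other = np.linalg.norm(A @ u_other - lam[0] * (B @ u_other))
print(res < 1e-10, res_other < 1e-10)
```

So exporting eigenvectors from two different runs is fine as long as any later computation only relies on the eigenpair relation, not on a particular normalization.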

OK, thank you. In the sequential code, when I export the matrices A and B and an eigenpair (u, lambda), the equation A*u = lambda*B*u is satisfied. So I was hoping for the same from the parallel code, but since the eigenvector is very different, it is not satisfied at all.

That’s because you are using the default `tgv` value of 10^30, which enforces the Dirichlet conditions by putting huge penalty values on the boundary rows of both matrices. Apply these changes:

``````
[...]
varf b(u,v) = int3d(Th3) (u*v) + on(0,3,u=0);
[...]
A = a(Vh,Vh, tgv = -1);
B = b(Vh,Vh, tgv = -10);
[...]
``````

Then run with the options `-eps_error_backward ::ascii_info_detail -eps_error_relative ::ascii_info_detail -eps_error_absolute ::ascii_info_detail`, and you’ll get the proper errors:

``````
 ---------------------- --------------------
k                ||Ax-kBx||
---------------------- --------------------
0.077808+0.968905i       1.26646e-10
0.110594+0.968920i       2.58495e-12
0.165283+0.968945i       3.23503e-13
0.241944+0.968978i       1.05112e-10
0.340670+0.969017i       1.10005e-10
---------------------- --------------------
---------------------- --------------------
k             ||Ax-kBx||/||kx||
---------------------- --------------------
0.077808+0.968905i        1.3029e-10
0.110594+0.968920i       2.65065e-12
0.165283+0.968945i       3.29118e-13
0.241944+0.968978i       1.05246e-10
0.340670+0.969017i       1.07096e-10
---------------------- --------------------
---------------------- --------------------
k                 eta(x,k)
---------------------- --------------------
0.077808+0.968905i         1.612e-11
0.110594+0.968920i       3.29007e-13
0.165283+0.968945i       4.11699e-14
0.241944+0.968978i       1.33736e-11
0.340670+0.969017i       1.39899e-11
---------------------- --------------------
Eigenvalues =
0.07781+0.96891i
0.11059+0.96892i
0.16528+0.96894i
0.24194+0.96898i
0.34067+0.96902i
``````
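To see why the huge default penalty is harmful, here is a toy sketch (my own 2-DOF construction, not FreeFEM output, and a simplification of what `tgv` actually does): when the penalized Dirichlet row appears on the diagonal of both A and B, its ratio tgv/tgv contributes a spurious eigenvalue near 1, whereas exact elimination of the boundary DOF leaves only the true interior eigenvalue.

```python
import numpy as np

# Hypothetical 2-DOF toy: DOF 0 is an interior unknown, DOF 1 carries a
# Dirichlet condition enforced by a large penalty tgv in both A and B.
tgv = 1e30
A = np.array([[2.0, 1.0], [1.0, tgv]])
B = np.array([[1.0, 0.0], [0.0, tgv]])

# Penalized problem: the boundary row yields a spurious eigenvalue tgv/tgv ~ 1,
# alongside the true interior eigenvalue ~ 2.
lam = np.sort(np.linalg.eigvals(np.linalg.solve(B, A)).real)
print(lam)

# Exact elimination (the idea behind tgv = -1): drop the Dirichlet DOF and keep
# only the interior block, so only the true eigenvalue 2/1 remains.
lam_elim = 2.0 / 1.0
print(lam_elim)
```

The enormous penalty entries also mean that any residual A*u - lambda*B*u computed with the exported penalized matrices is dominated by the 10^30 rows, which is why the eigenpair check appeared to fail.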

Thank you so much! That’s what I was looking for.
If I run on multiple processes, what is the best way to export the (global) eigenvector to MATLAB?

`-eps_view_vectors ascii:file.m:ascii_matlab`, I think.
