I am trying to estimate the H1-seminorm error of my solution, but unfortunately I do not have the analytical solution.
So I compute my solution on a very dense mesh “mesh1” (around 170k vertices) to obtain the “real” one, and then compute it on a mesh “mesh2” with the same geometry but much less dense (around 2000 vertices).
Now I want to compare the values of the “real” solution and of the other solution AT EACH VERTEX OF “mesh2”.
But it seems that the program takes just the first 2000 values of the real solution, and not the values at the same positions. In detail, I use this piece of code:
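Roughly, the computation looks like this (a simplified sketch: the real geometry, PDE and boundary conditions are different, here I just use the unit square and a Poisson problem as a placeholder):

```
// fine mesh "mesh1" (reference) and coarse mesh "mesh2", same geometry
mesh Th1 = square(400, 400);   // placeholder geometry, ~160k vertices
mesh Th2 = square(45, 45);     // same square, ~2100 vertices

fespace Vh1(Th1, P1);
fespace Vh2(Th2, P1);
Vh1 u1, v1;   // "real" solution on the fine mesh
Vh2 u2, v2;   // solution on the coarse mesh

func f = 1.;

// placeholder Poisson problems, one per mesh
solve poissonFine(u1, v1) =
    int2d(Th1)(dx(u1)*dx(v1) + dy(u1)*dy(v1))
  - int2d(Th1)(f*v1)
  + on(1, 2, 3, 4, u1=0);
solve poissonCoarse(u2, v2) =
    int2d(Th2)(dx(u2)*dx(v2) + dy(u2)*dy(v2))
  - int2d(Th2)(f*v2)
  + on(1, 2, 3, 4, u2=0);

// H1-seminorm of the difference, integrated on the coarse mesh
real errH1 = sqrt( int2d(Th2)( (dx(u1)-dx(u2))^2 + (dy(u1)-dy(u2))^2 ) );
cout << "H1-seminorm error = " << errH1 << endl;
```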
When I subtract u2 from u1 in the integral, I assume it basically takes the value of the solution at each vertex of the mesh and performs the subtraction “one by one”.
My question is: does the algorithm take the value of u1 computed at the same position “(x_vertex, y_vertex)” as u2, or does it just take the first 2000 values stored inside the solution (at different positions)?
When you subtract u2 from u1 in the integral, the values are basically taken at the quadrature points of each triangle, and which triangles (and quadrature points) are used depends on the term int2d(Th2) or int2d(Th1).
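In other words, with int2d(Th2)(...) the integrand (including u1, which lives on Th1) is evaluated by interpolation at the quadrature points of the triangles of Th2; the arrays u1[] and u2[] are never compared index by index. If you want an explicit vertex-by-vertex comparison on “mesh2”, you can first interpolate u1 onto the coarse FE space, for example (a sketch reusing the names above):

```
// interpolate the fine solution on the coarse space:
// for P1 elements, u1 is evaluated exactly at the vertices of Th2
Vh2 u1on2 = u1;

// vertex-wise difference on Th2
Vh2 diff = u1on2 - u2;
cout << "max vertex difference = " << diff[].linfty << endl;
```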
So, computing the error as you proposed before, and accepting the fact that my “real” solution is not strictly the exact one but just an approximation, do I obtain a reasonable estimate of the H1-seminorm error?