What is the correct way of using LinearCG?

Hi guys,
I’m trying to understand LinearCG with the following script:

func real[int] A(real[int] i) {
  real[int] a= [1,2,3,4];
  return a;
}
real[int] x(2);
real[int] B=[2,3];
LinearCG(A,x,B);
cout << x << endl;

In my understanding, this should solve the linear system

Ax = B

with the result stored in x, but I get the following error:

Exec error : CG2: Matrix is not defined (/0), sorry

Any suggestion? Many thanks!

Your operator is not linear, it is constant (it returns the same vector no matter what argument it receives), so CG fails. Note also that your A returns a vector of size 4 while x and B have size 2; the operator must return a vector of the same size as its argument.
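For reference, here is a minimal sketch of a working call (the 2x2 matrix M, the tolerance, and the iteration count are made up for illustration). The key point is that A(p) must return a vector that depends linearly on p and has the same size as x:

```freefem
// Illustrative SPD matrix; any symmetric positive definite operator works
real[int, int] Mfull = [[4, 1], [1, 3]];
matrix M = Mfull;

// A applies the matrix to its argument: linear in p, same size as p
func real[int] A(real[int] &p) {
  real[int] Ap = M*p;
  return Ap;
}

real[int] b = [1, 2];
real[int] x(2);
x = 0; // initial guess
LinearCG(A, x, b, eps=1.e-10, nbiter=50);
cout << x << endl; // the exact solution of M x = b is [1/11, 7/11]
```

CG also assumes the operator is symmetric positive definite; for a nonsymmetric or indefinite operator you would need a different solver (e.g. GMRES).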