Tuesday, November 4, 2014

#codingexercise matrix addition
int[,] Addition(int[,] left, int[,] right)
{
    if (left == null || right == null) return null;
    int lr = left.GetLength(0);   // rows of left
    int lc = left.GetLength(1);   // columns of left
    int rr = right.GetLength(0);  // rows of right
    int rc = right.GetLength(1);  // columns of right
    if (lr != rr || lc != rc) return null;
    var ret = new int[lr, lc];
    for (int r = 0; r < lr; r++)
    {
        for (int c = 0; c < lc; c++)
        {
            ret[r, c] = left[r, c] + right[r, c];
        }
    }
    return ret;
}
#codingexercise matrix subtraction

int[,] Subtraction(int[,] left, int[,] right)
{
    if (left == null || right == null) return null;
    int lr = left.GetLength(0);   // rows of left
    int lc = left.GetLength(1);   // columns of left
    int rr = right.GetLength(0);  // rows of right
    int rc = right.GetLength(1);  // columns of right
    if (lr != rr || lc != rc) return null;
    var ret = new int[lr, lc];
    for (int r = 0; r < lr; r++)
    {
        for (int c = 0; c < lc; c++)
        {
            ret[r, c] = left[r, c] - right[r, c];
        }
    }
    return ret;
}

In continuation of our discussion on the convergence of the steepest descent method, we will see that one-step convergence is possible even when the error spans a whole set of eigenvectors. For a symmetric matrix A, there exists a set of n orthogonal eigenvectors. Since eigenvectors can be scaled arbitrarily, each is chosen to be unit length, and the error term is expressed as a linear combination of these eigenvectors. The residual can then be expressed as the sum of the corresponding eigenvector components. We saw earlier that when the error lies along a single eigenvector, convergence is achieved in one step by choosing the step size as the inverse of the eigenvalue. Here all the eigenvectors share a common eigenvalue, so the same choice of step size again leads to one-step convergence.

The lowest value of the function is at the minimum of the paraboloid.
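The one-step convergence above can be checked numerically. The sketch below is a minimal illustration, not from the original post: it runs steepest descent on the quadratic f(x) = ½xᵀAx − bᵀx with the standard exact line-search step α = (rᵀr)/(rᵀAr), and the matrix A = 3I and vector b are assumed values chosen so that every direction is an eigenvector with the common eigenvalue 3.

```python
import numpy as np

def steepest_descent(A, b, x0, iters=50):
    # Minimize f(x) = 1/2 x^T A x - b^T x for symmetric positive
    # definite A; the residual r = b - A x is the negative gradient.
    x = x0.astype(float)
    for _ in range(iters):
        r = b - A @ x
        denom = r @ (A @ r)
        if denom == 0:            # zero residual: already at the minimum
            break
        alpha = (r @ r) / denom   # exact line-search step size
        x = x + alpha * r
    return x

# With A = 3 * I every vector is an eigenvector with eigenvalue 3,
# so alpha works out to 1/3 and a single step reaches the minimum.
A = 3.0 * np.eye(2)
b = np.array([6.0, 9.0])
x = steepest_descent(A, b, np.zeros(2), iters=1)
print(x)  # solves A x = b, i.e. [2, 3], in one step
```

Starting from the origin, the first residual is b itself, the step size evaluates to 1/3 (the inverse of the common eigenvalue), and the iterate lands exactly on the paraboloid's minimum.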
