We now look at the Gram-Schmidt conjugation technique, which gives us a simple way to generate a set of A-orthogonal search directions d_i. We begin with a set of n linearly independent vectors u_i, which could be the coordinate axes or something better. To construct d_i, take u_i and subtract out any components that are not A-orthogonal to the previous d vectors. Consider first two linearly independent vectors u0 and u1. Set d0 = u0. The vector u1 is composed of two components: one, u+, along the direction of d0, and another, u*, that is A-orthogonal to d0. After conjugation, only the A-orthogonal portion remains, so d1 = u*. At the next step we have two A-orthogonal vectors d0 and d1, and we construct the new direction from them: we find the component of the next u vector along each of the previously found directions, where each component's length is determined the same way as for a single pair of A-orthogonal vectors, and subtract those components out. After conjugation, only the A-orthogonal portion remains to be added to our set. We repeat until we have generated a full set of n A-orthogonal vectors.
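Concretely, each new direction is d_i = u_i + sum_{k<i} beta_ik d_k, with beta_ik = -(u_i^T A d_k) / (d_k^T A d_k); each beta_ik d_k term removes exactly the part of u_i that fails to be A-orthogonal to the earlier direction d_k. A minimal sketch of this procedure in Python follows (assuming numpy and that A is symmetric positive-definite).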
import numpy as np

def gram_schmidt_conjugation(set_ui, A):
    # Build a set of A-orthogonal directions d_i from the linearly
    # independent input vectors u_i.
    set_di = []
    for ui in set_ui:
        ui = np.asarray(ui, dtype=float)
        di = ui.copy()
        for dk in set_di:
            # beta_ik = -(u_i^T A d_k) / (d_k^T A d_k); the beta_ik * d_k
            # term removes the part of u_i not A-orthogonal to d_k
            beta = -(ui @ A @ dk) / (dk @ A @ dk)
            di += beta * dk
        set_di.append(di)
    return set_di
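As a quick sanity check (purely illustrative values, using the gram_schmidt_conjugation function above), we can confirm that the generated directions are pairwise A-orthogonal:

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
u0 = np.array([1.0, 0.0])   # axial unit vectors as the starting set
u1 = np.array([0.0, 1.0])
d0, d1 = gram_schmidt_conjugation([u0, u1], A)
print(d0 @ A @ d1)          # prints ~0.0: d0 and d1 are A-orthogonal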
Let us take a closer look at the conjugate directions method. If the search directions are constructed by the procedure above from the axial unit vectors, the method becomes equivalent to performing Gaussian elimination. Conversely, if we ran the orthogonal directions method with the same axial vectors and stretched the space appropriately, it would look very much like the conjugate directions method. A cleaned-up sketch of Gaussian elimination follows (assuming numpy, with b treated as the augmented column).
import numpy as np

def gaussian_elimination(A, n, b):
    # Solve A x = b in two phases: forward elimination to an upper
    # triangular matrix (treating b as the augmented column), then
    # back-substitution.
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float).ravel()
    for j in range(n - 1):                  # go column by column, left to right
        # pick a pivot row with a nonzero entry in column j
        pivots = [r for r in range(j, n) if A[r, j] != 0]
        if not pivots:
            raise ValueError('matrix is singular')
        p = pivots[0]
        A[[j, p]] = A[[p, j]]               # swap the pivot row into position j
        b[[j, p]] = b[[p, j]]
        for i in range(j + 1, n):           # eliminate every row below the pivot
            if A[i, j] != 0:
                ratio = A[i, j] / A[j, j]
                # subtracting ratio times the pivot row zeroes A[i, j]
                # regardless of the sign of ratio; update the augmented
                # column b at the same time
                A[i, j:] -= ratio * A[j, j:]
                b[i] -= ratio * b[j]
    # we now have an upper triangular matrix; solve the last row first,
    # then substitute each solved value into the rows above
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
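As a usage sketch (the numbers here are hypothetical, chosen only for illustration), we can solve a small symmetric positive-definite system and check the residual:

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])
x = gaussian_elimination(A, 2, b)
print(x)          # [ 2. -2.]
print(A @ x - b)  # residual ~[0. 0.]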