#algorithm discussion continued
Today we continue reading about flow networks. We were looking at the max-flow min-cut theorem and the properties associated with it, and we saw how it applies to the Ford-Fulkerson method. Each iteration of the method augments the flow along a path in the residual network, and the method repeats until no augmenting path remains. For each path considered, the residual capacity of that path is found and added to the flow.
How the path is chosen is not specified by the method. It is usually done with breadth-first search, which gives the Edmonds-Karp algorithm.
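As a rough sketch of how this fits together, the following C# methods run Ford-Fulkerson with BFS path selection over an adjacency-matrix residual network. The representation, the method names, and the Bfs helper are illustrative assumptions here, not code from the original discussion.
int MaxFlow(int[,] capacity, int s, int t)
{
    int n = capacity.GetLength(0);
    var residual = (int[,])capacity.Clone();   // residual capacities start out equal to the capacities
    var parent = new int[n];
    int maxFlow = 0;
    // Keep augmenting while BFS can still reach the sink in the residual network.
    while (Bfs(residual, s, t, parent))
    {
        // The residual capacity of the path is the minimum residual capacity along it.
        int pathFlow = int.MaxValue;
        for (int v = t; v != s; v = parent[v])
            pathFlow = Math.Min(pathFlow, residual[parent[v], v]);
        // Augment: decrease forward residual capacities, increase reverse ones.
        for (int v = t; v != s; v = parent[v])
        {
            residual[parent[v], v] -= pathFlow;
            residual[v, parent[v]] += pathFlow;
        }
        maxFlow += pathFlow;
    }
    return maxFlow;
}
bool Bfs(int[,] residual, int s, int t, int[] parent)
{
    int n = residual.GetLength(0);
    var visited = new bool[n];
    var queue = new Queue<int>();   // needs System.Collections.Generic
    queue.Enqueue(s);
    visited[s] = true;
    while (queue.Count > 0)
    {
        int u = queue.Dequeue();
        for (int v = 0; v < n; v++)
        {
            if (!visited[v] && residual[u, v] > 0)
            {
                visited[v] = true;
                parent[v] = u;
                if (v == t) return true;   // reached the sink: an augmenting path exists
                queue.Enqueue(v);
            }
        }
    }
    return false;
}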
We saw that with BFS, the shortest-path distance from the source to every vertex in the residual network increases monotonically with each flow augmentation. The proof is by contradiction: suppose some augmentation decreases a shortest-path distance, and let v be the vertex with the smallest post-augmentation distance among those whose distance decreased, with u the vertex just before v on a shortest path after the augmentation. Then v is one hop beyond u, and the distance to u itself did not decrease (otherwise u, not v, would be the closest vertex whose distance decreased). In the forward case, where the edge (u, v) was already in the residual network before the augmentation, the old distance to v is at most the old distance to u plus one hop, which is at most the new distance to u plus one hop, which equals the new distance to v. So the distance to v did not decrease after all, a contradiction.
Now consider the case where (u, v) was not in the residual network before the augmentation but is in it afterwards. This means the augmentation pushed flow from v to u. The algorithm always augments along shortest paths, so (v, u) was the last edge of a shortest path from s to u, and the old shortest distance to v is one hop less than the old shortest distance to u. The old distance to u is at most the new distance to u, since we established that the distance to u did not decrease. After the augmentation, v is again one hop beyond u on a shortest path. Putting these together, the old shortest distance from the source to v works out to be two hops less than the new shortest distance to v, which contradicts the assumption that the augmentation decreased v's shortest distance. Therefore no such vertex v can exist, and the shortest distances increase monotonically.
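Written out with d_f(s, v) denoting the shortest residual distance from s to v under flow f, and f' the flow after the augmentation (notation introduced here for clarity), the second case reads:
d_f(s, v) = d_f(s, u) - 1 <= d_f'(s, u) - 1 = d_f'(s, v) - 2 < d_f'(s, v)
which contradicts the assumption d_f'(s, v) < d_f(s, v).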
#codingexercise
Determine the floor of an input value in a binary search tree. The floor is the value of the largest node that does not exceed the given value.
int floor(node root, int val)
{
    // Return -1 when no value in the subtree is <= val (assumes non-negative keys).
    if (root == null) return -1;
    // An exact match is its own floor.
    if (root.data == val) return val;
    // Everything to the right is larger still, so the floor must be in the left subtree.
    if (root.data > val) return floor(root.left, val);
    // root.data < val: a closer candidate may still exist in the right subtree.
    int floorRight = floor(root.right, val);
    if (floorRight == -1)
        return root.data;
    else
        return floorRight;
}
Determine if there are two elements in a list that add up to a number divisible by k.
bool hasPairDivisiblyByK(List<int> nums, int k)
{
    // Try every pair: O(n^2) time, O(1) space.
    for (int i = 0; i < nums.Count; i++)
        for (int j = i + 1; j < nums.Count; j++)
            if ((nums[i] + nums[j]) % k == 0)
                return true;
    return false;
}
bool hasPairDivisiblyByK(List<int> nums, int k)
{
    // Count how many elements fall into each remainder class mod k (assumes non-negative values).
    var h = new Dictionary<int, int>();
    for (int i = 0; i < nums.Count; i++)
    {
        int r = nums[i] % k;
        h[r] = h.ContainsKey(r) ? h[r] + 1 : 1;
    }
    for (int i = 0; i < nums.Count; i++)
    {
        int rmndr = nums[i] % k;
        if (rmndr == 0 || rmndr * 2 == k)
        {
            // The complementary remainder is this element's own class, so a second element is needed there.
            if (h[rmndr] > 1) return true;
        }
        else if (h.ContainsKey(k - rmndr))
        {
            // Some other element carries the complementary remainder.
            return true;
        }
    }
    return false;
}
If we want to save on space while keeping the time complexity below O(n^2), we could sort the numbers and, for each multiple of k that lies between the smallest and the largest possible pair sum, look for a pair adding up to that target by walking the sorted list from both ends; this stays below O(n^2) as long as the number of candidate multiples is small relative to n. A sketch of this idea follows.
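A minimal sketch of that approach, assuming non-negative inputs and a two-pointer scan per target; the method name and structure here are mine, not from the original.
bool hasPairSummingToMultipleOfK(List<int> nums, int k)
{
    if (nums.Count < 2) return false;
    nums.Sort();   // sorts in place, O(n log n); note this mutates the caller's list
    int n = nums.Count;
    int minSum = nums[0] + nums[1];
    int maxSum = nums[n - 2] + nums[n - 1];
    // Try every multiple of k that a pair could possibly reach.
    for (int target = ((minSum + k - 1) / k) * k; target <= maxSum; target += k)
    {
        int lo = 0, hi = n - 1;
        // Classic two-pointer search for a pair summing exactly to target in a sorted list.
        while (lo < hi)
        {
            int sum = nums[lo] + nums[hi];
            if (sum == target) return true;
            if (sum < target) lo++;
            else hi--;
        }
    }
    return false;
}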