Saturday, May 27, 2017

The following are standard inference algorithms for probabilistic graphical models: 
  1. Sum-product message passing, also called belief propagation. 
  2. Variable elimination. 
  3. The junction-tree algorithm, which is belief propagation applied to a modified graph that is guaranteed to be a tree. 
On chain-structured models, the first method appears as the forward-backward algorithm for computing marginal distributions and as the Viterbi algorithm for finding the most probable assignments.  
Sum-product message passing calculates the marginal distribution of each unobserved node, conditional on any observed nodes.  The marginal distribution of a single random variable is the summation of the joint mass function over all other variables. The mass function is merely the enumeration of the probabilities of the different states. For example, the mass function of a fair coin toss assigns one half to each of the two outcomes and zero otherwise. 
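As a minimal sketch in Python of the marginalization just described (two independent fair coins are assumed here purely for illustration), the marginal of one variable is the sum of the joint mass function over the other:

```python
import itertools

# Joint mass function of two fair coin tosses X and Y, assumed independent
# for illustration: each of the four outcomes has mass 1/4.
joint = {(x, y): 0.25 for x, y in itertools.product(['H', 'T'], repeat=2)}

# Marginal of X: sum the mass function over all values of the other variable.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)  # each outcome of X gets mass 1/2
```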
This algorithm is best illustrated with a factor graph, a bipartite graph with variable nodes on one side and factor nodes on the other. A function sent from a variable node 'v' to a factor node 'a', or vice versa, is called a message. A message is a way of saying that the sender believes a variable belongs in certain states with various likelihoods. A message from v to a is the product of the messages from all of v's other neighboring factor nodes, excluding the recipient. A message from a to v is the product of the factor with the messages from all of a's other neighboring variable nodes, marginalized over all variables except the one associated with v. In this way the complete marginalization is reduced to a sum of products of terms simpler than the ones appearing in the full joint distribution. 
Upon convergence, the estimated marginal distribution of each variable node is proportional to the product of all messages from its adjoining factors. The estimated joint distribution over the variables of a factor is proportional to the product of the factor and the messages from those variables. Since the estimated marginal distribution of a variable node is called its belief, this method determines the beliefs of each variable. The Viterbi algorithm is a special case of the max-product (equivalently, min-sum) algorithm, used to find the most probable explanation.  
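A minimal sketch of these messages on a two-variable chain factor graph (the binary variables, factor names, and factor tables are made up for illustration); the belief computed by message passing is checked against brute-force marginalization of the joint:

```python
import itertools

# Hypothetical chain factor graph: x1 -- f12 -- x2, plus a unary factor f1
# on x1. Both variables are binary; the factor tables are assumptions.
f1 = {0: 0.6, 1: 0.4}                        # unary factor on x1
f12 = {(0, 0): 0.9, (0, 1): 0.1,
       (1, 0): 0.2, (1, 1): 0.8}             # pairwise factor on (x1, x2)

# Variable-to-factor message: product of messages from all other factors.
# x1's only other neighboring factor is f1, so its message to f12 is f1.
m_x1_to_f12 = {x1: f1[x1] for x1 in (0, 1)}

# Factor-to-variable message: multiply the factor by the incoming messages
# and marginalize over every variable except the recipient.
m_f12_to_x2 = {x2: sum(f12[(x1, x2)] * m_x1_to_f12[x1] for x1 in (0, 1))
               for x2 in (0, 1)}

# Belief of x2: proportional to the product of messages from its factors.
z = sum(m_f12_to_x2.values())
belief_x2 = {x2: m_f12_to_x2[x2] / z for x2 in (0, 1)}

# Check against brute-force marginalization of the full joint.
z_joint = sum(f1[x1] * f12[(x1, x2)]
              for x1, x2 in itertools.product((0, 1), repeat=2))
brute_x2 = {x2: sum(f1[x1] * f12[(x1, x2)] for x1 in (0, 1)) / z_joint
            for x2 in (0, 1)}
print(belief_x2)  # matches brute_x2
```

On a tree such as this, a single sweep of messages gives the exact marginals; on graphs with loops the same updates are iterated until (approximate) convergence.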
Variable elimination is used to estimate the marginal distribution over a subset of variables by summing out the remaining variables one at a time.  
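A minimal sketch of variable elimination on a made-up three-variable chain p(a, b, c) proportional to f(a, b) * g(b, c): to obtain the marginal over c, sum out a, then b, each elimination producing a smaller intermediate factor:

```python
# Made-up binary factors for a chain a -- b -- c.
f = {(a, b): [[0.5, 0.2], [0.1, 0.9]][a][b] for a in (0, 1) for b in (0, 1)}
g = {(b, c): [[0.3, 0.7], [0.6, 0.4]][b][c] for b in (0, 1) for c in (0, 1)}

# Eliminate a: tau1(b) = sum_a f(a, b)
tau1 = {b: sum(f[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Eliminate b: tau2(c) = sum_b tau1(b) * g(b, c)
tau2 = {c: sum(tau1[b] * g[(b, c)] for b in (0, 1)) for c in (0, 1)}

# Normalize the remaining factor to get the marginal over c.
z = sum(tau2.values())
marginal_c = {c: tau2[c] / z for c in (0, 1)}
print(marginal_c)
```

The cost is governed by the size of the largest intermediate factor, which depends on the order in which variables are eliminated.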
Courtesy: Wikipedia 
#coding exercise
Given a tree of boolean values where each internal node is the logical AND of its children, fix the tree after a given leaf is inverted.
void Invert(Node root, Node leaf)
{
    // only a leaf may be inverted
    if (leaf.left != null || leaf.right != null) return;
    leaf.data = !leaf.data;
    Fix(root);
}

void Fix(Node root)
{
    // recompute the AND values bottom-up; leaves are left untouched
    if (root == null || root.left == null || root.right == null) return;
    Fix(root.left);
    Fix(root.right);
    root.data = root.left.data && root.right.data;
}
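The same exercise can be sketched in Python (the `Node` shape and the particular leaf values below are illustrative assumptions):

```python
class Node:
    """Binary tree node; each internal node holds the AND of its children."""
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def invert(root, leaf):
    # Only a leaf may be inverted; then recompute the AND values bottom-up.
    if leaf.left is not None or leaf.right is not None:
        return
    leaf.data = not leaf.data
    fix(root)

def fix(node):
    # Leaves (and empty subtrees) are left untouched.
    if node is None or node.left is None or node.right is None:
        return
    fix(node.left)
    fix(node.right)
    node.data = node.left.data and node.right.data

# Usage: a full binary tree over leaves True, True, True, False.
l1, l2, l3, l4 = Node(True), Node(True), Node(True), Node(False)
a, b = Node(True, l1, l2), Node(False, l3, l4)
root = Node(False, a, b)
invert(root, l4)          # flip the single False leaf
print(root.data)          # → True: every leaf is now True
```

Note that `fix` recomputes the whole tree in O(n); since only the ancestors of the flipped leaf can change, an O(height) variant that walks just the root-to-leaf path is possible.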

