We continue discussing the paper "Nested sampling for general Bayesian computation" by Skilling.
We were looking at the transformation that computes the evidence as an integral over prior mass rather than over the parameters, and at how that integration is performed. The paper essentially says: do not navigate the parameter space directly; it is sufficient to explore a likelihood-weighted space, i.e., the likelihood as a function of enclosed prior mass.
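Restating the paper's change of variable: writing X(lambda) for the prior mass enclosed by the likelihood contour L(theta) = lambda, the evidence collapses to a one-dimensional integral,

Z = \int L(\theta)\,\pi(\theta)\,d\theta = \int_0^1 L(X)\,dX, \qquad X(\lambda) = \int_{L(\theta) > \lambda} \pi(\theta)\,d\theta,

so the problem reduces to evaluating the likelihood at a sequence of decreasing prior masses.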
The nested sampling procedure is therefore:
Choose the number of points as N and the number of iterations as J; draw the N points from the prior, and start with evidence Z = 0 and prior mass X0 = 1.
In each iteration j = 1, ..., J:
record the lowest of the current likelihood values as Lj
set the prior mass to the exponential Xj = exp(-j/N), which depends only on the iteration index
set the weight to the difference between the previous and the current prior mass, wj = X(j-1) - Xj
increment the evidence by the strip area Lj * wj
then replace the point of lowest likelihood by a new one drawn from the prior within the constraint L > Lj
Finally, increment the evidence by filling in the missing band, with weight w = XJ/N for each surviving point (a minimal sketch of this loop follows below).
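To make the loop concrete, here is a minimal sketch in C# using the paper's crude assignment Xj = exp(-j/N). The likelihood, samplePrior, and samplePriorAbove delegates (the last draws from the prior restricted to L > Lj) are hypothetical placeholders, not anything defined in the paper.

static double NestedSamplingEvidence(
    int N, int J,
    Func<double[], double> likelihood,
    Func<Random, double[]> samplePrior,
    Func<Random, double, double[]> samplePriorAbove)
{
    var rng = new Random();
    var points = new double[N][];
    var L = new double[N];
    for (int i = 0; i < N; i++)
    {
        points[i] = samplePrior(rng);              // N live points drawn from the prior
        L[i] = likelihood(points[i]);
    }

    double Z = 0.0, Xprev = 1.0;                   // evidence accumulator, X0 = 1
    for (int j = 1; j <= J; j++)
    {
        // find the worst (lowest-likelihood) live point
        int worst = 0;
        for (int i = 1; i < N; i++) if (L[i] < L[worst]) worst = i;
        double Lj = L[worst];

        // crude prior-mass assignment Xj = exp(-j/N) and strip weight wj = X(j-1) - Xj
        double Xj = Math.Exp(-(double)j / N);
        Z += Lj * (Xprev - Xj);                    // strip area
        Xprev = Xj;

        // replace the worst point with a fresh draw from the prior constrained to L > Lj
        points[worst] = samplePriorAbove(rng, Lj);
        L[worst] = likelihood(points[worst]);
    }

    // fill in the remaining band with weight XJ / N for each surviving point
    for (int i = 0; i < N; i++) Z += L[i] * Xprev / N;
    return Z;
}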
There are some salient points to consider in this simplification.
First, only one point is replaced by a new candidate in each iteration, and it is the one with the worst likelihood value, i.e., the point sitting on the outermost contour with the largest enclosed prior mass.
Second, uniform random values in the range (0, 1) suffice to draw the samples within the constrained likelihood contour, since the prior-mass coordinate maps the constrained region onto an interval.
Third, the likelihood contours shrink by factors of about exp(-1/N) in enclosed prior mass and are roughly followed by the successive sampling points.
Fourth, we can calculate the information H during the iterations as a by-product. It measures how small a fraction of the prior mass contains the bulk of the posterior mass: that fraction is roughly exp(-H).
Fifth, the procedure takes about NH +- sqrt(NH) steps, where H is the information. The bulk of the posterior mass is reached after about NH steps (the first term) and crossed over about sqrt(NH) further steps (the second term). This follows from the premise that the individual log t values are independent, each with mean -1/N and standard deviation 1/N; after i steps the prior mass is therefore expected to shrink so that log Xi is roughly -(i +- sqrt(i))/N.
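As a rough illustration of those estimates (the specific numbers are made up, not from the paper): with N = 100 points and H = 20 nats, the posterior bulk is reached after about NH = 2000 iterations, give or take sqrt(NH), roughly 45. Two tiny hypothetical helpers along these lines:

// Hypothetical helpers, not from the paper.
// Expected iterations to reach the posterior bulk: about N*H, with spread sqrt(N*H).
static (double Mean, double Spread) ExpectedSteps(int N, double H)
    => (N * H, Math.Sqrt(N * H));

// Expected log prior mass after i steps: about -i/N, with spread sqrt(i)/N.
static (double Mean, double Spread) ExpectedLogX(int N, int i)
    => (-(double)i / N, Math.Sqrt(i) / N);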
Since the nested sampling procedure accumulates the evidence as a weighted sum, we could consider computing the evidence for the current batch, and then merging the evidence contributions from the next batch with those of the current batch once the two are aligned on the same log prior mass scale.
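A minimal sketch of that merging idea, assuming the next batch continues inward from where the current batch stopped, so the two batches contribute strips over disjoint ranges of prior mass. The Strip record and the batch sequences are hypothetical, not part of the paper (requires System.Linq and System.Collections.Generic).

record Strip(double LogX, double Likelihood, double Weight);

// Once both batches report their strips on the same log-prior-mass scale,
// the combined evidence is just the weighted sum over all strips.
static double MergeEvidence(IEnumerable<Strip> currentBatch, IEnumerable<Strip> nextBatch)
    => currentBatch.Concat(nextBatch).Sum(s => s.Likelihood * s.Weight);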
#codingexercise
Find if a string is the interleaving of two other strings
// Returns true if C is an interleaving of A and B, i.e. C can be formed by merging
// all characters of A and B while preserving their relative order within each string.
// (StringBuilder comes from System.Text.)
static bool IsInterleaved(StringBuilder A, StringBuilder B, StringBuilder C, int ia, int ib, int ic)
{
    // All three strings consumed together: a valid interleaving.
    if (ia == A.Length && ib == B.Length && ic == C.Length) return true;
    // C is exhausted while A or B still has characters left.
    if (ic == C.Length) return false;
    // The next character of C matches neither string at its current position.
    if (ia < A.Length && ib < B.Length && C[ic] != A[ia] && C[ic] != B[ib]) return false;
    if (ia < A.Length && ib == B.Length && C[ic] != A[ia]) return false;
    if (ia == A.Length && ib < B.Length && C[ic] != B[ib]) return false;
    // Otherwise, consume the matching character from A and/or from B and recurse.
    return ((ia < A.Length && C[ic] == A[ia] && IsInterleaved(A, B, C, ia + 1, ib, ic + 1)) ||
            (ib < B.Length && C[ic] == B[ib] && IsInterleaved(A, B, C, ia, ib + 1, ic + 1)));
}
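A quick usage sketch (the example strings are arbitrary):

var A = new StringBuilder("abc");
var B = new StringBuilder("def");
var C = new StringBuilder("adbecf");
Console.WriteLine(IsInterleaved(A, B, C, 0, 0, 0));   // True: "adbecf" interleaves "abc" and "def"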