In the previous post we did not explain where the law of large numbers enters the argument. We complete that step today.
We claimed that pi(y) = g(y) - g(y-1) satisfies the balance equations. We had expressed the Markov chain describing the supply and demand at an electronics store as
M_{n+1} = max(0, Z_1, Z_2, ..., Z_{n+1}) in distribution (with M_0 = 0), where Z_j = xi_1 + ... + xi_j is the running sum of the differences between demand and supply to date,
and this is equivalent to the one-step recursion M_{n+1} = max(0, M_n + xi_{n+1}), because the xi_j are i.i.d., so reordering them does not affect the distribution.
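Before passing to probabilities, here is a minimal simulation sketch of that equivalence. The step distribution for xi (values -1, 0, +1 with mean -0.2) is an illustrative assumption, not the store's actual data; the point is only that the one-step recursion and the max-of-partial-sums representation produce the same distribution for M_n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical step distribution for xi = demand - supply (illustrative only;
# the post does not specify it): values -1, 0, +1 with E[xi] = -0.2 < 0.
xi_vals  = np.array([-1, 0, 1])
xi_probs = np.array([0.5, 0.2, 0.3])

n_steps, n_runs = 30, 200_000
xi = rng.choice(xi_vals, size=(n_runs, n_steps), p=xi_probs)

# One-step recursion: M_{k+1} = max(0, M_k + xi_{k+1}), starting from M_0 = 0.
M = np.zeros(n_runs)
for k in range(n_steps):
    M = np.maximum(0, M + xi[:, k])

# Max representation: max(0, Z_1, ..., Z_n) with Z_j = xi_1 + ... + xi_j.
Z = np.cumsum(xi, axis=1)
M_alt = np.maximum(0, Z.max(axis=1))

# The two constructions agree in distribution (not path by path).
for y in range(5):
    print(y, round((M == y).mean(), 4), round((M_alt == y).mean(), 4))
```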
Conditioning on the current state,
P(M_{n+1} = y) = P(max(0, M_n + xi_{n+1}) = y)
= Sigma_{x >= 0} P(M_n = x) p(x, y).
Since the n-dependent terms are bounded, we may take the limit of both sides as n -> infinity,
and lim_{n -> infinity} P(M_n = y) is exactly our definition of pi(y),
so the earlier equation becomes pi(y) = Sigma_{x >= 0} pi(x) p(x, y). This shows that pi satisfies the balance equations; it remains, however, to prove that Sigma_y pi(y) = 1,
that is, that pi is a genuine probability distribution whose total mass is unity.
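As a numerical sanity check of both points, the sketch below builds the transition probabilities p(x, y) for the same illustrative step distribution on a truncated state space (the cap N is an artifact of the truncation, not part of the model), iterates the distribution of M_n, and confirms that the limit satisfies the balance equations and has total mass 1. The actual proof of the normalization follows.

```python
import numpy as np

# Same illustrative step distribution as above (an assumption).
xi_vals  = np.array([-1, 0, 1])
xi_probs = np.array([0.5, 0.2, 0.3])            # E[xi] = -0.2 < 0

N = 200                                         # truncation level, not part of the model
P = np.zeros((N + 1, N + 1))
for x in range(N + 1):
    for v, pr in zip(xi_vals, xi_probs):
        y = min(max(x + v, 0), N)               # M_{n+1} = max(0, M_n + xi), capped at N
        P[x, y] += pr

# Iterate the distribution of M_n: mu_{n+1} = mu_n P, starting from M_0 = 0.
mu = np.zeros(N + 1)
mu[0] = 1.0
for _ in range(5000):
    mu = mu @ P

# mu approximates pi: it satisfies the balance equations and sums to 1,
# with essentially no mass piled up at the cap thanks to the negative drift.
print("balance residual :", np.abs(mu - mu @ P).max())
print("total mass       :", mu.sum())
print("mass at the cap  :", mu[-1])
print("pi(0..4)         :", np.round(mu[:5], 4))
```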
Because the terms pi(y) = g(y) - g(y-1) telescope and g(-1) = 0 (the chain is nonnegative), Sigma_{y=0}^{Y} pi(y) = g(Y); so pi adds up to unity exactly when g(y) -> 1 as y -> infinity.
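This telescoping can also be seen empirically: estimating g(y) as the distribution function of max(0, Z_1, ..., Z_n) for a large n (again with the illustrative step distribution, an assumption), the partial sums of pi(y) = g(y) - g(y-1) reproduce g(Y) and climb towards 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative step distribution xi in {-1, 0, +1} with E[xi] = -0.2 (an assumption).
xi_vals, xi_probs = np.array([-1, 0, 1]), np.array([0.5, 0.2, 0.3])

# Many independent runs of the partial sums Z_1, ..., Z_n; M = max(0, Z_1, ..., Z_n).
runs = rng.choice(xi_vals, size=(20_000, 500), p=xi_probs)
M = np.maximum(0, np.cumsum(runs, axis=1).max(axis=1))

g  = np.array([(M <= y).mean() for y in range(21)])   # empirical g(0), ..., g(20)
pi = np.diff(np.concatenate(([0.0], g)))              # pi(y) = g(y) - g(y-1), with g(-1) = 0

# The partial sums telescope back to g(Y) and approach 1 as Y grows.
print("sum_{y<=5}  pi(y) =", pi[:6].sum(), " g(5)  =", g[5])
print("sum_{y<=20} pi(y) =", pi.sum(),     " g(20) =", g[20])
```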
We prove this using the assumption E[xi_n] = -mu < 0:
P(lim_{n -> infinity} Z_n / n = -mu) = 1 by the strong law of large numbers,
which implies P(lim_{n -> infinity} Z_n = -infinity) = 1.
But then only finitely many of the Z_j exceed any fixed level, so the limit of the M_n, namely max(0, Z_1, Z_2, ...), is finite with probability 1: P(max(0, Z_1, Z_2, ...) < infinity) = 1.
This gives g(y) = P(max(0, Z_1, Z_2, ...) <= y) -> 1 as y -> infinity, and therefore Sigma_y pi(y) = 1.
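The law-of-large-numbers step itself is easy to watch in simulation: with the same illustrative xi (an assumption, as before), Z_n / n settles near E[xi] = -0.2, and the running maximum of the partial sums is attained early and never grows again, which is exactly why g(y) tends to 1.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative step distribution with E[xi] = -0.2, so mu = 0.2 (an assumption).
xi_vals, xi_probs = np.array([-1, 0, 1]), np.array([0.5, 0.2, 0.3])

n = 1_000_000
Z = np.cumsum(rng.choice(xi_vals, size=n, p=xi_probs))

print("Z_n / n          :", Z[-1] / n)              # near E[xi] = -0.2 by the law of large numbers
print("max(0, Z_1..Z_n) :", max(0, int(Z.max())))   # the running maximum is finite
print("index of the max :", int(Z.argmax()) + 1)    # attained early; afterwards Z drifts to -infinity
```

So it is the law of large numbers that guarantees the stationary distribution found earlier actually carries total mass one.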