Problem: Waldo rolls one standard (fair) 6-sided die repeatedly, keeping a running total of what he rolls. To three decimal places, what's the probability that he eventually reaches a total of exactly 1,000,000?
Solution: On average Waldo adds \( \mu = (1+2+3+4+5+6)/6 = 3.5 \) per roll, so we would "intuitively" or "naively" expect the probability of hitting any particular sufficiently large total to be close to \( 1/3.5 \). This turns out to be correct: the answer is \( 1/3.5 = 2/7 \approx 0.286 \).
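As a quick sanity check (this isn't part of the original solution, and the function name hit_probability is just for illustration), here's a short Python sketch that computes \( p(n) \), the probability of the running total ever equaling exactly \( n \), directly from the recurrence \( p(n) = \tfrac{1}{6}\sum_{k=1}^{6} p(n-k) \), with \( p(0) = 1 \) and \( p(m) = 0 \) for \( m < 0 \):

```python
from collections import deque

# Illustrative sketch: exact dynamic programming for a fair six-sided die.
# p(n) = (1/6) * (p(n-1) + ... + p(n-6)), with p(0) = 1 and p(m) = 0 for m < 0.
def hit_probability(target, faces=6):
    # window holds the most recent values p(n-1), ..., p(n-faces), newest first
    window = deque([1.0], maxlen=faces)
    for _ in range(target):
        window.appendleft(sum(window) / faces)
    return window[0]

print(hit_probability(1_000_000))  # ~2/7 = 0.2857..., i.e. 0.286 to three decimals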
More generally, if we have any finite set of positive integers with no common factor, then the probability of hitting a particular, sufficiently large total is \( 1/\mu \), where \( \mu \) is the expected value of a single roll. Note that the probabilities attached to the integers in the set don't need to be equal, either; they only need to be positive and sum to \( 1 \).
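Here is a slightly more general sketch along the same lines (again my own illustration, not from the post): it takes an arbitrary step distribution as a dict of {step size: probability} and iterates the corresponding recurrence \( p(n) = \sum_i q_i\, p(n - x_i) \). With steps \( 2 \) and \( 3 \), each with probability \( 1/2 \) (no step of size \( 1 \), but gcd \( 1 \)), the mean step is \( \mu = 2.5 \), so \( p(n) \) should settle near \( 1/2.5 = 0.4 \):

```python
from collections import deque
from functools import reduce
from math import gcd

# Illustrative sketch: p(n) = sum over steps k of prob(k) * p(n-k),
# for any finite set of positive integer steps with no common factor.
def hit_probability_general(steps, target):
    assert reduce(gcd, steps) == 1, "step sizes must have no common factor"
    max_step = max(steps)
    window = deque([1.0], maxlen=max_step)  # window[j] = p(n - 1 - j), newest first
    for _ in range(target):
        p_n = sum(prob * window[k - 1] for k, prob in steps.items() if k <= len(window))
        window.appendleft(p_n)
    return window[0]

print(hit_probability_general({2: 0.5, 3: 0.5}, 10_000))  # ~0.4 = 1/2.5
```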
Proof: I'll show how you can prove this rigorously for the case where the outcomes are \( 1 \) and \( 2 \), each with probability \( 1/2 \). Let \( p(n) \) be the probability of totaling the value \( n \). The only possible ways to total \( n \) are to reach \( n-2 \) and immediately roll a \( 2 \), or reach \( n-1 \) and immediately roll a \( 1 \). This implies that \( p(n) = 1/2\cdot p(n-1) + 1/2\cdot p(n-2) \), which is a linear, second order, homogeneous difference equation (recurrence relation). I'm going to apply the theory here, but for more details see Wikipedia: Recurrence relation.
In this case the associated characteristic polynomial is \( r^2 = r/2 + 1/2 \), with roots \( r_1 = 1, r_2 = -1/2 \). So we have \( p(n) = c_1 + c_2\cdot (-1/2)^n \) for constants \( c_1, c_2 \). The boundary conditions are \( p(0) = 1 \) and \( p(1) = 1/2 \); solving \( c_1 + c_2 = 1 \) and \( c_1 - c_2/2 = 1/2 \) gives \( c_1 = 2/3 \) and \( c_2 = 1/3 \). This implies that \( p(n) \) converges to \( 2/3 \) very rapidly (exponentially fast, in fact).
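For a concrete check of the closed form (again, just an illustrative sketch rather than part of the original argument), the snippet below compares \( p(n) \) computed from the recurrence with \( 2/3 + \tfrac{1}{3}(-1/2)^n \) and prints the distance from \( 2/3 \), which shrinks geometrically:

```python
# Illustrative sketch: recurrence p(n) = p(n-1)/2 + p(n-2)/2 for outcomes 1 and 2,
# compared against the closed form 2/3 + (1/3) * (-1/2)**n.
def p_recurrence(n):
    a, b = 1.0, 0.5          # p(0), p(1)
    for _ in range(n - 1):
        a, b = b, 0.5 * b + 0.5 * a
    return b if n >= 1 else a

for n in (1, 2, 5, 10, 20):
    closed = 2 / 3 + (1 / 3) * (-0.5) ** n
    print(n, p_recurrence(n), closed, abs(p_recurrence(n) - 2 / 3))
```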
Really interesting. I wonder if you have a book where I can better understand difference equations?
Esteban, I'll write a blog entry on difference equations. They're nice to know and a very useful framework for analyzing recursive problems. Algorithms are a common setting, but far from the only one.