
In this article, I will introduce a classic concept in probability and statistics: the Gambler’s Ruin Problem. The concept is relevant to gamblers because it implies, for example, that a gambler who raises his or her bet to a fixed fraction of the bankroll after a win but does not reduce it after a loss will eventually go broke. [2] It is also a great example for understanding a simple random walk, a particular kind of stochastic process. A stochastic process has the Markov property if the conditional probability distribution of future states depends only on the present state and not on the past history. [4] The article starts with the problem statement, followed by the mathematical derivation, and finally the conclusion.
Problem Statement
Consider a gambler who starts with an initial fortune of i units and bets on the outcomes of successive gambles, i.e. successive flips of a coin. On each flip, the gambler wins 1 unit if the coin comes up heads and loses 1 unit if it comes up tails. It is assumed that the successive flips are independent and that each flip results in a head with probability p and a tail with probability q = 1 − p.
Let Rₙ denote the gambler’s total fortune after the nᵗʰ gamble. The gambler’s objective is to end up with all the money, i.e. N units, given that the dealer starts with N − i units, without getting ruined (running out of money). In any case, the gambler stops playing after winning all the money or getting ruined, whichever happens first.
Now we can derive the probability that the gambler ends up with all the money if he starts with i units and the dealer starts with N − i units, where 0 < i < N.
Variable definitions
Let
P(H) = p = probability that a coin flip results in a head
P(Hᶜ) = q = 1 − p = probability that a coin flip results in a tail
Rₙ = ∆₁ + ∆₂ + … + ∆ₙ , R₀ = i ∵ the gambler starts with i units
As the game proceeds, {Rₙ : n ≥ 0} forms a simple random walk, where the ∆ₖ form an i.i.d. sequence of r.v.s distributed as
P(∆ = 1) = p ∵ win 1 unit if the coin comes up heads
P(∆ = −1) = q ∵ lose 1 unit if the coin comes up tails
and ∆ₖ represents the earnings on the kᵗʰ gamble.
i.i.d. r.v.s = independent and identically distributed random variables, meaning that every random variable in the collection has the same probability distribution as the others and that all of them are mutually independent. [3]
Since the game stops when either Rₙ = 0 or Rₙ = N, let
τᵢ = min{ n ≥ 0 : Rₙ ∈ {0, N} | R₀ = i }
denote the time at which the game stops when R₀ = i. If Rτᵢ = N, then the gambler has won all the money; if Rτᵢ = 0, then the gambler has been ruined.
Let Pᵢ = P(Rτᵢ = N) = probability that the gambler ends up with all the money when he starts with i units, i.e. R₀ = i.
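To make these definitions concrete, here is a minimal Monte Carlo sketch (the function name simulate_game and the trial count are my own choices, not part of the original article) that simulates the random walk Rₙ until the stopping time τᵢ and estimates Pᵢ as the fraction of runs that end at N:

```python
import random

def simulate_game(i, N, p, trials=10_000):
    """Estimate P_i by simulating the gambler's random walk `trials` times.

    i: initial fortune R_0, N: total money in play, p: probability of heads.
    Returns the fraction of games that end with the gambler holding all N units.
    """
    wins = 0
    for _ in range(trials):
        fortune = i
        # Play until the stopping time tau: the fortune hits 0 (ruin) or N (win).
        while 0 < fortune < N:
            fortune += 1 if random.random() < p else -1
        wins += fortune == N
    return wins / trials

# Example: i = 50, N = 100, fair coin -> the estimate should be close to 0.5
print(simulate_game(50, 100, 0.5))
```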
Derivation
The Gambler’s Ruin Problem can be modeled as a random walk that starts at the initial stake and moves up or down by one unit at each step with a given probability. Since each move is independent of the past given the current fortune, the process is a Markov chain.
Next, using the Markov property, we can compute Pᵢ by conditioning on the outcome of the first flip:
Pᵢ = p·Pᵢ₊₁ + q·Pᵢ₋₁
for i = 1, 2, …, N − 1. Note that P₀ = 0 and P_N = 1, since with a fortune of 0 the gambler is already ruined and with a fortune of N he has already won.
The derivation of the recursion equation is as follows: if ∆₁ = 1, then the gambler’s fortune increases to R₁ = i + 1, and so from then on the gambler wins with probability Pᵢ₊₁. Similarly, if ∆₁ = −1, then the fortune decreases to R₁ = i − 1, and the gambler wins with probability Pᵢ₋₁.
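Before solving the recursion in closed form, note that the recursion together with the boundary conditions P₀ = 0 and P_N = 1 already determines every Pᵢ. As a sketch (the function name and the use of NumPy are my own, purely for illustration), the recursion can be solved numerically as a linear system and later compared with the closed-form result derived below:

```python
import numpy as np

def win_probabilities(N, p):
    """Solve P_i = p*P_(i+1) + q*P_(i-1) with P_0 = 0 and P_N = 1 as a linear system."""
    q = 1.0 - p
    A = np.zeros((N + 1, N + 1))
    b = np.zeros(N + 1)
    A[0, 0] = 1.0              # boundary condition P_0 = 0
    A[N, N] = 1.0
    b[N] = 1.0                 # boundary condition P_N = 1
    for i in range(1, N):
        # P_i - p*P_(i+1) - q*P_(i-1) = 0
        A[i, i] = 1.0
        A[i, i + 1] = -p
        A[i, i - 1] = -q
    return np.linalg.solve(A, b)

P = win_probabilities(100, 0.5)
print(P[50])   # ~0.5 for the fair game
```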
Since p + q = 1, we can rewrite the recursion equation as
(p + q)·Pᵢ = p·Pᵢ₊₁ + q·Pᵢ₋₁ , which rearranges to Pᵢ₊₁ − Pᵢ = (q/p)·(Pᵢ − Pᵢ₋₁)
Hence, we obtain
Pᵢ₊₁ − Pᵢ = (q/p)·(Pᵢ − Pᵢ₋₁) = (q/p)²·(Pᵢ₋₁ − Pᵢ₋₂) = … = (q/p)ⁱ·(P₁ − P₀) = (q/p)ⁱ·P₁
By adding the first i − 1 of these equations, we get
Pᵢ − P₁ = P₁·[ (q/p) + (q/p)² + … + (q/p)ⁱ⁻¹ ] , i.e. Pᵢ = P₁·[ 1 + (q/p) + (q/p)² + … + (q/p)ⁱ⁻¹ ]
Here we can use the geometric series formula to simplify and get
Pᵢ = P₁·[1 − (q/p)ⁱ] / [1 − (q/p)]  if p ≠ q
Pᵢ = i·P₁  if p = q = 0.5
Using the fact that P_N = 1 (setting i = N above), we get
P₁ = [1 − (q/p)] / [1 − (q/p)ᴺ]  if p ≠ q
P₁ = 1/N  if p = q = 0.5
Hence,
Pᵢ = [1 − (q/p)ⁱ] / [1 − (q/p)ᴺ]  if p ≠ q
Pᵢ = i/N  if p = q = 0.5
Note that 1 – Pᵢ is the probability of ruin.
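The closed-form result can be wrapped in a small helper for experimenting with different values of i, N and p. This is only a sketch; the function name ruin_win_probability is my own:

```python
def ruin_win_probability(i, N, p):
    """Closed-form P_i: probability that a gambler starting with i units reaches N before 0."""
    q = 1.0 - p
    if p == q:                 # fair game, p = q = 0.5
        return i / N
    r = q / p
    return (1 - r**i) / (1 - r**N)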
Example
If a gambler were to start with 50 units and play to win 100 units (N = 100), what is the probability of the gambler winning (i) if p = 0.5 (fair game), (ii) if p = 0.45, (iii) if p = 0.55?
(i) if p = 0.5, P₅₀ = 50/100 = 50%
(ii) if p = 0.45, P₅₀ = [1 − (0.55/0.45)⁵⁰] / [1 − (0.55/0.45)¹⁰⁰] ≈ 0.0044%
(iii) if p = 0.55, P₅₀ = [1 − (0.45/0.55)⁵⁰] / [1 − (0.45/0.55)¹⁰⁰] ≈ 99.996%
From here we can see that the probability of the gambler winning is 50% in the fair game (p = 0.5), whereas it drops to about 0.0044% if p = 0.45 and rises to nearly 100% if p = 0.55.
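Continuing with the ruin_win_probability sketch from above, the three cases of the example can be reproduced directly (the printed values are rounded approximations):

```python
for p in (0.5, 0.45, 0.55):
    print(f"p = {p}: P_50 = {ruin_win_probability(50, 100, p):.4%}")
# approximately: 50.0000%, 0.0044%, 99.9956%
```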
This also means that if p > 0.5 (each gamble is in the gambler’s favor), then as N → ∞ there is a positive probability, 1 − (q/p)ⁱ, that the gambler never gets ruined and instead becomes infinitely rich. On the other hand, if p ≤ 0.5 (each gamble is not in his favor), then against an infinitely rich opponent the gambler will get ruined with probability 1.
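As a quick numerical illustration of this limiting behaviour (a self-contained sketch with values chosen by me), for p > 0.5 the win probability Pᵢ approaches 1 − (q/p)ⁱ as N grows:

```python
p, q, i = 0.55, 0.45, 50
r = q / p
print(1 - r**i)                        # probability of never being ruined, ~0.99996
for N in (100, 1_000, 10_000):
    print(N, (1 - r**i) / (1 - r**N))  # P_i approaches the limit above as N grows
```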
Conclusion
In conclusion, the Gambler’s Ruin Problem shows that a gambler playing a game with negative expected value will eventually go broke, regardless of his or her betting system. Moreover, even a gambler with finite wealth playing a fair game (i.e., each bet has an expected value of zero for both sides) will eventually and inevitably go broke against an opponent with infinite wealth.
References
[1] Columbia University: Notes on Financial Engineering, 2018
[2] Gambler’s ruin – Wikipedia
[3] Independent and identically distributed random variables – Wikipedia
[4] Markov property – Wikipedia