Hi all,
Some of you might already know how to write a Monte-Carlo simulation to estimate [imath]\pi[/imath]. We use a uniform random number generator to produce pairs of pseudo-random numbers [imath](x_i,y_i)[/imath] in [imath][0,1][/imath]. After [imath]N[/imath] draws, we compute the proportion of points satisfying [imath]x_i^2 + y_i^2 \leq 1[/imath]. If [imath]N[/imath] is large, this ratio approaches the geometric probability that a random point drawn from the unit square lies in the unit quarter-circle, namely [imath]\pi/4[/imath]; multiplying by 4 gives the estimate of [imath]\pi[/imath].
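For concreteness, here is a minimal NumPy sketch of the quarter-circle estimator described above (the function name estimate_pi is mine):

```python
import numpy as np

def estimate_pi(n, seed=0):
    """Monte-Carlo estimate of pi via the quarter-circle method."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=n)
    y = rng.uniform(size=n)
    # Count points of the unit square falling inside the quarter-circle
    inside = np.count_nonzero(x**2 + y**2 <= 1.0)
    # P(inside) -> pi/4, so multiply the observed proportion by 4
    return 4.0 * inside / n

print(estimate_pi(10**6))
```

The error of such an estimator shrinks like [imath]1/\sqrt{N}[/imath], so a million draws typically gives two to three correct digits.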
While browsing Math SE, I recently came across a somewhat more challenging puzzle. How do you estimate Euler's number [imath]e \approx 2.71828\ldots[/imath] using the same generator?
My initial idea: if [imath]X[/imath] is an [imath]Expo(1)[/imath] random variable, its CDF is [imath]F_X(x) = 1 - e^{-x}[/imath]. By the inverse transform method, [imath]F_X^{-1}(U) = -\ln(1-U)[/imath] is an exponential random variable when [imath]U \sim U[0,1][/imath]. I generated a sample of [imath]U[0,1][/imath] random numbers and applied the inverse transform. Since [imath]F_X(1) = 1 - e^{-1}[/imath], the proportion of transformed sample points less than or equal to unity should approach [imath]1 - e^{-1}[/imath], from which [imath]e = 1/(1 - \text{proportion})[/imath].
Code:
import numpy as np

def generate_e(N):
    # Draw N samples from U[0, 1); high=1.0 is excluded, so log(1 - U)
    # never hits log(0)
    U = np.random.uniform(size=N)
    # Inverse transform: X = -log(1 - U) / lambda ~ Expo(lambda)
    lambda_ = 1.0
    X = -np.log(1.0 - U) / lambda_
    # The proportion of X <= 1 estimates F_X(1) = 1 - exp(-1)
    count = np.count_nonzero(X <= 1.0)
    return 1.0 / (1.0 - count / N)

estimate = generate_e(10**6)
print(estimate)
However, this approach is really bad: convergence is very slow. Moreover, you aren't allowed to use log or circular functions. Here's the Stats SE thread that discusses a much simpler and more elegant idea. (Also, I still need to prove that the random variable [imath]\xi[/imath] defined in the first answer to the OP's question has expectation [imath]e[/imath]. It'll make for an interesting exercise.)
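Assuming [imath]\xi[/imath] is the classic random variable "the smallest [imath]n[/imath] such that [imath]U_1 + U_2 + \cdots + U_n > 1[/imath]" (my guess at what the linked answer defines; the function name estimate_e is mine), the simulation needs nothing beyond the uniform generator itself:

```python
import numpy as np

def estimate_e(trials, seed=0):
    """Estimate e as the sample mean of xi, the number of U[0,1]
    draws needed for the running sum to first exceed 1."""
    rng = np.random.default_rng(seed)
    counts = np.empty(trials, dtype=np.int64)
    for i in range(trials):
        s, k = 0.0, 0
        while s <= 1.0:      # keep drawing until the sum passes 1
            s += rng.uniform()
            k += 1
        counts[i] = k
    return counts.mean()

print(estimate_e(200_000))
```

Note that no log or trigonometric function appears anywhere, which is exactly the constraint in the puzzle.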