Hello Olga,
"What is the difference between ways 1 and 3?"
Ok, good question. My text was not clear, so let me clarify. When listing those 4 methods I meant that in method 3 the PDE is solved numerically. Of course, if you can solve the PDE analytically/in closed form, then it falls under method 1 (closed/semi-closed form solution).
"Why is the binomial tree method more difficult than MC?"
At least when the terminal/sampling distribution is known, MC only requires that you (a small sketch follows the list):
i) build a sampling algorithm for that distribution
ii) draw random sample
iii) compute your function with drawn sample
iv) increase sample size (to gather data for drawing/assessing convergence)
v) GOTO ii)
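For concreteness, here is a minimal Python sketch of steps i)-v). The lognormal terminal distribution, the call payoff and all parameter values are only placeholder assumptions for illustration, not part of the argument above:

```python
import numpy as np

# Minimal sketch of steps i)-v). The lognormal terminal price, the call payoff
# and the parameter values below are placeholder assumptions for illustration.
rng = np.random.default_rng(seed=0)

S0, r, sigma, T, K = 100.0, 0.05, 0.2, 1.0, 100.0

def draw_terminal_prices(n):
    # i)/ii) sampling algorithm for the (here: lognormal) terminal distribution
    Z = rng.standard_normal(n)
    return S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

def payoff(S):
    # iii) the function evaluated on the drawn sample (here: call payoff)
    return np.maximum(S - K, 0.0)

n = 1_000
for _ in range(6):                      # iv)/v) grow the sample and repeat
    S_T = draw_terminal_prices(n)
    estimate = np.exp(-r * T) * payoff(S_T).mean()
    print(f"n = {n:7d}   estimate = {estimate:.4f}")
    n *= 2
```

The point is only that once the sampler in i) exists, the loop ii)-v) is a few lines of bookkeeping.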
With a binomial tree you have the tree and its "recursive structure", which you may need to traverse forwards and/or backwards (a rough sketch of the backward induction is below). In my view it is a more challenging programming task in the general case, but of course you may have other opinions about this.
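For comparison, here is a rough sketch of backward induction on a Cox-Ross-Rubinstein style tree; the European call payoff and the parameter values are, again, only illustrative assumptions:

```python
import numpy as np

# Sketch of a Cox-Ross-Rubinstein style tree, only to illustrate the forward
# (build lattice) and backward (roll values back) passes. The European call
# payoff and all parameter values are illustrative assumptions.
S0, K, r, sigma, T, steps = 100.0, 100.0, 0.05, 0.2, 1.0, 200

dt = T / steps
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral up-probability
disc = np.exp(-r * dt)

# forward pass: terminal stock prices at the leaves (index j = number of ups)
j = np.arange(steps + 1)
values = np.maximum(S0 * u**j * d**(steps - j) - K, 0.0)

# backward pass: roll the values back through the tree
for _ in range(steps):
    values = disc * (p * values[1:] + (1.0 - p) * values[:-1])

print("tree price:", values[0])
```

Even this simple European case needs both passes; with early exercise or path dependence the traversal and bookkeeping grow quickly, which is what I mean by it being the harder programming task in general.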
Here's a small problem to think about for people considering Monte Carlo integration:
(no, I am not saying not to use MC, I am just saying: be careful...)
=======================================================
For simplicity, let's assume a silly model in which the stock price at time t is of the form
S(t) = signal(t) + noise,
where signal(t) is some deterministic, non-random part and noise is a random term (independent of the stock price at any time step). Let the noise term be Pareto distributed with parameters xm=0 and k=2 (it is zero-mean; see Pareto distribution - Wikipedia, the free encyclopedia).
Now you are computing the expectation E[g(S(T))] for some nicely behaving payoff function g using Monte Carlo integration. You do this by starting with some initial sample size, drawing a random sample, computing the average of the payoff function over the drawn samples, then increasing the sample size, and repeating until the accuracy is "enough".
Two questions:
1) What is the average behaviour of the Monte Carlo estimate of the expectation?
(answering "bad" or "good" is enough)
2) Are there any problems with the Monte Carlo estimate?
[I assume the computer program is coded without bugs]
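If you want to experiment before reading the hints, here is a sketch of the procedure above. Note that my reading of the noise term is an assumption: I take it to be a Pareto-type (Lomax) variable with shape k and support starting at 0, shifted to have zero mean; signal(T), the payoff g and all parameter values are placeholders.

```python
import numpy as np

# Sketch of the experiment described above. My reading of the noise term is an
# assumption: numpy's rng.pareto(k) draws a Lomax (Pareto II) variable with
# support [0, inf) and mean 1/(k-1) for k > 1, which I then center to zero mean.
# signal(T), the payoff g and the parameter values are placeholders.
rng = np.random.default_rng(seed=0)

k = 2.0                                  # try k = 3.0 as in the second hint
signal_T = 100.0                         # deterministic part of S(T)

def g(S):
    # some "nicely behaving" payoff, here simply a call-style function
    return np.maximum(S - 100.0, 0.0)

n = 1_000
for _ in range(7):                       # increase the sample size and repeat
    noise = rng.pareto(k, n) - 1.0 / (k - 1.0)   # centered to zero mean
    S_T = signal_T + noise
    print(f"n = {n:9d}   MC estimate of E[g(S(T))] = {g(S_T).mean():.4f}")
    n *= 4
```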
HINTS (for second question):
- If Xn is your Monte Carlo estimator for sample size n and capital X is the value of the expectation being computed (a non-random quantity), are there any issues with convergence in the 2nd mean? (see "Convergence in mean" in Convergence of random variables - Wikipedia, the free encyclopedia)
- check what happens if k is set to 3: does the situation change?
PS. If you are not familiar with the terms estimate, estimator and (fixed) parameter, then think of an estimator as "like an estimate, but computed from unknown/random data which still follows the above stochastic process", whereas an estimate is computed from a drawn sample (which is fixed once the sample has been drawn). This is not a good distinction between estimator and estimate (look at books for the proper definitions), but it might do here.