Hey guys,
One of the standard results in elementary stochastic calculus is that if X is a random variable and Y is any square-integrable random variable, then the conditional expectation E[Y | X] exists and is unique. I am struggling to understand one particular step (an inequality) in the first part of the proof, where the author shows that W = g(X) and (Y − Y⋆) are orthogonal. I have posted this as a question on MSE here.
I have tried searching for the proof online, but the approach in my text appears a bit different. I would really benefit from any suggestions/clues.
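For reference, the standard projection argument runs roughly as follows, though my text phrases it a bit differently (here $t \in \mathbb{R}$ is an auxiliary scalar I'm introducing, and $Y^\star = g^\star(X)$ is the minimizer of $\mathbb{E}[(Y - g(X))^2]$ over square-integrable functions $g$). Fix any $W = g(X)$. Since $Y^\star$ is the minimizer,

$$\mathbb{E}\big[(Y - Y^\star - tW)^2\big] \;\ge\; \mathbb{E}\big[(Y - Y^\star)^2\big] \quad \text{for all } t \in \mathbb{R}.$$

Expanding the square on the left and cancelling $\mathbb{E}[(Y - Y^\star)^2]$ from both sides leaves

$$-2t\,\mathbb{E}\big[(Y - Y^\star)W\big] + t^2\,\mathbb{E}\big[W^2\big] \;\ge\; 0 \quad \text{for all } t,$$

and letting $t \to 0^{\pm}$ forces $\mathbb{E}[(Y - Y^\star)W] = 0$, i.e. $W$ and $(Y - Y^\star)$ are orthogonal. If the inequality in your text looks like the first display above, it is this variational step.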
Cheers,
Quasar