Functional Analysis or Ordinary Differential Equations?

I am an undergraduate and will be looking to apply to quant programs next year. This semester I have the choice between Functional Analysis (FA) and Ordinary Differential Equations (ODE). I have been warned about the difficulty of FA, but I am definitely intrigued by it and have been told that it has many applications in quantitative finance. Which course would benefit my application more? In other words, will taking Functional Analysis as an undergraduate stand out among other applicants? And is FA more applicable to PDE than ODE is?

Your insights are greatly welcomed.
 
Are you a maths major?
I did FA at university and I am glad of it in my later career. It gives insight.
ODE can be good, but not if it covers only the theory; numerical ODE is a must.
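
To give a flavour of what I mean by the numerical side (a throwaway Python sketch of my own, not taken from any particular course): forward Euler and classical RK4 on the test equation dy/dt = -2y, y(0) = 1, compared with the exact value exp(-2).

Code (Python):
import math

def euler(f, y0, t0, t1, n):
    # Forward Euler: y_{k+1} = y_k + h * f(t_k, y_k); first-order accurate.
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y0, t0, t1, n):
    # Classical fourth-order Runge-Kutta.
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

f = lambda t, y: -2.0 * y        # dy/dt = -2y
exact = math.exp(-2.0)           # exact value y(1) = e^{-2}
print(abs(euler(f, 1.0, 0.0, 1.0, 100) - exact))   # error ~ O(h)
print(abs(rk4(f, 1.0, 0.0, 1.0, 100) - exact))     # error ~ O(h^4)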

Doing FA just to "stand out" is not a good reason for doing FA.
 
Yes, I am a Maths Undergraduate. I am genuinely interested in FA, but I unfortunately have to choose between the two.
 
If you haven't taken any ODE course before, it's probably better to take it as soon as possible: ODEs come up more often in beginner-level quant finance, and a lack of ODE/numerical-methods background may be a red flag to some programs. I'm taking a topics class in analysis right now which covers a decent amount of functional analysis. FA is used a lot in probability theory, but the material seems quite theoretical and may not add much value from an engineering point of view. But I have just started learning FA, so don't take my word on this too seriously.

Edit: I just learned how big a role FA plays in modern probability. It is needed a lot in measure-theoretic probability, and with a measure-theoretic approach there is, in a sense, no distinction between discrete and continuous random variables, or between univariate and multivariate distributions. It frees you from horrible algebraic manipulations. This explains it better: http://www.stat.yale.edu/~pollard/Courses/600.spring06/Handouts/Chapter1.pdf
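
To make the "no distinction" point concrete (my own paraphrase, not a quote from the handout): for a random variable X on a probability space (Ω, F, P), the expectation is one integral against the measure P,

\mathbb{E}[X] \;=\; \int_{\Omega} X \, dP \;=\;
\begin{cases}
\displaystyle\sum_i x_i \, P(X = x_i), & X \text{ discrete},\\[6pt]
\displaystyle\int_{\mathbb{R}} x \, f_X(x)\, dx, & X \text{ with density } f_X,
\end{cases}

so the familiar sum and integral formulas are just two special cases of the same object.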
 
I would also vote for numerical ODE. FA is important as a pure-maths foundation; however, it is not that relevant from a computational science/engineering perspective. I believe some people will have lots of fun studying FA, but if you won't use it on a regular basis later in your study/work, you might forget it quickly: use it or lose it. On the contrary, the numerical methods covered in a numerical ODE course will be very useful whether you become an applied mathematician, a statistician, or an engineer.
 
FA is the 'Great Unifier', i.e. it is able to bring many areas of pure/applied/numerical maths into a coherent whole. I see it as bringing my thinking to a clearer level, as it were.



Functional analysis is an abstract branch of mathematics that originated from classical analysis.
The impetus came from applications: problems related to ordinary and partial differential
equations, numerical analysis, calculus of variations, approximation theory, integral equations,
and so on.
 
This is an undergraduate-level ODE course:

This is a graduate-level ODE course:

From what I know, schools in the US have ODE courses with contents more or less similar to the two above.
 
Learning from hands-on projects is really helpful. I wish I'd known about your course when I started graduate school....
I noticed there is a great emphasis on Markov chains under the ML part. There are quite a few topics in ML; it would be nice to know what made you decide to make that choice.

I have a hands-on Applied Numerical Methods online course that is very much focused on computational finance and Machine Learning (ML). A student project is part of the learning process (with hands-on C++ and Python).

 
It would be nice to know what made you decide to make that choice.

  1. Many of the ML algorithms I have already covered in my previous sections. Didn't want to repeat myself.
  2. Markov models in general are powerful and I suspect future-proof (a small sketch follows below).
  3. Intuition/gut feeling (I am not happy with GD and its many workarounds).
  4. Seems to have links to computational finance.
I don't do ML applications as such in the course; I just try to take the mystique out of the story by examining the underlying maths in a bit more detail.
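
A small sketch of the kind of Markov model I have in mind (a made-up two-state calm/stressed regime chain in Python, not an example taken from the course):

Code (Python):
import numpy as np

# Hypothetical two-state regime chain: state 0 = calm market, state 1 = stressed.
# Row i holds the transition probabilities P(next state | current state i).
P = np.array([[0.95, 0.05],
              [0.20, 0.80]])

# The stationary distribution pi solves pi = pi P with the entries summing to 1:
# take the left eigenvector of P belonging to eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print("long-run fraction of time in each regime:", pi)   # roughly [0.8, 0.2]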
 
It is really nice to see a course/book written focusing on the maths behind ML and decoding the black box.

I like Markov models too. They have already been put to great use in many important areas, like speech recognition and gene matching. However, it appears to me that their use in finance is still quite limited. I also hope to see Markov models play an important role in finance in the near future.

Please don't hesitate to elaborate on your opinions about GD, I'd like to read them.

EM algorithms are widely used to estimate Markov models. I've read some papers hinting that the EM algorithm is actually a variant of the gradient descent/ascent method.
Like this discussion on StackExchange:


 
@Lynette Zhang
Sorry for the delay.
Here are some of the issues with GD as I see it.
  1. Inside GD lurks a nasty Euler method.
  2. The initial guess must be close to the real solution (Analyse Numerique 101).
  3. No guarantee that GD is applicable in the first place (it assumes the cost function is smooth).
  4. "Vanishing gradient syndrome" (https://en.wikipedia.org/wiki/Vanishing_gradient_problem).
  5. The learning-rate parameter: so many to choose from (an ad hoc, trial-and-error process).
  6. Use Armijo and Wolfe conditions to improve convergence.
  7. Modify the algorithm by adding momentum.
  8. And you have to compute the gradient: 1) exact, 2) FDM, 3) AD, 4) complex-step method.
  9. Convergence to a local minimum.
  10. The method is iterative, so no true, reliable quality of service (QOS).
  11. It's not very robust (cf. adversarial examples). Try regularization.

There might be some more.

//
What are the alternatives? A gradient system (ODE), or solving an SDE in combination with Boltzmann annealing, which seems to find the global minimum.
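
To illustrate point 1 and the alternatives (my own toy Python sketch, nothing more): plain GD is exactly explicit Euler applied to the gradient system x' = -grad f(x), and one common form of the SDE idea is Langevin dynamics with a cooling schedule (Euler-Maruyama below).

Code (Python):
import numpy as np

def grad_f(x):
    # Toy non-convex cost f(x) = x^4 - 3x^2 + x:
    # local minimum near x = 1.1, global minimum near x = -1.3.
    return 4 * x**3 - 6 * x + 1

# 1) Plain gradient descent == explicit Euler on the gradient system dx/dt = -grad f(x).
x, h = 2.0, 0.01                 # initial guess; the learning rate is the Euler step size
for _ in range(2000):
    x -= h * grad_f(x)           # Euler step
print("gradient descent:", x)    # stuck in the local minimum near 1.1

# 2) Langevin SDE with a cooling schedule: dX = -grad f(X) dt + sqrt(2 T(t)) dW.
#    The noise lets the iterate escape shallow basins; as T -> 0 it tends to settle
#    in the deeper minimum (no hard guarantee in a finite run).
rng = np.random.default_rng(42)
x = 2.0
for k in range(20000):
    T = 1.0 / np.log(k + 2.0)    # slow cooling
    x += -h * grad_f(x) + np.sqrt(2.0 * T * h) * rng.standard_normal()
print("annealed SDE:", x)        # often ends up near the global minimum at -1.3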
 
@Daniel Duffy

This is great! Thanks for sharing.

I know GD, but I don't have working experience with it. I have some experience applying the EM algorithm to various Markov models. There are some points you mentioned that I can relate to, listed below:

  • Initial guess close to real solution (Analyse Numerique 101).
  • No guarantee that GD is applicable in the first place (assumes cost function is smooth).
  • Convergence to local minimum.
  • The method is iterative, so no true reliable quality of service (QOS).
  • It's not very robust
Besides those points, what I really dislike is the way many machine-learning people treat problems. From my experience, standard Gaussian or Gaussian-mixture based models do okay with iterative optimization, even though it has the drawbacks above. But when the model becomes more complicated, if you take a close look at how they treat it, it's not rare to find them making approximations or quick-and-dirty fixes without saying so and without any mathematical reasoning behind them. Sometimes those quick-and-dirty tricks don't make sense to me, or they cannot be fully justified. I like numerical methods, but I don't like quite a few ML methods because of this.
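
For what it's worth, this is the kind of iterative optimization I mean: a bare-bones EM fit of a two-component 1-D Gaussian mixture in Python, written from the standard update formulas (my own toy example, not from any particular paper).

Code (Python):
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: two Gaussian clusters.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

# Initial guesses -- EM shares the "initial guess" sensitivity mentioned above.
w = np.array([0.5, 0.5])          # mixture weights
mu = np.array([-1.0, 1.0])        # component means
sigma = np.array([1.0, 1.0])      # component standard deviations

for _ in range(100):
    # E-step: responsibilities gamma[i, k] = P(component k | x_i).
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    gamma = w * dens
    gamma /= gamma.sum(axis=1, keepdims=True)

    # M-step: closed-form re-estimation; each sweep cannot decrease the likelihood.
    nk = gamma.sum(axis=0)
    w = nk / len(x)
    mu = (gamma * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((gamma * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(w, mu, sigma)   # should recover roughly [0.6, 0.4], [-2, 3], [1, 0.5]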

By the way: I was asked about ML by an interviewer who is a computer-science guy. I told him my thoughts and he was pissed off right away. And I just wanted to discuss it with him....
 