I am pursuing a graduate degree in Mathematics, and this is my last semester. Among the courses I have taken that I believe are useful for a quant are Probability, Stochastic Calculus, Mathematical Statistics, and two semesters of Linear Algebra. This semester I am deciding among the following six courses, of which I can take four. I have attached the course descriptions for each.
Numerical Methods I
Numerical errors, conditioning, and stability; function approximation; numerical linear algebra; root-finding and optimization; numerical integration, differentiation, and interpolation; spectral methods; Monte Carlo methods
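To give a sense of the last topic, here is a minimal Monte Carlo sketch (my own illustration in Python, not course material): estimating pi by sampling points in the unit square.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi by sampling uniform points in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle has area pi/4, so scale the hit fraction by 4.
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # roughly 3.14; the error shrinks like 1/sqrt(n)
```

The same sampling-and-averaging structure is what makes Monte Carlo so common in derivative pricing, which is part of why this course looks relevant to me.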
Methods of Applied Math
This is a first-year course for all incoming PhD and Master's students interested in pursuing research in applied mathematics. It provides a concise and self-contained introduction to advanced mathematical methods, especially in the asymptotic analysis of differential equations. Topics include scaling, perturbation methods, multi-scale asymptotics, transform methods, geometric wave theory, and the calculus of variations.
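As an illustration of the perturbation methods mentioned (my own toy example, not from the syllabus): for small $\epsilon$, the positive root of $x^2 + \epsilon x - 1 = 0$ expands as

```latex
x(\epsilon) \;=\; \frac{-\epsilon + \sqrt{\epsilon^2 + 4}}{2}
          \;=\; 1 \;-\; \frac{\epsilon}{2} \;+\; \frac{\epsilon^2}{8} \;+\; O(\epsilon^3),
```

which one obtains either from the exact root or by substituting the ansatz $x = x_0 + \epsilon x_1 + \epsilon^2 x_2 + \cdots$ and matching powers of $\epsilon$.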
PDE I
This course is a basic introduction to PDEs and is designed for students who are interested in applied mathematics or analysis and PDEs. The concentration is on concrete examples of PDEs that arise in various physical systems, and methods of solving these problems will be introduced. The class will cover the following topics: first-order equations, methods of characteristics, conservation laws, shocks, weak solutions, Hamilton-Jacobi theory and caustics; wave equations, the method of spherical means, Duhamel's principle; the heat equation, the fundamental solution, diffusion and Brownian motion; Laplace's equation, maximum principle, fundamental solutions, Dirichlet and Neumann problems, boundary layer potential; Fourier methods and dispersive equations.
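To make the heat-equation topic concrete, here is a bare-bones explicit finite-difference sketch (my own illustration in Python; the scheme and parameters are my assumptions, not course code):

```python
def heat_step(u, alpha):
    """One explicit Euler step for u_t = u_xx on a uniform grid with
    zero Dirichlet boundaries; alpha = dt/dx^2 must be <= 0.5 for
    stability."""
    new = u[:]
    for i in range(1, len(u) - 1):
        new[i] = u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1])
    return new

# A hot spot in the middle of a cold rod; heat diffuses outward,
# and by the maximum principle the peak can only decrease.
u = [0.0] * 21
u[10] = 1.0
for _ in range(200):
    u = heat_step(u, 0.25)
```

The connection the description draws between the heat equation and Brownian motion is exactly the link (via Feynman-Kac) that makes this material show up in option pricing.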
Numerical Optimization
A large number (one could even argue the majority) of problems in science, engineering, medicine, and business involve optimization, in which we seek to minimize or maximize an "objective function" subject to constraints. This course will survey widely used methods for continuous optimization, focusing on both theoretical foundations and implementation as numerical software. Topics include linear programming (optimization of a linear function subject to linear constraints), line search and trust region methods for unconstrained optimization, and a selection of approaches (including active-set, sequential quadratic programming, and interior methods) for constrained optimization.
The course will consider both (i) mathematical analysis of the theoretical properties of optimization problems (such as optimality conditions) and methods (such as convergence); and (ii) numerical issues, such as how to compute the solutions of associated subproblems efficiently and stably.
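For flavor, here is the line-search idea from the description in its simplest form (my own sketch, not course code): gradient descent with backtracking (Armijo) line search on a one-dimensional toy problem.

```python
def minimize(f, grad, x0, lr0=1.0, tol=1e-8, max_iter=1000):
    """Gradient descent with a backtracking (Armijo) line search:
    start from a trial step and halve it until the step gives a
    sufficient decrease in f."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        t = lr0
        # Armijo sufficient-decrease condition with constant 0.5.
        while f(x - t * g) > f(x) - 0.5 * t * g * g:
            t *= 0.5
        x = x - t * g
    return x

# Minimize f(x) = (x - 3)^2; the minimizer is x = 3.
x_star = minimize(lambda x: (x - 3)**2, lambda x: 2*(x - 3), x0=0.0)
```

Real methods replace the scalar `x` with a vector and the halving rule with more careful step selection, but the structure (descent direction plus a step-size safeguard) is the same one the course analyzes.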
High Performance Scientific Computing
This class will introduce the student to the fundamentals of parallel scientific computing. We will first establish an understanding of different types of machines from the point of view of large-scale floating-point-heavy workloads. This will include a study of CPU and GPU architectures, interconnects, and forms of parallel memory. The practicalities of programming these machines (MPI, OpenMP, OpenCL/CUDA) will be introduced, accompanied by homework assignments for each of these three major approaches. Issues such as load balancing, communication, and synchronization overhead will be addressed throughout, and established practice in the field in the form of parallel numerical algorithms will be studied. Since a prerequisite for good parallel performance is good serial performance, this aspect of high-performance computing will also be addressed. Along the way, important tools for scientific computing will be emphasized, including, for example, debuggers, Makefiles, and version control systems.
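The split-the-work, combine-the-results pattern behind much of this can be sketched in a few lines (my own illustration in Python rather than the C++ the course requires):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, n_workers=4):
    """Split the data into one chunk per worker, sum the chunks
    concurrently, then combine the partial results. Load balancing
    here is just even-sized chunks."""
    chunk = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(sum, chunks)
    return sum(partials)

total = parallel_sum(list(range(1_000_000)))
```

(In CPython the GIL means this shows the structure rather than a real speedup; the MPI/OpenMP/CUDA versions the course teaches are what deliver actual parallel performance.)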
Machine Learning
The course covers a wide variety of topics in machine learning, pattern recognition, statistical modeling, and neural computation. It covers the mathematical methods and theoretical aspects, but will primarily focus on algorithmic and practical issues.
Machine Learning and Pattern Recognition methods are at the core of many recent advances in "intelligent computing". Current applications include machine perception (vision, audition, speech recognition), control (process control, robotics), data mining, time-series prediction (e.g. in finance), natural language processing, text mining and text classification, bio-informatics, neural modeling, computational models of biological processes, and many other areas.
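For a sense of the algorithmic side, here is about the simplest classifier there is, a 1-nearest-neighbor rule (my own minimal sketch, not course code):

```python
def predict_1nn(train, point):
    """Classify `point` with the label of its nearest training
    example under squared Euclidean distance.
    train: list of ((feature, ...), label) pairs."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda ex: dist2(ex[0], point))
    return label

# Two labeled points; a query near (1, 1) should be labeled "up".
train = [((0.0, 0.0), "down"), ((1.0, 1.0), "up")]
print(predict_1nn(train, (0.9, 0.8)))  # prints "up"
```

The time-series-prediction application mentioned above is the one most directly relevant to the trading work I am aiming for.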
Some more background on myself and the courses: my coursework in graduate school so far has been very light on programming. Machine Learning, Numerical Methods, Numerical Optimization, and High Performance Scientific Computing all have a programming component. The first two allow you to use any language (I would most likely use Matlab for them), while Numerical Optimization requires Matlab and High Performance Scientific Computing requires C++. I would like to work as a quant and move into quant trading later on.
I'm pretty sure that I want to take Numerical Methods, Machine Learning, and Numerical Optimization. Of the three remaining courses, which would be the most useful to know and the most important for breaking into the industry? And if you could choose any four of the six courses, based on the same criteria, which would you pick?