I am a student in a graduate computer science program and am deciding which course to take this fall semester. My goal is to work as a quantitative analyst, developer, or trader. Which of the following courses will best help me achieve my goal and why?
Optimization: Formulation of linear programming problems and solutions by the simplex method. Related topics such as sensitivity analysis, duality, and network programming. Applications include such models as resource allocation and production planning. Introduction to interior-point methods for linear programming.
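To give a concrete sense of what "formulation of linear programming problems" means in that course, here is a minimal sketch of a resource-allocation LP; the use of SciPy's linprog and the specific numbers are my own illustration, not part of the course description.

```python
# A small resource-allocation LP of the kind the Optimization course covers.
# SciPy's HiGHS backend uses simplex / interior-point methods under the hood.
from scipy.optimize import linprog

# Maximize profit 3x + 5y subject to resource limits.
# linprog minimizes, so the objective is negated.
c = [-3, -5]                        # objective coefficients (negated for maximization)
A_ub = [[1, 0],                     # machine-hours on line 1 used per unit of x
        [0, 2],                     # machine-hours on line 2 used per unit of y
        [3, 2]]                     # labor-hours used per unit of x and y
b_ub = [4, 12, 18]                  # available machine- and labor-hours

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)],  # x, y >= 0
              method="highs")
print(res.x, -res.fun)              # optimal production plan and profit
```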
Distributed Computing Principles: Studies the abstractions and algorithms that constitute the foundations for implementing concurrent and distributed computing, with emphasis on supporting fault-tolerance. Topics vary to reflect advances in the field but typically include global state snapshots, causality and clocks (logical and physical), agreement and consensus, primary-backup and state-machine replication, quorums, and gossip. Students undertake a substantial software project to put these ideas into practice.
Parallel Computer Architecture: Principles and trade-offs in the design of parallel architectures. Emphasis is on latency, bandwidth, and synchronization in parallel machines. Case studies illustrate the history and techniques of shared-memory, message passing, data flow, and data-parallel machines. Additional topics include memory consistency models, cache coherence protocols, and interconnection network topologies. Architectural studies presented through lecture and some research papers.
I am also taking Probability and Statistics II, Machine Learning, and Heuristic Methods for Optimization.