2024 QuantNet Rankings of Financial Engineering (MFE) Programs

I just rechecked GATech's price tag. UChicago easily wins: I've seen many scholarships that bring its cost under $40k, and many more that bring it in line with GATech's. If you're a strong student, nothing comes close to the value of Baruch and UChicago with scholarships.
Updated the ranking to point to this
I've seen some members here receive the Alpha scholarship, which used to go up to 70% of total tuition; currently the max is 60% off. That's the most generous I've seen out of all the programs.
 
Whoa. I was looking forward to Stony Brook's ranking, but I guess they didn't provide placement statistics this year either.
 
Hello, why do you think MIT is ranked this low? I'm curious about the employment statistics of only the graduates who went into quant finance, since MIT MFin is super flexible, with a lot of people going into IB and consulting.
 
I would recommend lowering the weight of peer assessment in the future. This component is highly subjective and may suffer from information asymmetry.
 
And UChicago improved a lot, from 12 to 6! Other notable changes include Columbia MFE rising from 9 to 5, while Columbia MathFin did the exact opposite, falling from 5 to 9. Another surprise to me is that CMU took 3rd place from Berkeley. What do you guys think about this new ranking? Thank you @Andy Nguyen for your great work; it is truly helpful.
I predicted this would happen soon, but didn't imagine it would be this soon!

Does ranking matter all that much inside QuantNet's top 10?
 
Interestingly enough, my opinion of the importance of rankings changed quite a bit between when I was a student and now, after working in the industry for a while. I gave a lot of weight to rankings when I was applying to schools, but from working at one of these shops I came to realize that JS/HRT/Optiver/SIG etc. really do not care and have no knowledge of the rankings whatsoever. My coworkers mostly only recognize the school name, but wouldn't know that Princeton MFin is ranked significantly higher (and is harder to get into) than MIT MFin, for example.
 
What do you think accounts for the disparity in employment outcomes between Princeton MFin and MIT MFin (~$100k difference in salary + sign-on bonus)? Is it cohort size, career services, quality of students, all of the above?
 
Personally, I don't think median statistics are that useful, because tbh the median student in MFE programs tends to be quite weak. I think the correct attitude for getting into a top prop shop is to look at just the max statistic and try to match or beat it. In that regard, the best students at MIT MFin aren't too different from the best students at Princeton.

That being said, the disparity in employment outcomes exists because Princeton simply has the highest concentration of talent (arguably tied with Baruch). A small cohort helps here because you can be much more selective: it's far easier to find 30 people with extremely good credentials than 100. Even before joining the program, Princeton students generally have better credentials than the majority of people in other MFE programs, and this helps a lot for the internship search, which should start before your program's start date anyway. At the top level, career services don't really do anything; it's mostly self-driven. My firm does not care about your connections at all; we only value how we think you will perform. I dislike it when people blame career services, because I feel they use that as a crutch when they should be blaming themselves.

Given two very qualified students applying to, say, Jane Street, whether they went to Princeton or MIT means almost nothing.
 
Hello everyone, I was looking at the variables in the methodology used for the rankings. However, some are missing from the statistics for each program, such as those related to student selectivity (weighted 25%) and the employer survey score (10%).
Does anyone know where this information can be found, including for past years?
 
The employer survey score isn't going to be released. I think student selectivity is the acceptance rate, but I'm tired and not certain.
 
Could there be another ranking showing the same data but without the peer score? The reason being that peer reviews are anonymous and can be largely subjective. The other numbers seem more objective, imo, and the 10% from the peer score could be split between placement and selectivity.
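For what it's worth, anyone could recompute this themselves from the published per-category scores. Here's a minimal sketch of dropping one component and splitting its weight among others; the category names, weights, and program scores below are purely illustrative, not QuantNet's actual methodology or data:

```python
# Hypothetical category weights (must sum to 1.0). Only the 25% selectivity
# and 10% peer/employer figures come from this thread; the rest is made up.
weights = {
    "peer_assessment": 0.10,
    "employer_survey": 0.10,
    "placement": 0.55,
    "selectivity": 0.25,
}

# Made-up per-program category scores on a 0-100 scale.
programs = {
    "Program A": {"peer_assessment": 90, "employer_survey": 85,
                  "placement": 80, "selectivity": 75},
    "Program B": {"peer_assessment": 70, "employer_survey": 88,
                  "placement": 86, "selectivity": 80},
}

def composite(scores, weights):
    """Weighted sum of category scores."""
    return sum(weights[k] * scores[k] for k in weights)

def drop_and_redistribute(weights, drop, to):
    """Remove category `drop` and split its weight equally among `to`."""
    w = dict(weights)
    share = w.pop(drop) / len(to)
    for k in to:
        w[k] += share
    return w

new_w = drop_and_redistribute(weights, "peer_assessment",
                              ["placement", "selectivity"])
assert abs(sum(new_w.values()) - 1.0) < 1e-9  # weights still sum to 1

for name, scores in programs.items():
    print(name,
          round(composite(scores, weights), 2),   # with peer score
          round(composite(scores, new_w), 2))     # without peer score
```

With these invented numbers, dropping the peer component lowers Program A (which scores high on peer assessment) and raises Program B, which is exactly the kind of sensitivity check the suggestion above is asking for.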
 