2023 QuantNet Rankings of Financial Engineering (MFE) Programs

Hi Andy,

I spoke over email with someone from career placement at Columbia's IEOR, and they said these statistics weren't accurate. Can you please help shed some light on who is correct? Employment statistics weigh heavily in my fall MSFE decision.

Thanks,
Roman
I'll play devil's advocate here, as the data submitted by MFE programs is not as transparent as it is for law school applications, for example.

For 2021, for example, Columbia claims to have made 121 offers, of which 119 people supposedly enrolled (per the Risk.net 2021 rankings), a yield of roughly 98%. Circumstances today are admittedly different from what they were during COVID; yet this year alone I personally know three people who turned down their Columbia offers to accept an offer elsewhere. That is hard to reconcile with the reported data. Granted, I am close to only a limited number of applicants and have no visibility into those many degrees of separation away from me. And yet, even within this very limited circle, the offer-to-enrollment rate is far from the one the program vouches for. (PS: yes, I know, very small sample size; take my point with a grain of salt.)
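As a rough illustration of how unlikely this would be if the reported yield held (the circle of 10 offer-holders below is purely hypothetical; only the 119/121 figure comes from the reported data):

```python
from math import comb

# Reported 2021 yield: 119 enrolled out of 121 offers.
p_enroll = 119 / 121  # ~0.983

# Hypothetical: suppose I know n = 10 offer-holders and observed k = 3 declines.
n, k = 10, 3
p_at_least_k_declines = sum(
    comb(n, i) * (1 - p_enroll) ** i * p_enroll ** (n - i)
    for i in range(k, n + 1)
)
print(f"P(>= {k} declines out of {n}) = {p_at_least_k_declines:.5f}")  # about 0.0005
```

Under those made-up assumptions, seeing three or more declines in a circle of ten would be roughly a 1-in-2,000 event if the reported yield were accurate.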

Supposedly, the data provided to Risk.net is trustworthy. And yet Columbia found itself embroiled in a false-reporting controversy in 2022, for example. That episode demonstrates that, for many programs, it all boils down to a numbers game for rankings, and that (to my knowledge) no third party verifies this information. A professor of mathematics at Columbia (M. Thaddeus) accused the university of submitting "inaccurate, dubious, or highly misleading" statistics to US News (his analysis is here: http://www.math.columbia.edu/~thaddeus/ranking/investigation.html). To quote the professor: "I was kind of radicalized by the experience of being department chair in mathematics from 2017 to 2020. That’s when I saw how secretive, how autocratic, Columbia’s administration is. How they never share relevant information with faculty or students or the public. This episode has just seriously damaged the credibility of the administration."


No definitive conclusion can be drawn yet. However, I would say that Columbia's commitment to transparency is lacking and deserves serious questioning. So far there is no clear incentive for any program to be 100% transparent, and comparing the tracker to their reported numbers leads me to distrust Columbia. Had they provided data the way CMU does, with very detailed employment and career reports, I believe current opinions of Columbia would improve considerably. A lot of people have expressed their discontent with how Columbia communicates with prospective students.

Hope this helps! Quite a complicated issue to tackle, for sure.
 
Great but also terrifying post!
 
I spoke over email with someone from career placement at Columbia's IEOR, and they said these statistics weren't accurate. Can you please help shed some light on who is correct? Employment statistics weigh heavily in my fall MSFE decision.
Of course, employment stats are very important and we treat them as such. We have a process with all the programs to ensure we correctly report the numbers they provide. We don't audit their numbers.
We stand by the numbers we report in all of our rankings.
Hopefully, in the future, this will be a concern of the past, once there is a greater level of data transparency and applicants like you no longer have to guess which data to trust. This is a huge investment of time and money, and a life-changing decision for most people.
 
A professor of mathematics at Columbia (M. Thaddeus) accused the university of submitting "inaccurate, dubious, or highly misleading" statistics to US News.
As far as I know, they addressed this issue publicly and head-on, didn't they? And this article refers to undergraduate college rankings.

Such transparency issues affect most of the programs out there (without naming anyone), not Columbia specifically. The Financial Engineering program could certainly do more to make admission statistics more transparent, but that alone doesn't create grounds for questioning their commitment to transparency.

I do agree with your point that the sample you are in touch with is a very small percentage, and yes, things might be very different this year given the FinEng program's recent plunge in the rankings.
 
As far as I know, they addressed this issue publicly and head-on, didn't they? And this article refers to undergraduate college rankings.
I strongly doubt there is a severe dichotomy between how things are done on the undergraduate side and on the graduate side.

I would agree with you that, on the whole, there needs to be more transparency. However, I would say that some programs do play the game and display all their numbers. It is no coincidence that Columbia remains a subject of contention in numerous conversations on this forum. But quite a few other programs also have very questionable data, and I have seen excuses for why that data cannot be shown.

For info, I am working with QuantNet on aggregating data for all programs, so I have a fairly good sense of which programs play the game and which do not. Some display their data, but it is hard to find. Others report data that contradicts what they submit to other platforms. I am actively investigating this so that there is more transparency during the application process.
 
I am glad people on this forum are starting to recognize some of the potential issues with this (or any) ranking system. I appreciate what this website provides, and without trying to undermine or dismiss its value, QuantNet should be used as a reference, not as fact. It seems some students are not coming to QuantNet for a consultation so much as looking for someone to make an important life decision for them. Incorrect or misleading data will always be an issue - a key reason why participants need to learn how to use this forum properly.

I would urge students to use this website as a tool and not mistake it for the toolbox itself. QuantNet alone should not be your only input when deciding where to attend an MFE or other master's. The quant space could really benefit from thinking outside the box a little more on matters like education, employment, and networking.

Andy's job isn't to tell you where to go to school; it is to aggregate some of the available data to help us make that decision.
 
Exactly, this was on the tip of my tongue. Rankings, by design, can never be all-encompassing and should only be used as one small guiding factor. And I think all of us appreciate QuantNet and Andy's efforts to provide us with the most reliable data to make our decisions easier.
 
For info, I am working with QuantNet on aggregating data for all programs.
Appreciate the work you are doing, buddy; hope this makes programs more willing to disclose correct statistics.
 
I would urge students to use this website as a tool and not mistake it for the toolbox itself.
One could hypothesize that the ability to think critically about data in this way, and to discern useful information from 'noise', is itself predictive of future success as a quant.
 
There should be some way to account for the bias toward salary and employment rate; for example, a $169,516 average in NYC could be standardized with the local CPI as the denominator. An in-state/out-of-state split on the various columns would be telling as well, along with a column like "Employment Rate 3 Months after Graduation (US only)". Employment within the US and employment of US citizens are two different things. The former is the one most would want to see, but these schools attract international talent. If graduates returned to their home countries and got a job, that's relevant. If they got employed in the US (especially within 90 days, considering the visa circus), that's also relevant. Specific slices like this may expose things the programs don't want seen.
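A quick sketch of the kind of cost-of-living adjustment meant here; the price-index values below are illustrative placeholders, not official CPI or regional price parity data:

```python
# Hypothetical cost-of-living adjustment for reported average salaries.
# Index values are made up for illustration (national average = 1.00).
regional_price_index = {
    "New York, NY": 1.23,   # assumed ~23% above the national average
    "Chicago, IL": 1.03,
    "Raleigh, NC": 0.95,
}

def col_adjusted(salary: float, metro: str) -> float:
    """Deflate a nominal salary by the metro's assumed price index."""
    return salary / regional_price_index[metro]

print(f"${col_adjusted(169_516, 'New York, NY'):,.0f}")  # ~ $137,818 in national-average dollars
```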
 
In the upcoming 2023 Risk.net ranking of quant master's programs, the top 3 programs (#1 Baruch, #2 Princeton, #3 Berkeley) are identical to the 2023 QuantNet rankings, which were released in December 2022.
Even when using a different methodology, their top 3 results are the same as our rankings. It would be interesting to see where other US programs line up.
Here is a preview of the new ranking.
 
The full ranking is now posted at

2023 Risk Quant Finance Master’s Programs Ranking

1. Baruch College
2. Princeton University
3. UC Berkeley
4. North Carolina State University
5. New York University
6. Columbia University (MFE)
7. ETH (Switzerland)
8. Ecole Polytechnique/Sorbonne
9. Imperial College London
10. Columbia University

Ranking Methodology
5% – Average class size
10% – Acceptance rate
10% – Percentage of offer-holders who enroll
5% – Ratio between students and lecturers
10% – Percentage of industry-affiliated lecturers
30% – Employment rate six months after graduation
5% – Faculty citations
25% – Average salary, adjusted for purchasing power
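As an illustration of how these weights might combine into a single score; the normalization of each metric to [0, 1] (higher = better) is an assumption here, since the exact method Risk.net uses isn't described above:

```python
# Illustrative weighted-composite scoring using the published weights.
# The [0, 1] normalization of each metric is assumed, not Risk.net's actual method.
WEIGHTS = {
    "class_size": 0.05,
    "acceptance_rate": 0.10,
    "yield": 0.10,               # percentage of offer-holders who enroll
    "student_lecturer_ratio": 0.05,
    "industry_lecturers": 0.10,
    "employment_6mo": 0.30,
    "faculty_citations": 0.05,
    "salary_ppp": 0.25,          # average salary, adjusted for purchasing power
}

def composite_score(normalized_metrics: dict) -> float:
    """Weighted sum of metrics assumed pre-normalized to [0, 1], higher = better."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[name] * normalized_metrics[name] for name in WEIGHTS)

# Example with made-up normalized values for a hypothetical program:
print(composite_score({
    "class_size": 0.7, "acceptance_rate": 0.8, "yield": 0.6,
    "student_lecturer_ratio": 0.5, "industry_lecturers": 0.9,
    "employment_6mo": 0.95, "faculty_citations": 0.4, "salary_ppp": 0.85,
}))  # ~0.8075
```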
 
Does anyone know if the salaries risk.net reports are base, or some combination of base/sign-on/end-of-year-bonus?
I'm fairly certain they are just base salary, which would explain the discrepancies between their data and QuantNet's. Between the two sites you can extrapolate sign-on bonus amounts. For example, QuantNet has Princeton at $199,000 total guaranteed while Risk.net has them at $119,000 salary, meaning the average student there gets an $80,000 signing bonus, which is definitely an outlier compared with other, much lower bonuses. Baruch graduates, on the other hand, have most of their compensation as salary, which is optimal because it is more money in the long run, assuming similar pay trajectories between the two schools.
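A back-of-the-envelope version of this extrapolation; the two figures are the ones quoted above, and treating the entire gap as a signing bonus is itself an assumption, since end-of-year bonuses could account for part of it:

```python
# Implied non-base comp = total guaranteed comp (QuantNet) - base salary (Risk.net).
# Assumes both sources describe the same cohort, which may not hold.
quantnet_total_guaranteed = {"Princeton": 199_000}
risknet_base_salary = {"Princeton": 119_000}

for school, total in quantnet_total_guaranteed.items():
    implied_bonus = total - risknet_base_salary[school]
    print(f"{school}: implied non-base comp = ${implied_bonus:,}")  # Princeton: $80,000
```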
 
Base makes the most sense, but I'm not sure how to get around that $80k bump.
Berkeley would have a $40k bump, which is more reasonable, though still very large.
Every other school's difference makes sense.

The article @Andy Nguyen linked must not be exaggerating; that is a good-sized increase in Baruch's base pay. Their first-year comp reports have had them around $135k, so they got roughly a 10% boost if this holds.
 
Princeton at $199,000 total guaranteed while Risk.net has them at $119,000 salary, meaning the average student there gets an $80,000 signing bonus
I'm fairly confident the breakdown is closer to a $150k base and a $50k sign-on bonus (for the median). I'm highly skeptical that the average student gets an $80,000 signing bonus, as there are very few firms that would give that much, and just as skeptical that the average Princeton student gets a $119,000 base, since plenty of firms pay more than that. I wouldn't fully trust any salary/employment 'stats' - there are a lot of inconsistencies, as almost every source reports something different.
 
I wouldn't fully trust any salary/employment 'stats' - there are a lot of inconsistencies, as almost every source reports something different.
You can trust some more than others. Baruch makes it really easy to find their employment and compensation stats, and they appear to be consistent across three platforms (Baruch's own website, Risk.net, and QuantNet). This doesn't clear them; Andy has said in this thread that he doesn't audit the numbers he gets but works to report them as accurately as the programs provide. It is possible they just put up a united front of misinformation, but I kind of doubt it given how often they put themselves out there.

I once questioned their marketing of the RITC competition (its usefulness and other aspects), and the entire winning team from this year's competition came on the forum to clear up any discrepancies and provide great information. If Baruch were lying about their comp, we would probably know about it by now, and they would get horrible reviews under their college on here, like NYU Tandon got for several years. It's hard to bribe 20-28 college kids into working as propagandists, especially if they just realized they will make $80k less this year than expected.

I'm hesitant to trust all reporting, but I don't think it can be wildly off all the time. Besides, if Princeton was going to lie about this they wouldn't try to downplay it. I'm not willing to throw out the information reported because one school might be getting a ridiculous outlier of an average sign-on bonus. Just do the research and talk to graduates to figure out the weird one-offs.

I'm hesitant, but I trust the large majority of the reports for the schools I've looked into. 80k is the most doubtful thing I've seen though.
 
Besides, if Princeton was going to lie about this they wouldn't try to downplay it. I'm not willing to throw out the information reported because one school might be getting a ridiculous outlier of an average sign-on bonus. Just do the research and talk to graduates to figure out the weird one-offs.
Not really doubting the $199,000 total; I'd be a huge seller of the 119k/80k breakdown, though, especially as I personally know a good chunk of the Princeton graduating class. I think the error is also compounded by approximating the sign-on bonus using one number from one source and another number from a different source.
 
Interesting. I'm not sure what you mean by 'seller' here, but I assume it is in line with your previous comments.
 