Yet another attempt at MFE ranking

Looks like the following schools made the cut:

CMU
Columbia
Cornell
NYU
Princeton
Rutgers
Stanford
Univ. Chicago
Univ. Michigan
UCB

And the following received an honorable mention:

Baruch
Boston Univ
Georgia Tech.
Univ. Toronto
 
Goes to show 1) how arbitrary rankings are, and 2) how biased journalists and "experts" are. Why should Rutgers and NYU be on the list but not Baruch? Could it be that one of the "experts" is an adjunct at NYU?
 
I'm sure they are all great programs. :D Just thank God that an affordable, high-quality program exists at Baruch.
 
bigbadwolf makes an excellent point. Furthermore, the article indicated that the rankings were compiled based upon the information found on the schools' websites combined with the sagacious wisdom of the panel. That's a winning formula... :D
 
I'm sure the people who selected the list represent the complete view of Wall Street's recruiting community. I'm 100% confident that the selection was done scientifically and has no bias. I can guarantee that nobody on the panel has any connection with any of the programs selected.

The board was charged with selecting the top 10 quant schools, based on Wall Street recruitability -- the programs from which Wall Street firms recruit and the programs that produce the best quants (and why).
I don't see any "why" part. Did anyone see it?

It's interesting that AdvancedTrading wants to go down this path. It's a no-win situation for their credibility. Unless they get paid to do it, of course.
 
It seems like a pretty silly ranking, as they don't attempt to explain their methodology or provide any commentary on the programs. It's unfortunate, as it could be a useful resource, given the general paucity of information available to prospective students.

In all likelihood, they slapped this together on short notice, because they needed some content for their website, and hey, rankings produce a lot of hits.
 
There must be some reason headhunters like Dominic avoid the ranking business like the plague. Once you put your name behind any kind of ranking, you either have to defend your method and show that you know what you are talking about, or risk looking like a fool.

Is there anyone out there who has worked with graduates from these programs, knows the structure and differences of each program, and stays up to date with the ever-changing landscape of financial engineering programs?

In my opinion, this is just another attention-grabbing, depth-lacking article from some website trying to benefit from the hype. They're probably trying to fill the void left when FENews went belly up.
This thread is very aptly titled.
 
Hi,

I have an opinion on this matter. But I speak as an international student, and I don't really know how things work in the US, so my opinion might be completely skewed. The universities mentioned by the panel are all highly regarded and very well known in India. As you all might be aware, the IITs are extremely highly regarded in India. From my personal experience I can share some facts. When I was hired by my company, the first question asked was "Are you from IIT?", and only after that were my interviews scheduled. There are teams in my company that hire only from IIT (even though the quality of the work is not that great). A very senior person (who is not from IIT) on another team, with whom I share a great rapport, told me that employees from IITs are paid 1.5 times more than employees from other universities.
I also know friends from IIT who are paid much less because they ended up in domains or companies that do not pay much. Now let me tell you a few things about me. I am not from a top branch like Comp Sci or Electronics, or from the best IIT, but I am probably one of the highest earners of my batch. And that's because I went into the right industry and I had that "IIT" brand name. But then again, I know that for fields like Comp Sci or probably Financial Engineering, you do not really need to be from a top university if you know your "subject". I know colleagues who hold degrees in Biology and yet are doing extremely well in the financial industry because they know their "stuff". So, the bottom line is: if you are good, you are good.

Now let me also talk about the impact of rankings. Rankings do matter and make a difference, although the impact is gradual, not sudden. Prospective students do keep track of them, and they do notice universities that are improving by leaps and bounds (Baruch, for instance). The number of students applying to Baruch from India stands as testimony to this.

Let me further substantiate my standpoint. IIT-Kanpur was, and in my opinion still is, the best IIT, and was consistently ranked 1st in India. Then, for 2-3 years, IIT-Kharagpur was ranked 1st, and the top rankers started opting for IIT-Kharagpur. Now, with the emergence of Mumbai as the "financial hub" of India, IIT Bombay is the most sought-after IIT and the top rankers are opting for it. So the equation keeps changing year after year, but the strength of an IIT depends on the quality of the students it is able to attract. Any university that attracts the best students will eventually be given its due recognition, but, like I mentioned, even that is gradual, not sudden. In conclusion, finding fault with rankings is not the best way to put across a point; instead, the current students and alumni should try to be "stellar performers" at their firms, and eventually this will start being reflected in the rankings.
 
I'm no fan of rankings either, and as pointed out earlier in the thread, there is no "why" part in the article. That hurts its credibility.

However, even if "the board" had presented objective criteria for these rankings, there would be people pointing out flaws in those criteria. I bet those pointing out the flaws would be the ones associated with the institutions not making a high rank. Take, for example, the US News rankings. U.S. News makes its ranking methodology public, but there are plenty of people complaining about it. You won't see people from Stanford/Harvard/MIT etc. among the complainants, though.
 
I am not a fan of such lists, but this one is high on my list of bogus lists.

We can think of several methods for ordering MFE programs, but let us be clear that 4 non-entities do not represent a consensus on Wall Street. Also, at least one of these people has a vested interest, but this is not highlighted...

A list good enough to warrant publishing has the following characteristics:

1) A published methodology.
2) No one in charge who has a vested interest.
3) Opinions drawn from a wide range of employers, headhunters, HR staff, etc.
I take it you guys have done stats: for a list of 10, how would you work out the minimum sample size for a useful level of confidence? (Hint: it is >4; see the sketch after this list.)

4) Objective numbers on the "value add" of the course, i.e., pre/post earnings.

5) And possibly most importantly, any such exercise must identify the crap programs.
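
On the sample-size hint above, here is a minimal back-of-the-envelope sketch, under my own assumptions rather than anything from the article: treat "the share of hiring managers who would put program X in their top 10" as a proportion to be estimated, and apply the standard normal-approximation sample-size formula. The function name and the margins are illustrative.

```python
import math

def min_panel_size(margin: float, z: float = 1.96, p: float = 0.5) -> int:
    """Minimum respondents needed to estimate a proportion p to within +/- margin.

    Standard normal-approximation formula n = z^2 * p * (1 - p) / margin^2,
    with p = 0.5 as the worst case (maximum variance) and z = 1.96 for 95% confidence.
    """
    return math.ceil(z * z * p * (1 - p) / (margin * margin))

if __name__ == "__main__":
    for e in (0.25, 0.15, 0.10, 0.05):
        print(f"margin +/-{e:.0%}: at least {min_panel_size(e)} respondents")
```

Even at a very loose +/-25% margin that comes to 16 respondents, so a panel of 4 cannot support any statistical claim about a top-10 consensus.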

I have written articles for money, and a good piece is measured by the value it adds for the reader. I do not believe that saying X is #2 and Y is #3 adds value, but I do believe that saying "course F is almost unknown, and the few who have heard of it hate it" will help readers avoid crap.

Of course, I have thus ruled P&D out of publishing such a list. I teach on the CQF, and Paul Wilmott is the course director, so I am conflicted.
 
The universities mentioned by the panel are all highly regarded and very well known in India.

Why should this hold for second-tier schools like NYU and Rutgers?

Rankings do matter and make a difference.

Why? You have to provide an argument (I may agree with you, but by the "principle of sufficient reason" you're still obliged to list reasons). What if the rankings are crud? Who decides on the rankings, and on what basis? If the argument for the MFE ranking is that it's the consensus of those hiring, were they extensively polled?

Prospective students do keep track of them, and they do notice universities that are improving by leaps and bounds (Baruch, for instance).

How? Particularly in the face of crud rankings? Conversely, I'm aware of one Midwestern university, with a good second-tier math department, which recently started its own program. The program is crud. The program has no real faculty except "industry experts." The syllabus hasn't been carefully designed. Yet they manage to con international students into signing up based on the general strength and reputation of the math department. I'm sure others here can list their own examples (e.g., Fordham).

but the strength of an IIT depends on the quality of the students it is able to attract

Only partly. Good students are a corollary of the general strength of an institution. That strength derives primarily from the strength and activity of the faculty and its commitment to research and graduate education. Take Dan Stefanica out of Baruch's MFE and the program will start drifting downhill. On the other hand, replace the current students with a less stellar lot, and the reputation won't take a major hit because of the strength of the program itself. I suspect you're thinking of the IITs, which serve as effective filters for the best India has to offer, and which churn out a reliable if unspectacular product. Yet the IITs are not known for outstanding faculty and don't register on the radar screen with regard to research. Universities like Harvard, MIT, and Cambridge do. Which brings me back to the point that assessing their strengths is a complex business: strength of faculty, nature of research, quality of graduate education, quality of facilities, and -- where relevant -- placement information.
 
I really wish I could agree more with BigBadWolf.
Sadly these lists get read, and thus some people who make some decisions will believe them.

We could poll 500 recruiting managers, or rather we could ask them, but persuading that many to respond is a serious work item. Merely mailing them would not get the response rate necessary for credibility.
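
To put rough numbers on that (the response rates below are assumptions for illustration, not survey data): cold-contacting busy hiring managers typically yields a low response rate, so collecting 500 usable answers means approaching far more than 500 people.

```python
import math

def contacts_needed(target_responses: int, response_rate: float) -> int:
    """Number of managers to contact to expect target_responses replies."""
    return math.ceil(target_responses / response_rate)

if __name__ == "__main__":
    for rate in (0.05, 0.10, 0.25):  # assumed cold-outreach response rates
        print(f"at a {rate:.0%} response rate: contact {contacts_needed(500, rate)} managers")
```

At a plausible 5-10% response rate that is 5,000-10,000 contacts, which is exactly why merely mailing them does not get you a credible sample.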

Also, I have contempt for any list that is "US only". Why?

Many quants on Wall Street are not US-educated; indeed, on Wilmott.com there was a surprisingly bitter argument over why such a large percentage of Goldman Sachs quant types were French. For Fordham alumni, I will share that France is not a US state, even if Fordham lists Moliere as part of its "tradition".

You may care about that or not, but the idea that Wall St. (or Chicago) hires only from US schools all by itself tells me that the people running this survey had no bloody idea.

Also, the ranking fails at its most basic purpose (other than to fill space on a page that you can surround with advertisements for MFE programs).

How does it help you make a better decision?
As a recruiter, I already have a view, so it can't help me.

For those choosing an MFE, it lacks so much detail. With all due respect to Andy, the Admin around here, he is an amateur and yet has gathered more useful info than this article.

Aside from status, things I'd like to know:
Which firms hire grads from each place?

MFEs are conversion courses; how good are the tutorials at helping you fix the inevitable holes?

What do the alumni think of the course?

Which types of business do alumni go into?
A school that helps you get into Credit has a very different proposition to one that focuses on Energy derivatives.

Where do students get internships?

...and other stuff.

All of this could be found out with a bit of work. We could do it, and to an extent already have, but as above it would not be proper for us to do so, and commercially it is valuable stuff that we'd need a real motive to give away. (Also, we'd need to be able to defend some of the really harsh things we'd have to say about some programs; that's work, doable, but work.)

I think Andy should continue his work gathering info and then publish the results.
I think it would sell.
 
Folks, rankings do matter.

It's just a question of who looks at them.

Hiring managers: In India, hiring managers look at school rankings and also at how hard admission to the school is. I talked to a Wharton MBA a few days back and she expressed a similar opinion about the US.

Novice students / applicants with less program research: These are the people most likely to take a ranking at face value. If HAAS is number one, they can't think of anything but HAAS.

Applicants with good research: They are not affected much by rankings, as they know they can get to NYC and into a Wall Street firm. They give importance to issues like location, fees, etc.

Faculty:
I believe any teacher would love to see his or her school with the best possible reputation. To feel proud of the place they work, faculty want the rankings to be good.

It's all about perception. Rankings improve perception. There is a theory of reflexivity; if I were to draw an analogy: rankings affect student quality, and better student quality in turn affects rankings. We are all very intelligent, smart people; why are we denying the importance of rankings in a competitive world? If competition exists, rankings matter. If it doesn't, they don't. (How good the published rankings are is just another question...)
 
Faculty:
I believe any teacher would love to see his or her school with the best possible reputation. To feel proud of the place they work, faculty want the rankings to be good.

At a certain level, it stops mattering. The faculty become the school. Suppose Steven Shreve and the group around him were to move to Swampwater University. Then Swampwater would become a top-ranking quant school. To Shreve, his personal professional reputation may matter; that of the department or school he's with becomes secondary or immaterial. The same goes for the best students: they may want to be taught by Shreve or work with him. What school he's at becomes secondary. If they go into the job market and say they worked with Shreve, it has more significance than saying they got a GPA of 3.9 at Harvard. At the top level, the usual rules simply have no meaning, or are broken all the time. If I go to Derman and start speaking about some abstruse and cutting-edge theory, my previous GPA and work experience become irrelevant to his decision to hire me.

What you are saying applies to second- and third-rate academics who take great pride that they're at Swampwater U, which has a higher reputation than Jerkwater College. In like manner, the U you attended, and what GPA you got, assume vast importance in the hands of people who have no other way of assessing your worth (or even worse, are indifferent to your real worth), or where the resume is assessed by a computer program. This is part of an ongoing bureaucratisation of society. In this set-up, a physicist like Einstein or a mathematician like Ramanujan would have remained obscure for lack of an appropriate GPA (Einstein) or lack of credentials (Ramanujan).

We are all very intelligent, smart people; why are we denying the importance of rankings in a competitive world? If competition exists, rankings matter. If it doesn't, they don't. (How good the published rankings are is just another question...)

What rankings are there other than published rankings? "Ranking" isn't some abstract concept that has an independent ontological existence. By definition, it means "published ranking." Ah, perhaps you mean a public consensus on ranking?
 
kapil354 is right: ratings do matter, which is why I get annoyed by the many **** ones that I read.
I do, however, disagree with bigbadwolf about how you get to be a "good school", however that is defined.

Many people base their opinion solely on fuzzy ideas about the school as a whole.
That is cynically exploited by several places, and unfairly hurts others.

Also, there is a serious lag between having good people and that fact hitting the market.

BBW also asks what other rankings exist?
My answer is that there are many. GS London has a different one to GS NY, each manager has his own, and we have spent some effort building our own, which may be unique amongst HHs.
 