
NEW MFE Rankings ???

Joy Pathak

So....

I am thinking of building a new set of rankings due to the overwhelming response I have gotten through emails asking me to compare programs for people. I might turn it into a full-fledged paper, or something of that sort.

I have three volunteers who will help me collect data, so it should be feasible. It will probably be up by mid-May or later.

I am just going to pick 25 universities that commonly show up in people's comparisons. There are SO many programs now that I am not going to bother with the rest. I will compare 25, maybe only 20, depending on how much time it takes.

The programs will be strictly Financial Engineering and Financial Mathematics, plus any MS Finance programs with concentrations in the above, if I can find them.

I am looking for advice on variables.


This is what I have right now... IN NO ORDER.

1) Location - If you are within a 1-hour distance of a financial center you get 10 points, and you lose 2 points for every additional hour, so if you are more than 5 hours away you get 0. Distance will be calculated by what Google Maps shows me.

2) # of Practitioners on the Teaching Faculty for the Program - If you have 5 or more, you get the full 10.

3) Strength of the Business School - I will pick one of the standard rankings lists for this and use it to assign points accordingly, probably U.S. News or BusinessWeek. This is more about the general name recognition of the business school on a large scale.

4) Reviews posted on QuantNet - If your school's students have posted reviews on QuantNet, and they are good overall, you get the full 5 points. If your students don't take the time to post a review promoting their own university, it seems they might not be doing a good job, etc. (This is up for debate.)

5) Placement Statistics - If you have placement stats on your website, you will be included in this list and ranked according to the number of students who have jobs; otherwise you get 0. I might allow some leeway on this one and call the program director to ask. If I can get through easily and get stats on the phone, you get 5/10. (Up for debate.)

6) Age of the Program - The older the better. If the program is 5 years old or older, you get 10 points; 0 otherwise. (Up for debate.)
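To make the proposal concrete, here is a minimal sketch of how the point scheme above could be computed. The cutoffs follow the post; the behavior below 5 practitioners and the equal weighting of categories are my own illustrative assumptions, not settled methodology:

```python
# Illustrative sketch of the proposed point scheme (not the final methodology).

def location_points(hours_to_hub):
    """10 points within 1 hour of a financial center, minus 2 per extra hour."""
    if hours_to_hub <= 1:
        return 10
    return max(0, 10 - 2 * (hours_to_hub - 1))

def practitioner_points(n_practitioners):
    """Full 10 points at 5+ practitioners; the post leaves <5 unspecified,
    so linear scaling below that is an assumption."""
    return min(10, 2 * n_practitioners)

def age_points(years):
    """10 points if the program is at least 5 years old, else 0."""
    return 10 if years >= 5 else 0

def total_score(hours_to_hub, n_practitioners, years,
                review_pts, bschool_pts, placement_pts):
    # Unweighted sum purely for illustration; the discussion suggests
    # location would carry a lower weight in the real ranking.
    return (location_points(hours_to_hub)
            + practitioner_points(n_practitioners)
            + age_points(years)
            + review_pts + bschool_pts + placement_pts)
```

For example, a 6-year-old program 3 hours from a hub with 5 practitioners would earn 6 location points, 10 practitioner points, and 10 age points before the remaining categories are added.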


Let me know! If any of these are totally bogus, feel free to say so. Also, let me know which programs to include. I will use most of the programs from the previous MFE ranking on QN, adding some and removing some.
 
Are you planning on including Columbia MSOR as well? It would be interesting to see how the program fares against other MFE programs.
 
I don't think he would do that. Columbia MSFE would be taken into account on its own, as far as I know. Any comment, Joy?
 
Are you planning on including Columbia MSOR as well? It would be interesting to see how the program fares against other MFE programs.


I will have to think about that. I don't think it is fair to include an Operations Research program. Columbia's MFE and Fin Math programs will most likely be included.

Including three Columbia programs might be too much. Maybe if you can send me the data for some of the variables, I will entertain it.
 
Interesting set of proposals for the new ranking. It will only go as far as the data you can collect.

Including MS Finance programs would be a murky matter. They are nearly as commonplace as MBA programs, so it may make sense to narrow it down to MFE and MathFin (FinMath) programs only.

Columbia MSOR belongs with the group of Operations Research programs. You can't compare them when you have no data from them.
 
Interesting set of proposals for the new ranking. It will only go as far as the data you can collect.

Including MS Finance programs would be a murky matter. They are nearly as commonplace as MBA programs, so it may make sense to narrow it down to MFE and MathFin (FinMath) programs only.

That's the great part of the methodology: if data is hard to find, you get a 0 for poor marketing of your program. :) That's why, for placement stats, I am willing to call them once as a grace. The rest should not be hard. I already have a bit of data for a few programs. Two other people are collecting data as well, and another will start as soon as his exams are done.

If more people volunteer, I can increase the size of the sample, even if they only send data for their own respective programs.

Regarding MS Finance, I will only add the ones that have strong quant curricula and quant concentrations.

E.g., Princeton MS Finance and IIT MS Finance (Fin Eng concentration). MIT MS Finance wouldn't be included. Those are the only two MS Finance programs I am thinking of including.
 
As far as data goes, I'd like to see the data for the past 5 years of any program. A partial, one-year data point is not going to help paint the whole picture.


It will be currently available data. How am I supposed to find 5 years of data on faculty or placements? I would have to look through archives of websites. Professors who teach part-time while working change fairly often.

Whatever is available online, I will gather. For placements it will most likely be last year's data, or maybe two years', depending on what's available. If the currently listed faculty includes what is required, it will be gathered. Most likely it will be everything available for 2009. It will still give a decent comparison, considering several variables are being used and the ranking won't be based on one particular factor.
 
You don't look online. You approach them directly.
A program either decides to publish data or it doesn't. There is no point in collecting only the "good/public/marketing" data.
They may make the excuse that the data on their website is old and outdated if you don't ask them. In any case, they should know for what purpose you are soliciting data.
 
1) I disagree with location. First, what would you define as financial centers? I think the important aspects of location are captured in job placements. The assumption is that if your program is closer to a financial center then it will be easier to get a job, but if you get a good job, who cares where your university is located?

2) I think having practitioners is good and important, but you are assuming that they are flat-out better. I would want a program to have a mix of practitioners and academics. An imperfect way to judge the quality of an academic would be which university they received their PhD from (or perhaps the number of papers they have published).

3) Like it. USNews, BusinessWeek, or FT depending on what their methodology is.

4) Disagree. What happens if a program has no students on this forum? This would be biased unless you were able to compensate.

5) Like it. Just be careful, because different programs are likely to present the statistics differently. For example, the Baruch class of 2004 had 8 of 10 placements but 14 graduates, 4 of whom went on to PhDs. Saying 8 of 10 is the correct statistic, while saying 8 of 14 would be misleading. The distinction is easy to make with Baruch because of how the data is presented, but other programs may report differently.
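To pin down the Baruch example, here is a tiny sketch of the denominator issue (the function is mine and purely illustrative; the numbers are the ones quoted above):

```python
def placement_rate(placed, graduates, phd_bound=0):
    """Placement rate among graduates who actually sought jobs,
    excluding those who went on to PhD programs."""
    seeking = graduates - phd_bound
    if seeking <= 0:
        raise ValueError("no job-seeking graduates to compute a rate over")
    return placed / seeking

# Baruch class of 2004, as quoted above: 14 graduates, 4 to PhD, 8 placed.
correct = placement_rate(placed=8, graduates=14, phd_bound=4)  # 8/10 = 0.8
naive = placement_rate(placed=8, graduates=14)                 # 8/14, understated
```

Comparing programs fairly would require normalizing every program's statistics to the same denominator convention before scoring.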

6) Disagree. Yes, the current trend is that older programs are better, but what makes a program/university good is not its age but the quality of teaching, reputation, brand, etc.

I don't know if this would work but selectivity could be a possible variable (admittance rate, average GRE score, etc). This assumes that a program is popular because it is good.

What about including cost and duration? Not as heavily weighted as, say, placement or teaching faculty, but these are both big issues for students.

I am almost tempted to throw in the ASU finance ranking as a measure of the quality of the overall finance department: http://wpcarey.asu.edu/finance/rankings/ but its usefulness is debatable.

I would like to see a measure of curriculum quality, but this would first require an agreed-upon baseline. It could cover things like the number of electives/options available, the amount of math/stats covered, programming in C++, etc. Highly subjective, I'm afraid.

Problem with including past data is this would limit your pool of comparable programs.
 
Good points.

I think location is an important factor because the networking possibilities are enormous compared to a program that is not near a financial center. Obviously its weighting would be low compared to other variables. The financial hubs considered would be New York, Chicago, and San Francisco. I might include LA, but have not decided yet.

The reason I believe practitioners are important is that this is a professional degree, where we want to be able to apply the knowledge gained immediately and keep up with current trends. There isn't much published research in this field, as most of the work done is of a proprietary nature. Although practitioners cannot tell you this information outright, they can guide you in the right direction, unlike a research professor. I do agree that some of the basic courses should be taught by academics, but all in all, you want knowledge that you will use the day after graduation, and practitioners make this happen. If this were a PhD program comparison, it would obviously be the opposite.

The QuantNet review variable is up for debate. I have not settled on it. If there are lots of people against having it as a possible variable I will not include it.

The reason I score older programs higher is that older programs almost always have a mature curriculum. If you compare the older programs, you can see how they have evolved. Older programs also have a larger alumni base, hopefully (or at least presumably) working in higher positions. I think this is a decent gauge, since this is a professional degree. The program directors have seen the mistakes and have grown the program so as not to repeat them. New programs cannot claim this; the best a new program can do is copy an older program's curriculum.

Actually, I am going to include cost, duration, and internship opportunity. Good point on the first two.

I have not added curriculum, as it would increase the amount of additional effort I would have to put in. I am trying to keep it relatively simple. If I get more volunteers, I will add it.

Admission statistics can be misleading because of the different sizes of programs, so I don't think that is an entirely fair statistic. Some programs are very well advertised and might get more applicants, too.

I am trying to keep the number of programs low, around 20-25 as I mentioned. The programs picked will be those from the previous ranking on QN, with a few changes. If anyone wants additional programs included, send me data for them.
 
...reputation, brand...

This is very subjective. You can't measure any of it, and if you can't measure it, it's better not to take it into consideration.
 
The reason I believe practitioners are important is that this is a professional degree, where we want to be able to apply the knowledge gained immediately and keep up with current trends. There isn't much published research in this field, as most of the work done is of a proprietary nature. Although practitioners cannot tell you this information outright, they can guide you in the right direction, unlike a research professor. I do agree that some of the basic courses should be taught by academics, but all in all, you want knowledge that you will use the day after graduation, and practitioners make this happen. If this were a PhD program comparison, it would obviously be the opposite.

Using "practitioners" as a criterion is problematic. Programs on shoestring budgets are using cheap-to-employ "practitioners" to cover for not having real staff. And these "practitioners" are often complete blockheads who shouldn't be allowed near a classroom. Even where a practitioner is for real and is at the cutting edge of developments, he may not be able to teach or communicate worth a damn.
 
Using "practitioners" as a criterion is problematic. Programs on shoestring budgets are using cheap-to-employ "practitioners" to cover for not having real staff. And these "practitioners" are often complete blockheads who shouldn't be allowed near a classroom. Even where a practitioner is for real and is at the cutting edge of developments, he may not be able to teach or communicate worth a damn.

I think we will just have to go with the assumption that, in general, it is advantageous. Another reason practitioners are beneficial is the possibility of internal referrals into their firms. I think this is very valuable if you have quality practitioners teaching. Your point stands, but for some of the variables, assumptions will have to be made or we will have nothing to compare.

As for cheap teachers: I will also count tenured/tenure-track faculty who held high positions in industry before "retiring" and joining academia to teach in such programs. The minimum of 5 may include both.
 
... is because of possible internal referrals into their firms. I think this is very valuable if you have quality practitioners teaching.

What is a "quality practitioner"? You might have a practitioner who doesn't fall into your "quality" category but who has a lot of connections that translate nicely into a lot of jobs.
 
What is a "quality practitioner"? You might have a practitioner who doesn't fall into your "quality" category but who has a lot of connections that translate nicely into a lot of jobs.

I initially thought of "quality" in the sense that they work or worked in a well-known firm in a high position, but you're right.

It will include ANY practitioner. The assumption is that they are ALL of high quality.
 
I would restart this thread with the question of what qualities make an MFE program great and see what kind of consensus can be reached by the members of the forum.

After that, it might make sense to generate school rankings for each quality identified and then an overall ranking (which might be more subjective in nature).
 
I would restart this thread with the question of what qualities make an MFE program great and see what kind of consensus can be reached by the members of the forum.

After that, it might make sense to generate school rankings for each quality identified and then an overall ranking (which might be more subjective in nature).

There is already a general consensus on what makes a good gauge of MFE programs. In this post I am trying to set some proxies, and possible weightings for each of the variables. I have gone through many replies by forum members to students asking for program comparisons. It would be redundant to ask that question again; several threads already show what members think qualifies a good program.

If I left the question open-ended, it would take forever to settle on a general set of variables, as many of them can be highly subjective. That is why I have decided to outline a few gauges and am taking input on additional ones.


So we are going to include ANYONE, and assume ANYTHING, are we? :)

Just to show you how tricky it is to come up with a good methodology.

If I were you, I would start with the hard, cold dataset and see where to go from there.


Well, there have to be some assumptions; there have to be some ground rules. Some of the most famous methodologies in any field rest on huge assumptions. Everyone wants an answer with no assumptions, but that almost never happens. The question is how you choose proxies that yield results consistent with mainstream thinking.

In the end, if the rankings make sense, we will know the model is good. If it ends up showing that Random University is #1, we will know it is not a good methodology; but if it stays consistent with what the general public thinks, the rankings will reinforce the reality. The important thing is for these rankings to give a general comparison across a range of MFE/MSFM programs, since the majority of students' questions are not about the top programs but about the not-quite-top-10 programs.

The rankings will be backed by some relevant gauges and should give a good picture. We will know in time...
 