Rankings

US News and World Report 2017 MBA ranking

When the number of options is huge, it is important to have tools that help you with your decision. More often than not, prospective students learn about their options through MBA rankings, but that is only the first step: after looking at the rankings, students need to research each school separately – employment reports, regional placement, curriculum, quality of the student body, flexibility, potential impact on you as a person and a professional, and so on. It follows that after students look at the rankings and choose 5-7 schools close to each other (a few stretches, reaches and safe schools), they should not be surprised by large differences when they dig into employment reports and other detailed data. That is one of the reasons US News and World Report is the most popular ranking tool – there are not a lot of fluctuations, and similar schools are grouped together. (When I first started looking at MBA programs, I didn’t know how strong Kellogg is because I used the FT rankings, which have very little to do with reality; if not for further research, I wouldn’t have known that Kellogg is a top-5 school.) But this year it might be different, at least in the top 20. A few upsets made me question this ranking and reassess its value.

Booth tied with Stanford at #2

This is the first thing that will catch the eye of anyone familiar with the top schools’ reputations. Yes, HBS took #1 again, which surprises no one, but Booth made the largest leap among the M7: it surpassed Wharton and tied with Stanford, which is interesting. It seems placement percentage has a huge impact among the top-5 schools. If we take a look at their reports, Stanford students appear to take more time to find jobs:


MBA Employment reports 2015 Booth / Stanford

18% of students looking for jobs accepted offers in the 3 months after graduation. At weaker schools I would have suspected that these students simply had to keep looking after graduation because they couldn’t secure an offer earlier, but at elite schools that reasoning is most likely wrong. This anomaly, I suspect, is explained by PE/VC/HF and startup recruitment: those employers are the last to hire and require additional time commitments from students. If that guess is correct, it tells us that Stanford students are not afraid of taking risks, which speaks volumes about their confidence in their employment opportunities. Unfortunately, that is not measured by the US News report, making it a bit less trustworthy. I know for a fact that the Booth career office encourages students not to accept offers they don’t like, as it is confident it can get its MBAs better ones. It follows that not accepting an offer is not necessarily a bad thing.

Yale jumps 5 places to #8 or Ted Snyder is at it again

I think this is the most talked-about change this year. Many of us predicted that Yale would enter the top 10 within a few years, but they did it in 2016, and not only did they enter the top 10, they tied with Tuck for #8 and surpassed CBS at #10. I personally think this change was a bit premature. Yes, the school is great, but here is the data I extracted from employment reports and LinkedIn for the class of 2015:


MBB/Consulting ratio at top schools (GSB and HBS probably have even better results)

As an homage to the Big Mac Index, I used an MBB/Consulting index to put employment reports in perspective. I didn’t do the same for IB due to the decline in placement; in 2016, consulting rules MBA employment. This small and somewhat flawed report (LinkedIn can’t be fully trusted, hence the lack of Wharton) shows the difference in QUALITY versus quantity, something that MBA rankings don’t take into account but should. Yale is a great program, but it is not better than CBS, it should not be tied with Tuck, and it is not that far ahead of Darden (if at all) this year. By 2020, Yale will probably be among the top-8 schools on merit, not because of flaws in rankings – especially since rankings are self-fulfilling prophecies. If before, Yale admits were often choosing between an M7 school and a top-15 school, now they will be choosing M7 vs top-8; while the first was a difficult choice, the second seems much easier. Add to that Yale’s name, which is much better known than that of most M7 schools, with the exception of Harvard, Stanford and MIT.
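For anyone who wants to reproduce the index, it is just one ratio per school. Here is a minimal sketch with invented placement counts (the real figures come from each school’s employment report and LinkedIn; school names and numbers below are hypothetical):

```python
# MBB/Consulting index: the share of a school's consulting hires that went
# to McKinsey, Bain or BCG. All counts below are made up for illustration.
placements = {
    # school: (MBB hires, total consulting hires)
    "School A": (45, 90),
    "School B": (20, 80),
}

mbb_index = {
    school: mbb / consulting
    for school, (mbb, consulting) in placements.items()
}

# Print schools from strongest to weakest MBB concentration.
for school, ratio in sorted(mbb_index.items(), key=lambda kv: -kv[1]):
    print(f"{school}: {ratio:.0%} of consulting hires went to MBB")
```

The point of the ratio, as with the Big Mac Index, is that it is unit-free: a school that sends 90 people into consulting but few into MBB scores lower than a smaller class with a high MBB share.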

New York schools drop in rankings, CBS #10 (-2) and Stern #20 (-9).

Though both schools are placed lower than they should be, in my opinion, the reasons are quite different. CBS has been performing poorly in this ranking for quite some time, so it just continued drifting lower, even though its employment reports are quite clearly top-8. As for NYU Stern, their dean wrote this article: http://www.stern.nyu.edu/experience-stern/news-events/us-news-2017-rankings, and it raises certain questions about the US News ranking: is it really right to knowingly release wrong data? What if HBS or Stanford were in that situation? Are there other data points that were filled in using a flawed ‘estimation’ process?

The situation with Stern is a good example of why people shouldn’t take rankings at face value, even when they are prepared by US News and World Report. And considering that the Bloomberg BW team is not afraid of adapting and fixing flaws, I think they can take the place of the most prestigious MBA ranking if they avoid similar mistakes this year.

Bloomberg Businessweek 2015 ranking analysis

Ok, a disclaimer: I really like it! Do I agree with it? No. Do I love the fact that they actually tried to address the problems they had? Yes.

Before we start:

Bloomberg BW 2015 ranking

I’m really interested in rankings, and the fact that I scored the most popular MBA rankings on the first day of this blog should say a lot about that.

As I wrote just a few days ago, their methodology was really bad: I assigned it a score of 2/10 and recommended staying away. No more!

I think there is no point in discussing changes in individual schools’ positions, because no school in the top 25 has the same rank it had last year, and that is not because the schools changed, but because the methodology did.

Methodology

Well, this is the root cause of all the other changes: the methodology is substantially different from the old one, which was as follows:

Student survey (45%), Employer survey (45%), Research (10%)

Quoting my main points against it:

“Another ranking I don’t use. As I wrote before, research has nothing to do with the quality of a business school, and a 45% weight on graduating students? Seems like there might be some bias going on.

Any school that invests more time and money into ‘experience’ will place higher than rigorous programs; that might explain how Duke Fuqua came ahead of the M7 + Tuck. It’s important to remember that students who get into Fuqua differ from those at HBS in terms of expectations, hence it is easier to satisfy them, while HBS students might consider getting into MBB rather than PE/VC a failure (many of them come from MBB already).”

That didn’t even touch on every problem with the methodology, but it correctly reflected my overall unwillingness to spend time using, discussing, or thinking about the Bloomberg BW ranking.

The new one is much better: it incorporates the best points from other methodologies while staying true to BW’s survey-based nature:

Employer Survey (35%), Alumni Survey (30%), Student Survey (15%), Job Placement Rate (10%), Starting Salary (10%).

The Employer Survey (35%) decreased from 50%, but the approach, it seems, did not change. What I think is missing is an ‘employer rating’! The employers themselves should probably be ranked, as some are more demanding than others: an MBB firm has higher expectations than a regional advisory shop. At the former you compete with the cream of the crop, while the latter is more diverse in terms of employee ability, so it’s easier to shine there. Maybe students should be asked to rank their employer among its direct competitors? Some thought should be given to this; there are a lot of opportunities for improvement.

The Student Survey went from 45% to 15% and, as expected, Duke fell and HBS rose – they actually swapped places, from 1st to 8th and vice versa. Funnily enough, these are the exact two schools I discussed on my blog a few days ago!

They added an Alumni Survey and gave it considerable weight. After 2 happy years at their school, people lose the euphoria and are better able to assess the role the school has played in their professional lives. Still, different people have different expectations: Chicago is placed 29th in it, and I’d like to know exactly why, to better understand how the Alumni Survey works.

The introduction of a Job Placement Rate. OK, this one I don’t like much. They say: “We define job placement rate as the percentage of graduates who secured full-time employment within three months of graduation, out of all graduates who sought it.” That’s a great approach, but let’s apply a common-sense test and compare the ultra-elite HBS (ranked 35th on this metric) with a 2nd-3rd tier school from the same region, BU Questrom (31st). So how come? I don’t know exactly. Maybe HBS graduates, even the ones seeking a job, after failing to find the high-level position they expected, choose to continue their searches, try to start a business, or just take some time off, because they can afford to? I don’t know, but it seems unlikely that HBS students are less employable than Questrom students (no offence meant), so a direct comparison of the two is wrong and some adjustment should be made – probably similar to what I proposed in the Employer Survey discussion above.
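The quoted definition boils down to a single division. A quick sketch, with invented class numbers, just to make the denominator explicit (it is job seekers, not the whole class):

```python
# Bloomberg's job placement rate: graduates with full-time employment within
# three months of graduation, divided by all graduates who sought a job.
# The class figures below are hypothetical.
def placement_rate(employed_within_3_months: int, sought_employment: int) -> float:
    return employed_within_3_months / sought_employment

# e.g. a class where 370 of 400 job seekers accepted offers within 3 months
print(f"{placement_rate(370, 400):.1%}")  # prints 92.5%
```

Note that graduates who did not seek employment (company sponsorship, starting a business) are excluded from the denominator entirely, which is exactly why the HBS comparison above is tricky: the metric says nothing about the quality of the jobs accepted.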

Starting salary is another comparison point introduced this year, and honestly it’s great: they have taken regional and industry variance into account and adjusted properly (at least they say so). Actually, now that I think about it, this in combination with the Alumni Survey can partially act as the ‘employer rating’ I proposed above! (See ‘Adjusted’ below.)

Research was dropped – a brilliant idea that needs no further discussion.

The final ranking is calculated as a weighted sum divided by the top school’s score, so the top school gets 100% and every other school is compared against it. Interestingly, I recalculated and got different results; they probably went deeper than top-level rank analysis (which is great, but I wish they shared their calculations). Still, I like mine better:

School                   BW 2015   Recalculated   Adjusted*
Harvard                  1         1              1
Chicago (Booth)          2         5              11**
Northwestern (Kellogg)   3         2              3
MIT (Sloan)              4         4              5
Pennsylvania (Wharton)   5         7              7
Columbia                 6         8              9
Stanford                 7         3              2
Duke (Fuqua)             8         9              6
UC Berkeley (Haas)       9         6              4
Michigan (Ross)          10        14             16**
Yale                     11        12             10
Virginia (Darden)        12        10             14*
UCLA (Anderson)          13        13             12
Dartmouth (Tuck)         14        11             8
Emory (Goizueta)         15        15             15

*Using the Alumni Survey and Starting Salary, I adjusted the Employer Survey report (the higher the salary, the higher the expectations) and the Job Placement report (the higher the alumni survey and salary, the more prestigious the jobs).

**These schools placed lower due to low Alumni Survey rankings.

But that’s just a fun exercise – don’t put much thought into my recalculations, as I don’t have the actual survey results they used, and it seems the Alumni Survey results need to be reassessed. Anything that has ‘survey’ in it should be evaluated very closely, and I hope they will do just that.
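My recalculation follows the normalization Bloomberg describes: a weighted sum of the five components, scaled so the top school gets 100%. A sketch with invented component scores (the real survey results are not public, so the school names and values below are placeholders):

```python
# Weighted-sum scoring with the published 2015 weights, normalized so the
# best school scores 100. Component values are invented for illustration.
weights = {
    "employer_survey": 0.35,
    "alumni_survey": 0.30,
    "student_survey": 0.15,
    "job_placement": 0.10,
    "starting_salary": 0.10,
}

schools = {
    "School A": {"employer_survey": 95, "alumni_survey": 90,
                 "student_survey": 85, "job_placement": 92,
                 "starting_salary": 97},
    "School B": {"employer_survey": 88, "alumni_survey": 93,
                 "student_survey": 90, "job_placement": 89,
                 "starting_salary": 90},
}

# Raw score: sum of (weight x component) for each school.
raw = {name: sum(weights[k] * comps[k] for k in weights)
       for name, comps in schools.items()}

# Normalize against the best raw score, so the top school gets 100.
top = max(raw.values())
index = {name: 100 * score / top for name, score in raw.items()}

for name in sorted(index, key=index.get, reverse=True):
    print(f"{name}: {index[name]:.1f}")
```

With only the top-level index values published, this is necessarily a rougher reconstruction than whatever Bloomberg actually ran, which is probably why my recalculated column differs from theirs.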

What I really like is that they actually update and improve their methodology and are not afraid of that.

Anyway, rankings are just that – rankings: consider them in your initial research, but never make a final decision based solely on them.

Useful links:

Old methodology in detail: link

New methodology in detail: link