Ok, a disclaimer: I really like it! Do I agree with it? No. Do I love the fact that they actually tried to address the problems they had? Yes.
Before we start:
I’m really interested in rankings, and the fact that I scored the most popular MBA rankings on the first day of this blog should say a lot about that.
As I wrote just a few days ago, their methodology was really bad: I assigned it a score of 2/10 and recommended staying away. No more!
I think there is no point in discussing changes in the rankings of individual schools, because no school in the top 25 has the same rank it had last year, and that’s not because the schools changed but because the methodology did.
Well, this is the root cause of all the other changes: the new methodology is substantially different from the old one, which was as follows:
Student survey (45%), Employer survey (45%), Research (10%)
Quoting my main points against it:
“Another ranking I don’t use, as I wrote before, research has nothing to do with the quality of Business school and 45% weight to graduating students? Seems like there might be some bias going on.
Any school that invests more time and money to “experience” will get a higher place than rigorous programs, that might explain how Duke Fuqua came before M7 + Tuck. It’s important to remember that students that get into Fuqua are different from those in HBS in terms of expectations, hence it is easier to satisfy them, while HBS students might consider getting into MBB and not PE/VC a fail (many of them come from MBB already).”
That didn’t even touch on every problem with the methodology, but overall it correctly reflected my unwillingness to spend time using, discussing, or thinking about the Bloomberg BW ranking.
The new one is much better: it incorporates the best points from other methodologies while staying true to BW’s survey-based nature:
Employer Survey (35%), Alumni Survey (30%), Student Survey (15%), Job Placement Rate (10%), Starting Salary (10%).
Employer Survey (35%) decreased from 45%, but it seems the approach didn’t change. What I think is missing is an ’employer rating’! The employers themselves should probably be ranked, as some are more demanding than others: an MBB firm has higher expectations than a regional advisory. At the former you compete with the cream of the crop, while the latter is more diverse in terms of employee ability, hence it’s easier to shine there. Maybe students should be asked to rank their employer among its direct competitors? Some thought should be given to this; there are a lot of opportunities for improvement.
The Student Survey went from 45% to 15% and, as expected, Duke fell and HBS rose; actually, they swapped places, going from 1st to 8th and vice versa. Funnily enough, these are the exact two schools I discussed on this blog a few days ago!
They added an Alumni Survey and gave it considerable weight: after two happy years at their school, people lose the euphoria and are better able to assess the role the school has played in their professional lives. Still, different people have different expectations; Chicago is placed 29th, and I’d like to know exactly why, to better understand how the Alumni Survey works.
Introduction of Job Placement Rate. Ok, this one I don’t like much. They say: “We define job placement rate as the percentage of graduates who secured full-time employment within three months of graduation, out of all graduates who sought it.” That’s a great approach, but let’s apply a common-sense test and compare an ultra-elite HBS (ranked 35th on this metric) with a 2nd- or 3rd-tier school from the same region, BU Questrom (31st). So, how come? Well, I don’t know exactly. Maybe HBS graduates, even the ones seeking a job, after failing to find the high-level position they expected, simply choose to continue their search, try to start a business, or take some time off, because they can afford it? I don’t know, but it seems unlikely that HBS students are less employable than Questrom students (no offence meant), so a direct comparison of the two is wrong and some adjustments should be made, probably similar to what I proposed in the Employer Survey discussion above.
Starting salary is another comparison point introduced this year, and honestly it’s great: they have taken regional and industry variance into account and adjusted properly (at least they say so). Actually, now that I think about it, this in combination with the Alumni Survey can partially act as the ’employer rank’ I proposed above (see ‘Adjusted’ below).
Research was dropped, a brilliant idea that needs no further discussion.
The final ranking is calculated as a weighted sum divided by the top school’s score, so the top school gets 100% and every other school is measured against it. Interestingly, I recalculated and got different results; they probably went deeper than a top-level rank analysis (and that’s great, but I wish they shared their calculations). Still, I like mine better:
*Using the Alumni Survey and Starting Salary, I adjusted the Employer Survey scores (the higher the salary, the higher the expectations) and the Job Placement scores (the higher the alumni survey and salary, the more prestigious the jobs).
**These schools placed lower due to low Alumni Survey rankings.
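To make the scoring scheme concrete, here is a minimal sketch of the weighted-sum-normalized-to-the-top-school calculation described above. The weights come from the new methodology; the component scores and school names are made up for illustration, since Bloomberg doesn’t publish the raw survey numbers:

```python
# Weights from the new Bloomberg BW methodology.
WEIGHTS = {
    "employer_survey": 0.35,
    "alumni_survey": 0.30,
    "student_survey": 0.15,
    "job_placement": 0.10,
    "starting_salary": 0.10,
}

# Hypothetical component scores on a common 0-100 scale (illustrative only).
schools = {
    "School A": {"employer_survey": 92, "alumni_survey": 88,
                 "student_survey": 75, "job_placement": 90,
                 "starting_salary": 95},
    "School B": {"employer_survey": 85, "alumni_survey": 91,
                 "student_survey": 83, "job_placement": 94,
                 "starting_salary": 80},
}

def weighted_score(components):
    """Weighted sum of a school's component scores."""
    return sum(WEIGHTS[k] * v for k, v in components.items())

raw = {name: weighted_score(c) for name, c in schools.items()}
top = max(raw.values())

# Divide by the top school's score, so the leader gets exactly 100%.
index = {name: round(100 * score / top, 1) for name, score in raw.items()}
```

With these made-up inputs, School A ends up at 100.0 and School B slightly behind it; my ‘Adjusted’ variant above would additionally rescale the employer and placement components before this step.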
But that’s just a fun exercise; don’t put much stock in my recalculations, as I don’t have the actual survey results they used, and it seems the Alumni Survey results need to be reassessed. Anything with ‘survey’ in it should be evaluated very closely, and I hope they will do just that.
What I really like is that they actually update and improve their methodology and are not afraid of that.
Anyway, I think rankings are just that: rankings. Consider them in your initial research, but never make a final decision based solely on them.
Old methodology in detail: link
New methodology in detail: link