As the admission season begins, the rankings of institutions will soon follow, with B-schools ranked most often. While some of these rankings are distinctly misleading, it would be difficult to question the integrity of players like India Today, Outlook and Businessworld.
[Table: B-school rankings in comparison]
If that is so, why is it, as the table alongside shows, that the ranking of most B-schools swings like a pendulum from one extreme to the other? Even the elite Indian School of Business (Hyderabad) suffers the swing, going from a lowly 13 in Business Today 2009 to a respectable 4 in Business India 2009. The variability grows substantially as one moves down the listing. Careers360 went behind the numbers, and this is what we found.
What goes wrong?
An old hand at opinion surveys, BT conclusively proves what is wrong with business school rankings. While we are not advocates of consistency, the kind of variability this ranking displays has to be seen to be believed: it had the largest number of odd men out when we analysed the Top 20 (see table above). Another classic example of perception playing havoc is the recent Outlook survey, which jumped onto the perception-based ranking bandwagon this year. NITIE lost 17 places and Amrita (which was in the top bunch last year) lost about 45 places, to name a few prominent losers.
The less said the better about the surveys conducted by Zee Business and Mail Today. If quotes attributed to Samir Ahluwalia, Editor, Zee Business, are to be believed, parameters like academic papers and books in the library are debatable and hence not worthy of consideration, while factors like Global Placements, International Internships or Global Faculty must be critical elements. Compare the results with the parameters they claim to consider, and you can draw your own conclusions about the integrity of their survey.
The same applies to the Mail Today survey, which grants 40% weight to perception and 60% to information provided by institutes. It conveniently ignored leading institutes like the Faculty of Management Studies (FMS), citing late submission of data. If the objective is to give readers the right advice, it fails miserably; it may as well have been termed an advertorial. No wonder institutes in the habit of lying came out on top in quite a few categories. But it may be difficult to lay all the blame on perception, since the results of surveys like that of Pagalguy.com, a well-known MBA website, are very close to the objective rankings.
What objective rankings say
The situation is slightly better when it comes to objective rankings. But even here, Business India, a longtime player in the game, gets away with creating clusters large enough that very few schools have reason to complain (its A+ cluster in 2009 has 67 schools). Such a ranking is of little use to the student, who does not study in a cluster. He studies in a school, and he needs to know how one compares with another. Or else the cluster must be small enough to fit a student's decision-making set, which is rarely more than 10 schools.
Where do the surveys go wrong?
A listing of parameters that get mangled for lack of either access or understanding.
Ranking of schools rather than programmes: A student enrols in a programme, not a school. The general PGDM at MDI attracts a very different audience from the PGDM (IM) or PGDM (HR). No ranking captures this, barring perhaps those done by coaching institutes like T.I.M.E.
Quality of students: Any institution can pay the required fee and obtain CAT scores for shortlisting candidates, so granting marks for merely using the test means very little. The cut-off percentile counts; the relative percentile within the broad clusters counts much more. The selection process, its level of transparency, the number of lists a school puts out, the number of seats and the refund policy are far more crucial.
Quality of Faculty: The number of PhDs is a good indicator of the robustness of teaching, but for a student, the number of faculty with hardcore industry experience scores any day. Most surveys let institutes get away with a bland category called 'faculty with industry and training experience'.
Infrastructure fixation: An AC classroom or a swimming pool is great, but its absence does not hurt; the absence of a residential facility does. Relative weights must therefore be used to reward what genuinely adds value, much of which is not visible.
Placements: The average salary must never be a simple average of domestic and international figures. BW, for example, slips when it calculates the all-crucial ROI: it adds everything together. Domestic and international salaries are simply not comparable unless normalised.
International placements: Sales executives, counter salesmen, workshop managers, ad-sales coordinators: these are some of the jobs we found in the placement records of some institutions, with packages of Rs 3-5 lakh per annum. Do they count just because the jobs are based abroad?
Salary structure: Another gimmick is to declare a Rs 12 lakh package wherein Rs 5 lakh is salary and Rs 7 lakh is incentive. The ratings must highlight this.
Output rating: A B-school has three kinds of output: research, corporate training and placements. They must be rated independently, then grouped under one broad parameter. Let's not even venture into a social-value debate here.
Validation of data: Until regulatory agencies make it mandatory for institutions to publish data in the public domain, and hold them accountable for lapses, it is imperative that submissions by institutions be validated by the magazine or agency conducting the survey.
Outside agency fixation: Magazines tend to hide behind the fact that an independent agency did the survey, almost giving the impression that, since it is independent, they can conveniently wash their hands of it. Readers identify with the magazine, not the agency. The parameters, differential weights, scoring and final reportage must each bear the imprint of the magazine; ultimately, ownership lies with the magazine.
Parameter definition: Infrastructure and placements are the only two parameters all surveys have in common. The rest of what goes into a B-school's value-addition process is often structured rather erratically. For instance, slotting the quality of students, which is an input, alongside placements, which is distinctly an output, reflects a certain intellectual laziness.
The foreign collaboration deal: Any institution with money can take its students on a one-week junket to a foreign college, and that college will happily oblige for a few pennies. If we count this as collaboration, we have only ourselves to blame. Another culprit is the number of MoUs: it is just another number unless fully operationalised.
Data on defaulters: Every year BusinessWeek publishes a list of institutions that did not participate. Here, barring perhaps an apologetic note about the holy trinity not participating but still being ranked, one is yet to see a list that names the defaulters.
Surveyors' quality: B-schools are not soaps or hair oil, but many research agencies treat them as such. Look at the rookies they send out as surveyors! If they cannot differentiate between, say, the number of volumes and the number of titles in a library, the quality of data they bring in is suspect.
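The salary point above is easy to demonstrate with arithmetic. The sketch below uses entirely invented figures and an assumed purchasing-power adjustment factor (none of these numbers come from any actual placement report); it only illustrates how one high international offer, converted at market exchange rates, inflates an "average package" unless it is normalised first.

```python
# Hypothetical illustration of the normalisation problem.
# All figures are invented for the example.

# Domestic offers, in lakhs of rupees per annum.
domestic = [6.0, 7.5, 8.0]

# One international offer of $60,000, converted at an assumed
# market rate of Rs 48 per dollar, expressed in lakhs.
intl_market = 60_000 * 48 / 100_000

# The raw average simply mixes the two currencies' figures,
# inflating the headline "average package".
raw_avg = (sum(domestic) + intl_market) / (len(domestic) + 1)

# An assumed purchasing-power adjustment factor of 3 (i.e. a dollar
# abroad buys roughly what Rs 16 buys at home) brings the
# international figure down to something comparable.
intl_ppp = intl_market / 3
ppp_avg = (sum(domestic) + intl_ppp) / (len(domestic) + 1)

print(f"raw average:        {raw_avg:.1f} lakh")
print(f"normalised average: {ppp_avg:.1f} lakh")
```

With these made-up numbers, the raw average works out to well above every single domestic offer, while the normalised average sits back inside the domestic range, which is exactly the distortion the surveys fail to correct.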
On paper, the Businessworld ranking has the best methodology of the lot, though it too could do with substantial improvement in its parameters and execution.
Lack of information, also a hurdle
However, it is not fair to blame the media completely. The absolute lack of transparency on the part of B-schools when it comes to sharing crucial data makes it impossible to develop a sensible methodology. Some schools behave as if placements are a national secret, and revealing the use of placement consultants is treated like selling a nuclear plan! But the answer does not lie in taking shortcuts; that would mean GIGO (garbage in, garbage out).
The way out
We are working with a team of experts to identify a way out, and we will come back to you with more details in the next issue. But as a magazine sworn to standing up for the student, we would like to know what you think. Write to us at email@example.com.
(We analysed the top players in the ranking business. Mail Today/Zee Business feature in the tables only as indicators of what is wrong. If any publication mentioned in the story identifies factual inaccuracies, it is most welcome to write to us.)