Staying on the topic of rankings, as we did last time, have you ever wished that somebody would throw all the various college rankings into a pot, stir them all up, and come out with a Grand Composite Ranking? Well, if you have, your wish may have come true.
Late last month, The Washington Post published an article by Nick Anderson entitled "Here's a new college ranking, based entirely on other college rankings." This merits some investigation, if for no other reason than to see how a composite reflects the traditional separate rankings.
Of course, this article made its way onto the College Confidential discussion forum and generated some interesting comments. Let’s take a look at these new rankings, some of Anderson’s (and his readers’) comments, and those of some CC posters. Then, you can form your own conclusions.
Anderson opens his article with the following rationale …
Higher education leaders often call college rankings misleading. The rankers, they say, insert questionable data on schools into a subjective formula and produce numbered listings of “top colleges” that have only a veneer of validity and objectivity.
And yet people read, and often heed, the rankings anyway, giving them a surprising measure of influence and authority. In theory, the rankings deal with questions that people want answered.
Students value selectivity in admissions, strong academics, and research prowess. They want a quality education at a good price that will help them earn a good salary when they leave school, without getting stuck in debt. They admire schools that work for the public good and serve as an engine of social mobility for the disadvantaged to reach the middle class.
So why not take several rankings that seek to capture all of those qualities and combine them into one?
… and then gives us his combinational formula …
Start with two lists from U.S. News and World Report: national universities and national liberal arts colleges. These rankings, which are based in part on selectivity, wealth and reputational surveys, for decades have been the most prominent in the market.
For the national universities, add:
- National university rankings from Washington Monthly, which aim to gauge their contribution to the public good.
- Rankings from the Wall Street Journal/Times Higher Education, which include a focus on outcomes such as graduation rates, salaries and student engagement.
- Rankings from Times Higher Education world university analysts, only of schools within the United States, which focus on research prowess.
- Rankings from Money and Forbes magazines, which in different ways seek to measure value and outcomes. Both include data on salaries of alumni.
Take the sum and divide by six (the number of rankings in play). Order the results from low to high. Assign them each a new rank, from 1st to 121st. Here is what you get:
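Anderson's averaging step is simple enough to sketch in code. The snippet below uses made-up rank numbers for three hypothetical schools (not real data from any of the six lists) to show the mechanics: sum a school's position across the six rankings, divide by six, then re-rank by that average.

```python
# Hypothetical rank positions for three placeholder schools in each of the
# six rankings Anderson combines. These numbers are illustrative only.
rankings = {
    "US News":            {"School A": 1, "School B": 2, "School C": 4},
    "Washington Monthly": {"School A": 3, "School B": 1, "School C": 2},
    "WSJ/THE":            {"School A": 2, "School B": 3, "School C": 1},
    "THE World (US)":     {"School A": 1, "School B": 4, "School C": 3},
    "Money":              {"School A": 2, "School B": 2, "School C": 5},
    "Forbes":             {"School A": 3, "School B": 2, "School C": 6},
}

def composite(rankings):
    schools = next(iter(rankings.values())).keys()
    # Sum each school's rank across all lists and divide by the number of lists.
    averages = {s: sum(r[s] for r in rankings.values()) / len(rankings)
                for s in schools}
    # Order the averages from low (best) to high and assign new ranks 1, 2, 3, ...
    ordered = sorted(averages, key=averages.get)
    return [(new_rank, s, averages[s])
            for new_rank, s in enumerate(ordered, start=1)]

for new_rank, school, avg in composite(rankings):
    print(new_rank, school, round(avg, 2))
```

Note that this sketch sidesteps the caveat Anderson himself raises: it assumes every school appears in every list, which is not true of the separate university and liberal-arts lists he folds together.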
… which gives us his:
Combined 2016 university ranking:
[Drumroll] … Anderson’s Top 20 (he goes down to 121) combined university rankings:
| School | State | Type | Avg. rank |
| --- | --- | --- | --- |
| Massachusetts Institute of Technology | MA | private | 3 |
| University of Pennsylvania | PA | private | 6 |
| University of California at Berkeley | CA | public | 9 |
| California Institute of Technology | CA | private | 10 |
| University of Michigan at Ann Arbor | MI | public | 11 |
| University of California at Los Angeles | CA | public | 11 |
| University of Notre Dame | IN | private | 17 |
| University of North Carolina at Chapel Hill | NC | public | 19 |
Plus, his Combined 2016 liberal arts college ranking:
| School | State | Type | Avg. rank |
| --- | --- | --- | --- |
| Washington and Lee University | VA | private | 11 |
| Claremont McKenna College | CA | private | 12 |
| College of the Holy Cross | MA | private | 16 |
Naturally, such an unusual approach comes with cautions, as Anderson notes:
There are plenty of caveats to all this. Significantly, the Washington Monthly and U.S. News rankings are composed of separate lists of universities and liberal arts colleges. The other rankings are combined into one list. So that poses an obvious challenge for comparisons. Williams College, for instance, ranks 22nd on the Journal/THE combined list, 49th on the Money combined list and 1st on the U.S. News liberal arts list.
So, what do people think of this stirred-in-the-pot approach? Here’s my own combination of comments from Post readers and CC posters:
– None of these rankings really has much meaning in my opinion. It all really depends on what college major and field of study you’re talking about. If you’re an engineering major, then only certain schools rank well. But if you’re a major in music, then other schools rank well and so forth. Which schools for a particular major have an impact when you want to apply to other universities for a masters or PhD program? A college or university can be well known but that may only apply to a few majors. That same college or university can be well known and respected for a few majors and rank very low for others.
– What was the reason to leave out West Point, Annapolis and Air Force Academy?
– I always look at these, but really, how useful are these rankings? It’s apples to oranges in so many cases, depending on what a student needs and the character of the school. Oxbridge offers small seminars with world-class experts in very specialized courses, whereas Oberlin offers an almost completely open curriculum with a wide range of choices. I suppose it doesn’t hurt if you go to a highly ranked one, but these skew the admissions process.
– There are something like 3,000 four-year colleges and universities in the USA. Anything ranking in the top 300 is probably a good school. Anything ranking in the top 100-150 is probably a great school. Most people want to know academically how a school ranks – yes, affordability and graduation rate and research and cost all matter – but most kids look for a school that matches their academic abilities and offers majors that are interesting, or at least a wide variety of options. My son got into a high-ranking school (listed above), but it did not offer either of his potential majors, so he didn’t bother. He ended up at a large, cost-effective state school that I usually see at the bottom of the top 100 on most lists, and he was able to explore ideas, major in something he loved, and get a job, all without a huge amount of debt. Win-win.
– This is a fairly minor reshuffling of the top US News national universities. Main differences: Berkeley, Michigan, and UCLA move up; UChicago, JHU and WUSTL move down. Otherwise, the overall pattern is the same. Ivies (and other highly selective, private universities) dominate the upper half of the top 50; big state flagships dominate the lower half. Within those two groups, the exact order is mostly a question of what criteria you choose and how you weight them (although HYPS usually wind up at or near the top; Berkeley and Michigan often out-rank other state universities).
– The Washington Post’s goal, presumably, was to integrate more factors into the ranking and to get away from rankings that weigh selectivity as a significant factor.
– I personally think acceptance rates of state schools are inflated. Most state schools have an obligation to their state in return for taxpayer money. OOS tuition and the lack of financial aid are a huge turn-off for OOS students. Therefore, those schools draw a smaller overall applicant pool based on price.
– It is one more tool for families looking beyond the dominance of the U.S. News rankings. The WaPo list takes into account other ranking systems, which include student outcomes and, for the universities, research. It’s interesting to see how the different formula moves some schools way up or down while others really don’t change much at all.
– There are no “ties” on this list, so of two schools with the same score (or whatever factor they use), one will arbitrarily appear to rank better than the other.
Lots of different takes. What’s yours? Post your comment below.
Check College Confidential for all of my college-related articles.