Fall semester usually brings attention to college rankings. Most years, this happens in conjunction with the release of the annual “Best Colleges” issue of U.S. News & World Report. This year, the recent revelation that administrators at Emory University had misreported their institutional data for years (presumably to improve the university’s national rankings) has also drawn media interest.
Until a few years ago, USN&WR’s rankings were a complete non-event for Southern Polytechnic. SPSU was grouped in the category of “schools with a specialty,” along with about 15 other colleges and universities that emphasized engineering and technology. The schools were unranked, and although being listed with institutions like Colorado School of Mines and Cooper Union meant SPSU was in good company, the grouping didn’t provide much useful comparative information.
In 2007, SPSU moved out of the specialty category and onto the list of “Best Universities – Master’s – South.” We were grouped in the third tier, which was unranked. Through what was apparently an erroneous report, SPSU was also listed that year as one of the schools whose students graduated with the most debt – a category in which we neither want to be ranked nor actually belong.
In the 2011 rankings (published in August 2010), SPSU moved into the “top 100” regional universities in the south, with a ranking of 81. As I noted at the time, some of that improvement was a result of a change in how USN&WR calculated the rankings, but we all agreed that we’d rather be ranked in the top 100 schools than grouped in the third tier.
The 2013 rankings were released this past week. SPSU was listed at number 80 for master's-degree-granting institutions in the south.
A particularly controversial aspect of the ranking system is the “peer assessment score,” a reputational metric that purports to represent how “good” the listed institutions are, as rated by presidents, chief academic officers, and admissions directors at the other schools in the same category. USN&WR justifies this category – which accounts for 25% of the overall scoring – by saying that it “allows top academics – presidents, provosts, and deans of admission – to account for intangibles at peer institutions such as faculty dedication to teaching.” In practice, this component has fueled an escalation of glossy publications mailed to the people who fill out the survey, all in an effort to burnish institutional reputations. (Note: SPSU does not participate in this activity.)
The president of my undergraduate alma mater, Dickinson College, has been an outspoken critic of college rankings, particularly the reputational scoring. He recently commented, “It is a dubious instrument and just plain silly. Lots of colleges rate themselves supreme and put everyone else in the bottom. It’s a ridiculous process.” USN&WR admits this happens; the “2013 Best Colleges” edition refers to this phenomenon as “strategic voting,” and the methodology attempts to offset the effect in the ranking process. Dickinson has not participated in the peer assessment for a number of years, but USN&WR still includes the college (ranked 46 under “national liberal arts colleges” for 2013).
So what do the 2013 rankings tell us? For SPSU, the ranking itself isn’t the important aspect; it’s what we can learn from the data. Overall, SPSU looks reasonably competitive in most of the major categories. Our peer assessment score is in the middle of the pack, our average freshman retention is fair, and the large number of small classes and small number of large classes help our scores. The SAT scores for entering freshmen and the percentage of freshmen in the top 25% of their high school classes also contribute positively. The acceptance rate makes us look less selective than other schools, but this is a tricky measure; colleges that clearly and effectively communicate their admissions requirements are likely to have a higher acceptance rate because students make more informed choices when they choose to apply.
There are two areas in which SPSU’s numbers pull us lower in the rankings than we might otherwise be. These deserve our attention – not because of the rankings, but because they are important measures of our effectiveness as an institution of higher education.
One of these is the average graduation rate of students. SPSU’s six-year graduation rate of 32% puts us within two schools of the bottom among all the regional universities in the south that are ranked by USN&WR. (Graduation rate accounts for 20% of the total ranking.) We already know this is a serious issue, and we have a number of plans in place to improve this number, involving faculty and staff across the university. We need to focus on it because graduation rate is an important measure of student success. We are not focusing on it because of the rankings, but the numbers behind the rankings highlight the problem.
The other category in which SPSU is at the bottom of the list is “Average alumni giving rate.” These rates are defined as “the average percentage of living alumni with bachelor’s degrees who gave to their school during 2009-10 and 2010-11, which is an indirect measure of student satisfaction.” Among the 100 universities in our region, the reported alumni giving rates ranged from 30% down to 1%. SPSU’s rate was 2%. If we have 20,000 living alumni, that means just 400 graduates (who earned bachelor’s degrees) contributed to SPSU. Can that be right? Why aren’t the alumni supporting the university more? Are we even reporting the data correctly? These are important questions to answer – and not because of the rankings. It’s about meeting the needs of our students and alumni.
USN&WR also posted some web-exclusive rankings for its 2013 edition of “Best Colleges.” SPSU was listed #6 in the category of ethnic diversity among regional universities in the south. This validates our sense that we do, indeed, have a diverse community. We value that diversity, and we’ll continue to support and develop it.
We are not going to fixate on the rankings, nor are we going to use them in headlines. (“We’re Number 80!” just doesn’t have the sort of compelling ring I’m looking for.) But we are going to learn from the data and figure out how to do a better job of serving our students and fulfilling our academic mission. Increasing our graduation rate and our alumni giving are worthy goals in their own right.