College rankings have made it to the front page of the Express-News.
“I think this rankings business has become kind of a cottage industry,” Gates said. “It’s hard to keep these things straight.”
And the Express-News’s reporting isn’t going to help anyone keep it straight either, although the article itself hints at a possible means of doing so.
The University of Texas System is one of a few that releases results of the Collegiate Learning Assessment, said Geri Malandra, interim executive vice chancellor for academic affairs.
And what does it say? Would it have required time-consuming phone calls to the UT System and entering numbers into a spreadsheet? No! It would only have required reading the article accompanying the rankings in the Washington Monthly. The newspaper apparently had no problem publishing the more “visible” U.S. News and World Report rankings. Just in case anyone is wondering what the Washington Monthly said:
The University of Texas System, however, has made results public, and they’re surprising. The CLA tests freshmen and seniors, gauging the amount of learning students gain during their college careers. Senior scores are also compared to the scores predicted by students’ ACT or SAT results. The best Texas university by this measure isn’t the flagship, highly ranked UT-Austin campus. The biggest gains are occurring at UT-San Antonio, UT-El Paso, and UT-Permian Basin, all of which are at the bottom of the U.S. News rankings.
But according to the presidents of two Texas schools that don’t participate (or at least don’t release their results), such testing isn’t a good idea.
Brazil and Gates both warned that a one-size-fits-all test would be difficult because of the diversity in higher education.
So why wouldn’t the National Survey of Student Engagement, which the Express-News article mentions, be appropriate for all universities, including Trinity and Texas A&M? The Washington Monthly article provides some explanation.
This evaluation, called the National Survey of Student Engagement (NSSE), was launched two years later, with over 275 colleges and universities participating. As of 2006, nearly 1,000 colleges have been evaluated, each receiving a detailed statistical analysis of how well its students are being academically engaged. Housed at Indiana University and administered annually at a cost to each college of as little as $1.50 per student surveyed, NSSE not only shows colleges how well they’re performing but how they stack up against the competition–for instance, whether their school ranks above or below average among peer institutions for faculty providing prompt feedback to students about their work.
Edgerton and Pew convened the original 1998 meeting looking for an alternative to the U.S. News rankings. But after investing over $3.5 million to develop and roll out the survey they wanted NSSE to be widely used and financially self-sustaining. That meant getting a lot of institutions to both agree to participate and pay for the privilege. Many were willing, on one condition: the results would be kept in-house and away from public eyes. Institutions knew that public data would inevitably be used to rank and compare colleges. They didn’t know where the survey would put them and were worried about looking bad relative to their peers.
And apparently, with good reason.
It’s understandable that the higher-education establishment–in particular the elite, sought-after schools–would have deep qualms about giving prospective students access to NSSE results: By all indications, that data does much to undermine those schools’ claims of superiority. Though NSSE doesn’t release data about individual institutions, it does release studies based on that data. In a 2005 report, NSSE analysts found no statistically significant relationship between effective teaching practices and admissions selectivity as rated by the popular Barron’s Guide to Colleges. Like the CLA, NSSE suggests that the long-established higher education pecking order may have little to do with who educates students best.
It seems that there are even more indicators of college quality that didn’t make it into the article. I’m in the middle of reading Loren Pope’s Colleges That Change Lives, and it cites quite a few rankings based on the number or percentage of students admitted to graduate school, medical school, MBA programs, and so on. (FYI, two Texas schools appear in the book.) Unfortunately, he never references the actual rankings–might that be bad for Harvard?
In any case, I seriously doubt that the reporter actually read any of the articles, especially the ones in the Washington Monthly. Maybe education reporting is more about listing the “visible” quotes than providing actual information. No wonder no one reads the papers anymore.