Randolph-Macon College is a small liberal arts college of fewer than 1,200 students just north of Richmond, Virginia. The college has two programs that I find appealing. It has a First-Year Experience that goes beyond freshman seminars, and undergraduate research appears to be prominent with its Schapiro Undergraduate Research Fellowship. And for those to whom ranking is important, its ranking in US News and World Report suggests that it may be an under-rated or undiscovered educational value: it's ranked 122nd overall on the Liberal Arts Colleges list but 176th in selectivity.
January 26, 2008
January 11, 2008
From my basic list of liberal arts colleges
I love Albion’s “Prospective Student” page. It starts with “do you believe that your first 18 years of life can be accurately and completely reflected in one admissions application?” Well, of course not, which is why you should visit Albion. Like most small liberal arts colleges, its faculty gets high marks for quality and interaction. It appears to have some interesting academic programs and has a high percentage of students going on to graduate school or med school after graduation.
See the complete post at my new website www.texasedspectator.com.
November 26, 2007
Just in case anyone is wondering what I’ve been doing instead of blogging, I’ve started the college search for my sophomore son. Notice I didn’t say “assisting” him or “guiding” him; I’m the one doing it. He has absolutely no interest at this point.
So why bother? Why not just let him reach that point on his own and start looking himself? One, the way the college application process works nowadays, waiting may close off opportunities you find out about too late.
Two, after spending a lot of time on a homeschool-to-college Yahoo group and wondering why everyone else’s kids seem to care and mine doesn’t, someone pointed out that there seem to be more boys in the “don’t care” category. The group consensus was that boys generally take longer to mature and this is one area in which it shows. (I hope, anyway.)
Three, if I go from the premise that he’s not really a self-starter, then I had better find a college where he won’t get lost in the crowd.
Four, we aren’t going to qualify for any need-based aid, and while my son isn’t a self-starter, he’s smart enough to qualify for some merit aid somewhere. I just need to figure out where.
So how do you start looking? I’ve read Colleges That Change Lives and looked at the Princeton Review top 20 lists, and that’s a start. But there are over 1,500 schools out there, and that only scratches the surface.
I’ll tell you my current, evolving method. I start out at http://www.collegeresults.org. I do an institutional search for the following:
Size: 750-2500 (I think he’s going to need to be in a small school where people care if he shows up to class)
Student Related Expenditures per FTE: greater than $15,000. The most spent per student at any state school in Texas is less than $12,000. There are only 25 public schools nationally that spend more than $15,000 per student. There are 290 private schools that do. I figure if I’m going to shell out tuition for a private school, I want to see some of the money spent on the students whether on student organizations, dorms, athletic facilities (the one thing he does care about) or classrooms.
Graduation Rate: Ideally, it should be over 70%. However, I’m currently working with a search between 50% and 70%, since a lower rate generally lowers the requirements for qualifying for full-tuition scholarships at the school. If he can get a scholarship and into their honors program with less than a 1200 SAT, I’ll take the chance.
After I generate my list, I then plug each school into the Princeton Review’s website for more info on its acceptance rate, percentage of students living on campus, percentage going to graduate school, and whether it has baseball and football (that’s important to my son; it may not be important to yours).
Then I hit the school’s website to look for information on its history department (if ds can’t be general manager of a pro football team, he wants to be a history professor), scholarships, and honors program.
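For anyone who wants to automate the first screening pass instead of clicking through search forms, the criteria above boil down to a simple filter. Here’s a minimal sketch, assuming you’ve exported the institutional data to a CSV; the column names (“name”, “enrollment”, “expend_per_fte”, “grad_rate_6yr”) are my own invention, not anything collegeresults.org actually produces:

```python
import csv

def matches(school):
    """Screening criteria from the post: small school, high per-student
    spending, six-year graduation rate between 50% and 70%."""
    size_ok = 750 <= int(school["enrollment"]) <= 2500
    spend_ok = float(school["expend_per_fte"]) > 15000
    grad_ok = 50 <= float(school["grad_rate_6yr"]) <= 70
    return size_ok and spend_ok and grad_ok

def shortlist(csv_path):
    """Return the names of schools that pass all three screens."""
    with open(csv_path, newline="") as f:
        return [row["name"] for row in csv.DictReader(f) if matches(row)]
```

The thresholds are exactly the ones described above, so adjusting the search (say, raising the graduation-rate floor to 70%) is a one-line change.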
And that’s how I came across Colby-Sawyer College in New Hampshire. The fact that it doesn’t have a football team is made up for by its Honors Scholarship program. An 1150 is worth a look.
Wesson Honors Scholarship Students with a 3.5 GPA and 1150 SAT (combined critical reading and math sections)/25 ACT score are eligible for the Wesson Honors Scholarship. This $12,000 scholarship includes direct admission into the Wesson Honors Program. It is renewable annually for four years of study dependent upon good academic standing in the Wesson Honors Program.
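The quoted criteria amount to a simple yes/no test, which is handy when checking a batch of scholarships at once. A sketch (the thresholds come straight from the quote; the function name is mine):

```python
def wesson_eligible(gpa, sat=None, act=None):
    """Eligibility per the quoted criteria: 3.5 GPA plus either an
    1150 SAT (critical reading + math) or a 25 ACT."""
    if gpa < 3.5:
        return False
    return (sat is not None and sat >= 1150) or (act is not None and act >= 25)
```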
I’m not sure about the history department, since it’s actually called “History, Society and Culture,” which they do a nice job of explaining. It doesn’t seem particularly strong in the Civil War but okay in modern European history. On the plus side, it has developed an “Apprentice Historian Project.”
Other notables about the school are its co-curricular transcript, Pathway program, and use of portfolios.
I’m not sure I want ds in New Hampshire; I’m not sure he would even want to go. But it’s saved to the Princeton Review profile and on my “watch” list. It seems a promising alternative for someone who is not going to make it into UT under the top 10 percent rule.
August 21, 2007
BACK TO SCHOOL….U.S. News & World Report publishes its university rankings every year, and every year people complain about them. So starting in 2005 we decided to do more than just complain, and instead came out with our own rankings — based not on reputation or endowment size, but rather on how much of a contribution each university actually makes to the country. This year’s #1 school? Texas A&M.
Maybe the fact that Texas A&M comes in first in the Washington Monthly rankings, as opposed to 62nd in the U.S. News & World Report rankings, will get the attention of some Texans. UT Austin moved up to 19th in the Washington Monthly over its 44th place in the U.S. News ranking. Take a look; it’s definitely a different way of evaluating what constitutes a “good” school.
August 17, 2007
U.S. News and World Report’s annual college rankings are out and, of course, everyone bemoans the limitations of the rankings. Yet the education establishment can console itself with the fact that there aren’t any alternatives, so it’s okay to use the rankings anyway. Even other media sources seem to reinforce this attitude:
But if the rankings are harmful, what’s the alternative? To date, there hasn’t been one, since colleges and universities haven’t devised their own quality-assessment system for the public.
Except there are alternatives. Enter “college rankings” into Google and, while the first four results all have to do with the U.S. News and World Report rankings, the fifth one is
College Rankings, Education and Social Science Library, U of I This site provides a compendium of links to online college ranking services for undergraduate and graduate programs, business schools, law schools, … http://www.library.uiuc.edu/edx/rankings.htm
which references ten college rankings. The sixth result is the Princeton Review ranking; the seventh, a website of student rankings; and the eighth is the Washington Monthly College Rankings. Why none of these is considered a viable alternative to the U.S. News and World Report rankings, I don’t know.
I really shouldn’t be surprised since about the same time last year I was complaining about the reporting concerning the value of the rankings so why would I expect things to change after a year? Oh well. If you’re interested in a different perspective on college rankings, I highly recommend the Washington Monthly College Rankings. I look forward to reading their list next month.
April 28, 2007
I’ve been spending way too much time on the College Results Online website, but it provides basic information that you won’t find on any college website. I’ve actually downloaded some of the Texas data into a spreadsheet to take a closer look at patterns and such. What can I say? My thesis was on dropout statistics.
Anyway, some interesting information from 2005.
- Number of students in public four year institutions listed on the website: 316,417
- Percentage of those students enrolled in schools with less than a 50% six year graduation rate: 63%
- Percentage of students in public four year institutions considered under-represented minorities: 35%
- Percentage of under-represented minorities enrolled in public schools with less than a 50% six year graduation rate: 80%
- Student Related Expenditures per FTE for UTSA in 2002: $5,752
- Student Related Expenditures per FTE for UTSA in 2005: $5,396
- Student Related Expenditures per FTE for UT-Austin in 2002: $9,205
- Student Related Expenditures per FTE for UT-Austin in 2005: $11,344
- Number of public schools that had a decrease in student related expenditures from 2002 to 2005: 5
- Number of public schools that had an increase in student related expenditures from 2002 to 2005: 21
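To put the per-FTE expenditure figures above in perspective, UTSA’s spending fell roughly 6% from 2002 to 2005 while UT-Austin’s rose roughly 23%. A quick check of that arithmetic, using only the numbers listed above:

```python
def pct_change(old, new):
    """Percent change from an old value to a new one."""
    return (new - old) / old * 100

# Figures from the College Results Online data quoted above.
utsa_change = pct_change(5752, 5396)     # UTSA, 2002 -> 2005: about -6.2%
austin_change = pct_change(9205, 11344)  # UT-Austin, 2002 -> 2005: about +23.2%
```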
March 11, 2007
Very interesting article.
Since then, the growth in merit-based aid at these places has outpaced that of need-based aid in an effort to attract these upper-middle-class students with higher board scores who will make a school more competitive. While some merit money is mixed with need, the trend is clear and results scandalous.
College rankings exacerbate this noxious development. Blame rankings on those odious annual lists U.S. News & World Report dreamed up to sell magazines. Otherwise sane academic leaders drank the Kool-Aid to look better.
Listen to Tufts president Lawrence Bacow, who offers zero merit dollars: “It is far from clear to me how society is better off when scarce financial aid resources are diverted from the neediest students to those who are not needy by any measure, simply to redistribute high scoring students among our institutions.”
Baum, among many, cites Washington University in St. Louis for its extensive use of merit aid: “It didn’t have to do it. That’s a choice. That’s about rankings.” (Washington U. would not give me numbers on its student aid, which, in my book, is akin to refusing a breathalyzer. Closer to home, Simmons did the same and Emerson never got back to me.)
There are people who are trying to focus more attention on the goal of a college education as opposed to its selectivity. The Washington Monthly has created its own rankings, which include a “Social Mobility” score.
And so, to put The Washington Monthly College Rankings together, we started with a different assumption about what constitutes the “best” schools. We asked ourselves: What are reasonable indicators of how much a school is benefiting the country? We came up with three: how well it performs as an engine of social mobility (ideally helping the poor to get rich rather than the very rich to get very, very rich), how well it does in fostering scientific and humanistic research, and how well it promotes an ethic of service to country. We then devised a way to measure and quantify these criteria (See “A Note on Methodology”). Finally, we placed the schools into rankings. Rankings, we admit, are never perfect, but they’re also indispensable. By devising a set of criteria different from those of other college guides, we arrived at sharply different results. Top schools sank, and medium schools rose. For instance, Pennsylvania State University, University Park, 48th on the U.S News list, takes third place on our list, while Princeton, first on the U.S. News list, takes 43rd on ours. In short, Pennsylvania State, measured on our terms–by the yardstick of fostering research, national service and social mobility–does a lot more for the country than Princeton.
If you get what you measure, would we be better off with colleges striving to make the U.S. News and World Report rankings or the Washington Monthly rankings?
August 19, 2006
College rankings have made it to the front page of the Express-News.
“I think this rankings business has become kind of a cottage industry,” Gates said. “It’s hard to keep these things straight.”
And the reporting by the Express-News isn’t going to help anyone keep it straight either, although the article itself hints at possible means.
The University of Texas System is one of a few that releases results of the Collegiate Learning Assessment, said Geri Malandra, interim executive vice chancellor for academic affairs.
And what does it say? Would it have required time-consuming phone calls to the UT System and entering numbers into a spreadsheet? No! It would only have required reading the article accompanying the rankings in the Washington Monthly. The newspaper apparently had no problem publishing the more “visible” US News and World Report rankings. Just in case anyone is wondering what the Washington Monthly said:
The University of Texas System, however, has made results public, and they’re surprising. The CLA tests freshmen and seniors, gauging the amount of learning students gain during their college careers. Senior scores are also compared to the scores predicted by students’ ACT or SAT results. The best Texas university by this measure isn’t the flagship, highly ranked UT-Austin campus. The biggest gains are occurring at UT-San Antonio, UT-El Paso, and UT-Permian Basin, all of which are at the bottom of the U.S. News rankings.
But according to the presidents of two Texas schools that don’t participate or release their results, such testing isn’t a good idea.
Brazil and Gates both warned that a one-size-fits-all test would be difficult because of the diversity in higher education.
So why wouldn’t the National Survey of Student Engagement, which the Express-News article mentions, be appropriate for all universities including Trinity and Texas A&M? The Washington Monthly article provides some explanation.
This evaluation, called the National Survey of Student Engagement (NSSE), was launched two years later, with over 275 colleges and universities participating. As of 2006, nearly 1,000 colleges have been evaluated, each receiving a detailed statistical analysis of how well its students are being academically engaged. Housed at Indiana University and administered annually at a cost to each college of as little as $1.50 per student surveyed, NSSE not only shows colleges how well they’re performing but how they stack up against the competition–for instance, whether their school ranks above or below average among peer institutions for faculty providing prompt feedback to students about their work.
Edgerton and Pew convened the original 1998 meeting looking for an alternative to the U.S. News rankings. But after investing over $3.5 million to develop and roll out the survey they wanted NSSE to be widely used and financially self-sustaining. That meant getting a lot of institutions to both agree to participate and pay for the privilege. Many were willing, on one condition: the results would be kept in-house and away from public eyes. Institutions knew that public data would inevitably be used to rank and compare colleges. They didn’t know where the survey would put them and were worried about looking bad relative to their peers.
And apparently, with good reason.
It’s understandable that the higher-education establishment–in particular the elite, sought-after schools–would have deep qualms about giving prospective students access to NSSE results: By all indications, that data does much to undermine those schools’ claims of superiority. Though NSSE doesn’t release data about individual institutions, it does release studies based on that data. In a 2005 report, NSSE analysts found no statistically significant relationship between effective teaching practices and admissions selectivity as rated by the popular Barron’s Guide to Colleges. Like the CLA, NSSE suggests that the long-established higher education pecking order may have little to do with who educates students best.
It seems that there are even more indicators of college quality that didn’t make it into the article. I’m in the middle of reading Loren Pope’s Colleges That Change Lives, and there seem to be quite a few rankings somewhere of the number or percentage of students admitted to graduate school, medical school, MBA programs, and so on. (FYI, two Texas schools appear in the book.) Unfortunately, he never references the actual rankings. Might they be bad for Harvard?
In any case, I seriously doubt the reporter actually read any of the articles, especially the ones in the Washington Monthly. Maybe education reporting is more about listing the “visible” quotes than providing actual information. No wonder no one reads the papers anymore.
August 7, 2006
With all the controversy surrounding NCLB and adequate yearly progress, it’s no wonder that no one wants to talk about accountability in college programs. And besides, everyone knows which colleges are the good ones: college reputation among other college officials accounts for 25% of the US News and World Report ranking.
But there is something to be said for measuring school effectiveness (what’s done with the measurements is another matter). The latest edition of the Washington Monthly has its second annual College Guide, and it’s worth reading.
A year ago, we decided we’d had enough of laying into U.S. News & World Report for shortcomings in its college guide. If we were so smart, maybe we should produce a college guide of our own. So we did. (We’re that smart.) We’ve produced a second guide this year–our rankings for national universities and liberal arts colleges–and it’s fair to ask: Is our guide better than that of U.S. News?