There’s a new book out called Academically Adrift: Limited Learning on College Campuses that got a lot of ink last week. It claims that only 55% of American college students improve their scores on a standardized test of critical thinking in their first two years of college, and that only 64% of students improve on that test during their entire time as undergrads.
I haven’t read the book yet, so I can’t speak in too much detail about its contents, but a few things leap out from the coverage.
First, there’s the matter of what the study doesn’t measure. Because it’s based solely on performance on a generalized test, it tells us nothing about what students have learned in their own fields of study, a fact that many news stories on the book have failed to mention, or buried. (That misrepresentation began with the study’s authors — Richard Arum and Josipa Roksa — who titled a Chronicle essay on their work “Are Undergraduates Actually Learning Anything?” — a question which even their own work would lead them to answer with “Yes, definitely. At least 64% of them, and probably a lot more.”)
The study’s value is also, obviously, dependent on the value of the test itself. Very little of the book’s press coverage has explored the question of what, exactly, the test measures, and what counts as improvement under the authors’ interpretation of the results.
A third issue is what pollsters call “the internals” of the study — the breakdown of how the results differ across communities. Are the least-prepared students learning the most, or the least? Which kinds of campuses, which kinds of students, which majors, are most successful? These kinds of questions are essential to making sense of the study’s findings, and with the exception of some meager ethnic data, they’re absent from the coverage I’ve seen.
It’s also important to note that the study fails to situate its findings in a historical context. The test the authors rely on was first used in 2004, so it tells us nothing about whether today’s students are learning more than previous generations, less, or about the same amount. This is particularly important given the incredibly widespread (though largely ungrounded) belief in college students’ intellectual and moral decline: Anytime anyone says “today’s college students suck,” a lot of listeners are going to hear “…compared to those who came before.”
There are other problems with the study. More than half of the students who took the test as first-years, for instance, weren’t tracked down for the follow-up. The authors’ methodology in assessing how much time students spend studying has been criticized as well. But my point here isn’t so much to criticize the study — which, again, I haven’t read — as it is to point out the perils of relying too uncritically on its representation in the media.
2 comments
January 24, 2011 at 1:08 pm
ReadyWriting
I think the other interesting aspect is how the majority of the comments left on the articles that discussed the book were basically lock-step in agreement about the assessment of our “current” undergraduates: universities are too expensive, professors are lazy/incompetent educators, kids today are dumb and unmotivated. You can make stats say anything you want; very few people pointed out that a 64% success rate is actually not a bad number, or asked the question: if 64% is bad, what is a good number?
Which brings me to my next point, in particular the comments on forums like The Chronicle or Inside Higher Education, places where one can safely assume the people visiting are in some way associated with higher education, and thus well educated themselves (and good critical thinkers, etc). “Our” reaction was exactly the same, it seems, as the larger population’s reaction (depending on whether the response came from an administrator or a professor). Why is it that there were so few sane voices, such as your own, that broke down the methodology and asked relevant questions to try to validate or invalidate the findings before issuing blanket statements on blame?
I think the authors are brilliant in how they have framed the debate (or rhetoric) around their book and their findings: they have somehow managed to reinforce both popular and professional attitudes toward the university, thus ensuring that they will sell books. I’m curious to read the book, but I haven’t given up hope on the students or on us.
January 24, 2011 at 1:22 pm
Angus Johnston
ReadyWriting:
I’ve started to dig into the study itself, and found that its findings actually contradict a lot of the knee-jerk support it’s garnered in the media. Which, to underscore your comments, doesn’t exactly speak to the intellectual rigor of the folks who are out there defending it. I’ll have more on that soon.
One thing about the academic response to this — it doesn’t surprise me at all, given the hostile, adversarial attitude many professors have toward their own students. That attitude is a major problem on the American campus today, and it’s something that gets far too little attention.