There’s a new book out called Academically Adrift: Limited Learning on College Campuses that got a lot of ink last week. It claims that only 55% of American college students improve their scores on a standardized test of critical thinking in their first two years of college, and that only 64% of students improve on that test during their entire time as undergrads.
I haven’t read the book yet, so I can’t speak in too much detail about its contents, but a few things leap out from the coverage.
First, there’s the matter of what the study doesn’t measure. Because it’s based solely on performance on a generalized test, it tells us nothing about what students have learned in their own fields of study — a fact that many news stories on the book have failed to mention, or buried. (That misrepresentation began with the study’s authors — Richard Arum and Josipa Roksa — who titled a Chronicle essay on their work “Are Undergraduates Actually Learning Anything?” — a question which even their own work would lead them to answer with “Yes, definitely. At least 64% of them, and probably a lot more.”)
The study’s value is also, obviously, dependent on the value of the test itself. Very little of the book’s press coverage has explored the question of what, exactly, the test measures, and what counts as improvement under the authors’ interpretation of the results.
A third issue is what pollsters call “the internals” of the study — the breakdown of how the results differ across communities. Are the least-prepared students learning the most, or the least? Which kinds of campuses, which kinds of students, which majors, are most successful? These kinds of questions are essential to making sense of the study’s findings, and with the exception of some meager ethnic data, they’re absent from the coverage I’ve seen.
It’s also important to note that the study fails to situate its findings in a historical context. The test the authors rely on was first used in 2004, so it tells us nothing about whether today’s students are learning more than previous generations, less, or about the same amount. This is particularly important given the incredibly widespread (though largely ungrounded) belief in college students’ intellectual and moral decline: Anytime anyone says “today’s college students suck,” a lot of listeners are going to hear “…compared to those who came before.”
There are other problems with the study. More than half of the students who took the test as first-years, for instance, weren’t tracked down for the follow-up. The authors’ methodology in assessing how much time students spend studying has been criticized as well. But my point here isn’t so much to criticize the study — which, again, I haven’t read — as it is to point out the perils of relying too uncritically on its representation in the media.