Wednesday’s edition of USA Today reported the results of a study released by NASPA, a national association of student personnel administrators, that claimed to show that — in the paper’s words — “College Freshmen Study Booze More Than Books.”

It’s a story with understandable appeal, one that has garnered attention on radio, television, and many, many blogs.

Despite its academic bona fides, however, the study was conducted by Outside the Classroom (OTC), a for-profit company that offers online alcohol education programs to colleges, and its methods and conclusions don’t stand up to close scrutiny. This isn’t serious academic work; it’s a thinly disguised ad for Outside the Classroom’s products.

Yesterday we looked at OTC and NASPA to answer the question of who produced the report and why, and today we’re going to take a close look at some of the flaws in the study itself.

Fourteen of them, to be exact.

1. The study isn’t scholarly research.

The documentation Outside the Classroom has released on this study consists of a two-page press release, a one-page writeup of methodology and findings, and an anemic page and a half of notes and references. It makes no attempt to establish statistical significance for its findings, and provides no defense of its methodology, no serious literature review, no text of the questions asked. This is a marketing device, not an academic document.

2. Its title misrepresents its conclusions.

The title of the report is “College Students Spend More Time Drinking Than Studying,” but the researchers found that only 34% of the students surveyed actually did so, and concluded that in an average week the average student spends more time studying than drinking.

The group of students who, the researchers found, spent more total time drinking than studying consisted of those first-years who reported having had at least one drink in the past two weeks. And even among those students, it found that a majority spent more time studying than drinking.

3. A questionable definition of “students who used alcohol.”

The OTC survey asked students whether they had consumed any alcohol in the last year, and then asked those students to specify how many drinks they had consumed each day in the last two weeks. Students who said they hadn’t drunk in the previous year were thus excluded from the study at the outset.

When compiling their statistics, however, OTC defined “drinkers” as only those students who had consumed alcohol in the previous two weeks. Occasional drinkers — students who had consumed alcohol in the past year, but not in the past two weeks — were dropped from the study as non-drinkers, and thus excluded from averages of how much time students spend drinking. 
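The classification described above can be sketched in a few lines of code. This is a hypothetical reconstruction with made-up data, not OTC's actual pipeline; the point is only that the two-stage definition silently drops occasional drinkers from the denominator.

```python
# Hypothetical reconstruction of the classification described in the report
# (all data here is made up for illustration). Students who drank in the
# past year but not in the past two weeks are treated as non-drinkers, so
# only recent drinkers enter any "time spent drinking" average.

students = [
    {"drank_past_year": True,  "drinks_last_14_days": 6},   # counted as a drinker
    {"drank_past_year": True,  "drinks_last_14_days": 0},   # occasional drinker: dropped
    {"drank_past_year": False, "drinks_last_14_days": 0},   # excluded at the outset
]

drinkers = [s for s in students
            if s["drank_past_year"] and s["drinks_last_14_days"] > 0]

# Two of the three students drink at least occasionally, but only one
# survives into the "drinkers" group whose habits get averaged.
print(len(drinkers))
```

Averaging drinking time over this shrunken group can only push the per-drinker figure up, since the lightest drinkers are the ones most likely to have had a dry fortnight.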

4. Overheated descriptions of time spent “drinking.”

The report refers to time spent drinking as time spent “downing alcohol,” “consuming alcohol,” and so on. If a student spends an hour and a half having dinner with friends and has two glasses of wine along the way, can we really characterize that as 90 minutes spent “downing alcohol”?

5. Inexplicably small sample size.

Outside the Classroom used students who took an online survey through its AlcoholEdu program as its research subjects. OTC’s website claims that more than half a million students on more than five hundred American campuses use AlcoholEdu every year, but the study incorporates a sample of just 30,183 students.

That’s just six percent of the total, and less than a quarter of the first-years one would expect to find in the pool. What happened to the rest of the data?

6. Questionable sample selection.

The title of the report describes it as a study of “college students,” and a quote from OTC’s founder, Brandon Busteed, claims it demonstrates that many students are “drinking their way through college.” So why is the sample limited to students in the first semester of freshman year?

Are the drinking habits of first-semester students representative of those of the student body as a whole? If not, why portray them that way? What motivated the decision to exclude all students other than first-semester first-years from the study?

7. Questionable methodology for computing time spent drinking.

Rather than asking students how much time they had spent drinking over the past two weeks, OTC chose to ask each of them about the length of just one of their drinking sessions, and calculate totals on that basis. This is a defensible approach, as students may be more able to accurately estimate the length of a single event than a cumulative figure. But the way OTC performed its calculations is genuinely strange.

Here’s how they did it:

They asked each student how many drinks he or she had consumed in each of the previous fourteen days. Then they selected the day on which the student had consumed the most drinks, and asked how long that drinking session had lasted. Then they averaged all those results from all students for each number of drinks, compiling an average length of a one-drink drinking session, a two-drink session, and so on. Then they applied those numbers to each student’s report of each of his or her drinking sessions. 

If a student reported having one drink on one day, two drinks on another, and five drinks on a third, in other words, OTC estimated the length of the one-drink session on the basis of an average of the lengths of all the sessions in which students reported that one drink was the most they had consumed in a single day.
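The steps above can be sketched as a short program. This is a speculative reconstruction with invented numbers, since the report doesn't publish its code; it just makes the two-stage procedure concrete.

```python
from collections import defaultdict

# Speculative reconstruction of the estimation procedure the report
# describes: session length is asked only about each student's heaviest
# drinking day, averaged by drink count, then applied to every session.
# All data below is invented for illustration.

# Each record: (drinks per day over the survey window,
#               minutes reported for the heaviest day).
students = [
    ([1, 0, 2, 0, 5], 180),   # heaviest day: 5 drinks, 180 minutes
    ([2, 0, 0, 0, 0], 90),    # heaviest day: 2 drinks, 90 minutes
    ([1, 1, 0, 0, 0], 60),    # heaviest day: 1 drink, 60 minutes
]

# Step 1: average session length per drink count, using only each
# student's single heaviest day.
lengths_by_count = defaultdict(list)
for days, minutes in students:
    lengths_by_count[max(days)].append(minutes)
avg_length = {n: sum(v) / len(v) for n, v in lengths_by_count.items()}

# Step 2: apply those averages to every session each student reported,
# including sessions that were never directly asked about.
def estimated_minutes(days):
    return sum(avg_length.get(n, 0) for n in days if n > 0)

for days, _ in students:
    print(days, "->", estimated_minutes(days))
```

Note the oddity this exposes: the length of a one-drink session is never measured for a student whose heaviest day involved five drinks; it's imputed from students for whom one drink was the fortnight's maximum, who may be a very different population.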

There is no explanation in the report as to why they collected the data in such an odd way.

8. Questionable methodology for computing time spent studying.

In 2007, OTC asked its survey respondents how much time they spent studying in the previous two weeks. But instead of asking that question again in 2008, or using the 2007 survey data for their study, or combining 2007 studying data with 2008 drinking data, they did something truly bizarre. They amalgamated survey results from an independent study, The American Freshman: National Norms for 2007, with their own 2007 data and numbers from an unpublished 2008 dissertation “using a mean estimation procedure” that they do not describe in their report.

They provide no rationale for this decision, which has the effect of rendering their data and methods completely opaque.

9. Questionable methodology for computing time spent on other pursuits.

The study’s estimates of time students spent working for pay and doing “online networking or playing video games” are likewise aggregated from multiple sources with no explicit methodology. Estimates of time spent “social networking” are taken from the unpublished dissertation mentioned above, and estimates of time spent exercising are taken from a journal article that studied the exercise habits of international students at five midwestern universities.

10. An unsubstantiated claim of primacy of drinking over other activities.

The report declares flatly that “no other activity occupies nearly as much” of a first-year student’s time “as drinking.” But as noted above, the study did not attempt to measure such activities as classroom attendance, socializing, commuting, television watching, or participation in clubs and organizations. 

11. An eyebrow-raising estimate of time spent online.

The report claims, on the basis of an unpublished 2008 dissertation, that the average American first-year spent just 2.5 hours a week on online social networking. Given that a major study conducted way back in 2002 — before MySpace, Facebook, Twitter, or the blog explosion — found that three-quarters of college students spent four or more hours online each week, we’re a little skeptical of that claim.

12. Unwarranted conclusions as to the difficulty of the college curriculum.

OTC founder Busteed is quoted as saying that the report “calls into question whether faculty are demanding enough hard work from their students.” As noted in point 8 above, however, the study presents no original research regarding the amount of time students spend studying.

13. Unwarranted extrapolation of “snapshot” drinking data to the entire first year.

The two-week snapshot of students’ drinking habits was obtained in a survey conducted during the fall semester. The report does not specify when during the semester the survey was conducted, or whether its timing was at the discretion of students or administrators. Either way, it seems highly likely that the survey would have taken place during a slow period during the semester, when students would be most likely to have time to devote to the project.

Does it really make sense to estimate an entire year’s drinking behavior on the basis of figures gathered in the academic doldrums of September or October? Does OTC have reason to believe that students drink as much during the end-of-semester push as they do in weeks when they have time to fritter away on alcohol surveys?

14. Apples-to-oranges comparison of time spent drinking to other activities.

In estimating time spent drinking, as noted above, the study considers only those students who had consumed a drink in the previous two weeks. But in calculating time spent at work, exercising, and online, the study uses an average of all students, whether they engaged in those activities or not. This produces such results as the claim that the average student spent just 2.2 hours a week in paid work.
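A toy calculation shows how much the choice of denominator matters. The figures below are invented; only the arithmetic is the point.

```python
# Illustration of the denominator mismatch described above.
# All figures are made up; the comparison is what matters.

drinking_hours = [0, 0, 0, 10, 10]   # five students; two of them drink
work_hours     = [0, 0, 0, 0, 11]    # one holds a part-time job

# Drinking is averaged over drinkers only...
drinkers = [h for h in drinking_hours if h > 0]
avg_drinking = sum(drinkers) / len(drinkers)

# ...while work is averaged over all students, workers or not.
avg_work = sum(work_hours) / len(work_hours)

# The same five students come out looking like heavy drinkers (10.0
# hours) and barely-employed workers (2.2 hours).
print(avg_drinking, avg_work)
```

Averaged the same way — over participants only — the working student's 11 hours would exceed either drinker's 10; averaged over everyone, drinking falls to 4 hours. Mixing the two conventions in one table makes drinking look uniquely dominant.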