
Last week the Cal State Northridge Daily Sundial ran an article on student drinking habits that claimed that American first-year students “spend more time drinking than studying.” Their source for this claim was a deeply flawed report produced by a company that markets anti-alcohol programs to college campuses.

As we reported last month, the study in question was little more than a marketing handout for Outside the Classroom, a for-profit company that produces anti-drinking programming for use by student affairs administrators.

The study received quite a lot of attention on its release, in large part because it was presented at the annual meeting of NASPA, a professional association for professionals in the student affairs field. What received much less attention was the fact that Outside the Classroom is a major corporate sponsor of NASPA, and paid for time at the group’s annual meeting.

And the problems with the study don’t end with its sponsorship. Its methodology is questionable and its most often repeated conclusions are not supported by the evidence it offers.

In short, the Outside the Classroom “study” is shoddy, anti-student research from a company with a financial interest in portraying students as problem drinkers. Disseminating it doesn’t bring us any closer to actually understanding student drinking habits, healthy or unhealthy.

The students of the University of Arkansas voted this week to urge the U of A to adopt “sanctions for the possession and use of marijuana … no greater than those imposed by the University for the possession and use of alcohol.”

The university’s dean of students inserted himself into the referendum debate a week ago, sending a mass email to students arguing that “individuals choosing to possess and/or use marijuana merit different educational sanctions from those who violate the alcohol policy.” All of the candidates running for student government president and vice president endorsed the referendum, however, and it passed by a two-to-one margin.

Update: These referenda have been put on the ballot by NORML chapters at a bunch of campuses — one passed at Purdue just last week.

Students at Anderson University in Indiana aren’t allowed to drink. Not even off campus. Not even if they’re twenty-one. Not even if they’re twenty-one and off campus.

So the day before yesterday a few of them staged an act of civil disobedience.

About twenty-five students left morning chapel services on Tuesday and walked as a group to Kroakerheads, a bar about a mile from campus. (They arrived there at about 10:30, half an hour before Kroakerheads usually opens, but they’d called ahead and asked the staff to open early.)

They entered the bar. Some ordered beers, some ordered sodas, some didn’t order anything. All were in violation of Anderson student regulations, however — the rules bar not just drinking, but also being in the presence of others who are drinking.

The protest was staged by a student group called Students for a Democratic AU. One protest organizer, Caleb Fletcher, said it was not merely about the alcohol policy, but also “how the student body, as part of the institution, has been left out of policy decisions and the decision-making process.”

According to the Anderson student handbook, disciplinary sanctions for first offenses related to drinking include probation, medical evaluation, notification of parents, and "educational assignment/follow-up treatment." Sanctions for second offenses include all of the above plus a fine and loss of privileges, with suspension or expulsion for third offenses.

An Anderson security employee observed and photographed the protest, and a university spokesman told the Associated Press that the university would follow its standard disciplinary process in dealing with the students who participated.

Anderson’s student government held a forum on the alcohol policies last night, and about two hundred of the university’s 2700 students attended. At the forum, Anderson’s president, James Edwards, defended the regulations, noting that they have been in place since the university opened in 1917.

Edwards did open the door a crack to a relaxing of the rules, saying that there may eventually be changes regarding "how the community and our expectations are enforced." Others noted that other restrictions on social activities at Anderson have recently been lifted, including bans on playing cards and holding hands.

The university’s ban on dancing was lifted in 2007.

An interesting article from the Kansas City Star on what colleges tell (and don’t tell) families about students’ underage drinking violations.

The Family Educational Rights and Privacy Act (FERPA) limits what universities can do with information about students, but it gives campuses broad discretion in some areas. The Star explores the question of what universities do, and should, tell students’ families when a student violates drinking rules.

Wednesday’s edition of USA Today reported the results of a study released by NASPA, a national association of student personnel administrators, that claimed to show that — in the paper’s words — “College Freshmen Study Booze More Than Books.”

It’s a story with understandable appeal, one that has garnered attention on radio, on television, and on many, many blogs.

Despite its academic trappings, however, the study was conducted by Outside the Classroom (OTC), a for-profit company that offers online alcohol education programs to colleges, and its methods and conclusions don’t stand up to close scrutiny. This isn’t serious academic work; it’s a thinly-disguised ad for Outside the Classroom’s products.

Yesterday we looked at OTC and NASPA to answer the question of who produced the report and why, and today we’re going to take a close look at some of the flaws in the study itself.

Fourteen of them, to be exact.

1. The study isn’t scholarly research.

The documentation Outside the Classroom has released on this study consists of a two-page press release, a one-page writeup of methodology and findings, and an anemic page and a half of notes and references. It makes no attempt to establish statistical significance for its findings, and provides no defense of its methodology, no serious literature review, no text of the questions asked. This is a marketing device, not an academic document.

2. Its title misrepresents its conclusions.

The title of the report is “College Students Spend More Time Drinking Than Studying,” but the researchers found that only 34% of students surveyed did so, and concluded that in an average week the average student spends more time studying than drinking.

The group of students who, the researchers found, spent more total time drinking than studying consisted of those first-years who reported having had at least one drink in the past two weeks. And even among those students, the study found that a majority spent more time studying than drinking.

3. A questionable definition of “students who used alcohol.”

The OTC survey asked students whether they had consumed any alcohol in the last year, and then asked those students to specify how many drinks they had consumed each day in the last two weeks. Students who said they hadn’t drunk in the previous year were thus excluded from the study at the outset.

When compiling their statistics, however, OTC defined “drinkers” as only those students who had consumed alcohol in the previous two weeks. Occasional drinkers — students who had consumed alcohol in the past year, but not in the past two weeks — were dropped from the study as non-drinkers, and thus excluded from averages of how much time students spend drinking. 

4. Overheated descriptions of time spent “drinking.”

The report refers to time spent drinking as time spent “downing alcohol,” “consuming alcohol,” and so on. If a student spends an hour and a half having dinner with friends and has two glasses of wine along the way, can we really characterize that as 90 minutes spent “downing alcohol”?

5. Inexplicably small sample size.

Outside the Classroom used students who took an online survey through its AlcoholEdu program as its research subjects. OTC’s website claims that more than half a million students on more than five hundred American campuses use AlcoholEdu every year, but the study incorporates a sample of just 30,183 students.

That’s just six percent of the total, and less than a quarter of the first-years one would expect to find in the pool. What happened to the rest of the data?

6. Questionable sample selection.

The title of the report describes it as a study of “college students,” and a quote from OTC’s founder, Brandon Busteed, claims it demonstrates that many students are “drinking their way through college.” So why is the sample limited to students in the first semester of freshman year?

Are the drinking habits of first-semester students representative of those of the student body as a whole? If not, why portray them that way? What motivated the decision to exclude all students other than first-semester first-years from the study?

7. Questionable methodology for computing time spent drinking.

Rather than asking students how much time they had spent drinking over the past two weeks, OTC chose to ask each of them about the length of just one of their drinking sessions, and calculate totals on that basis. This is a defensible approach, as students may be more able to accurately estimate the length of a single event than a cumulative figure. But the way OTC performed their calculations is really really weird.

Here’s how they did it:

They asked each student how many drinks he or she had consumed in each of the previous fourteen days. Then they selected the day on which the student had consumed the most drinks, and asked how long that drinking session had lasted. Then they averaged all those results from all students for each number of drinks, compiling an average length of a one-drink drinking session, a two-drink session, and so on. Then they applied those numbers to each student’s report of each of his or her drinking sessions. 

If a student reported having one drink on one day, two drinks on another, and five drinks on a third, in other words, OTC estimated the length of the one-drink session on the basis of an average of the lengths of all the sessions in which students reported that one drink was the most they had consumed in a single day.
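The procedure described above can be sketched in code. This is a hypothetical illustration only: the respondent data, field names, and durations below are invented for the example, since the report does not publish its raw data or its exact computation.

```python
# Sketch of OTC's reported session-length procedure, on invented data.
from collections import defaultdict
from statistics import mean

# Each respondent: drinks consumed on each of 14 days, plus the reported
# duration (in hours) of the session on their heaviest drinking day.
respondents = [
    {"drinks_per_day": [1, 0, 2, 0, 5] + [0] * 9, "max_day_duration": 4.0},
    {"drinks_per_day": [0, 1, 0, 0, 0] + [0] * 9, "max_day_duration": 1.5},
    {"drinks_per_day": [2, 0, 0, 2, 0] + [0] * 9, "max_day_duration": 2.0},
]

# Step 1: group each respondent's heaviest-day duration by the number of
# drinks consumed that day, and average within each group.
durations_by_count = defaultdict(list)
for r in respondents:
    durations_by_count[max(r["drinks_per_day"])].append(r["max_day_duration"])
avg_duration = {n: mean(d) for n, d in durations_by_count.items()}

# Step 2: apply those averages to *every* session each respondent reported,
# even though the averages were measured only on heaviest-day sessions.
def estimated_drinking_time(r):
    return sum(avg_duration.get(n, 0) for n in r["drinks_per_day"] if n > 0)
```

Note what happens to the first respondent: the length of their one-drink session is estimated from the second respondent's heaviest day, because one drink was the most that student consumed on any day in the window.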

There is no explanation in the report as to why they collected the data in such an odd way.

8. Questionable methodology for computing time spent studying.

In 2007, OTC asked its survey respondents how much time they spent studying in the previous two weeks. But instead of asking that question again in 2008, or using the 2007 survey data for their study, or combining 2007 studying data with 2008 drinking data, they did something truly bizarre. They amalgamated survey results from an independent study, The American Freshman: National Norms for 2007, with their own 2007 data and numbers from an unpublished 2008 dissertation “using a mean estimation procedure” that they do not describe in their report.

They provide no rationale for this decision, which has the effect of rendering their data and methods completely opaque.

9. Questionable methodology for computing time spent on other pursuits.

The study’s estimates of time students spent working for pay and doing “online networking or playing video games” are likewise aggregated from multiple sources with no explicit methodology. Estimates of time spent “social networking” are taken from the unpublished dissertation mentioned above, and estimates of time spent exercising are taken from a journal article that studied the exercise habits of international students at five midwestern universities.

10. An unsubstantiated claim of primacy of drinking over other activities.

The report declares flatly that “no other activity occupies nearly as much” of a first-year student’s time “as drinking.” But as noted above, the study did not attempt to measure such activities as classroom attendance, socializing, commuting, television watching, or participation in clubs and organizations. 

11. An eyebrow-raising estimate of time spent online.

The report claims, on the basis of an unpublished 2008 dissertation, that the average American first-year spent just 2.5 hours a week on online social networking. Given that a major study conducted way back in 2002 — before MySpace, Facebook, Twitter, or the blog explosion — found that three-quarters of college students spent four or more hours online each week, we’re a little skeptical of that claim.

12. Unwarranted conclusions as to the difficulty of the college curriculum.

OTC founder Busteed is quoted as saying that the report “calls into question whether faculty are demanding enough hard work from their students.” As noted in point 8 above, however, the study presents no original research regarding the amount of time students spend studying.

13. Unwarranted extrapolation of “snapshot” drinking data to the entire first year.

The two-week snapshot of students’ drinking habits was obtained in a survey conducted during the fall semester. The report does not specify when during the semester the survey was conducted, or whether its timing was at the discretion of students or administrators. Either way, it seems highly likely that the survey would have taken place during a slow period during the semester, when students would be most likely to have time to devote to the project.

Does it really make sense to estimate an entire year’s drinking behavior on the basis of figures gathered in the academic doldrums of September or October? Does OTC have reason to believe that students drink as much during the end-of-semester push as they do in weeks when they have time to fritter away on alcohol surveys?

14. Apples-to-oranges comparison of time spent drinking to other activities.

In estimating time spent drinking, as noted above, the study considers only those students who had consumed a drink in the previous two weeks. But in calculating time spent at work, exercising, and online, the study uses an average of all students, whether they engaged in those activities or not. This produces such results as the claim that the average student spent just 2.2 hours a week in paid work.
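The distortion this mismatch produces is easy to demonstrate. The numbers below are invented for the sake of the example; they are not taken from the report.

```python
# Illustration of the averaging mismatch: drinking is averaged over
# drinkers only, while work is averaged over the whole sample.
from statistics import mean

# Weekly hours for ten hypothetical students: (drinking, paid work).
students = [(10, 0), (8, 0), (6, 5), (0, 8), (0, 6),
            (0, 4), (0, 0), (0, 0), (0, 0), (0, 0)]

drinking = [d for d, _ in students]
work = [w for _, w in students]

# Drinking averaged only over students who drank at all: 24 / 3 hours.
avg_drinking = mean(d for d in drinking if d > 0)

# Work averaged over every student, workers or not: 23 / 10 hours.
avg_work = mean(work)
```

Total hours are nearly identical across the group (24 hours of drinking, 23 of work), yet the mismatched denominators yield 8.0 hours of "drinking" against 2.3 hours of "work" — exactly the kind of headline comparison the report trades on.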

About This Blog

This blog is the work of Angus Johnston, a historian and advocate of American student organizing.
