
A big victory for students’ rights: a federal judge has blocked a Pennsylvania prosecutor’s plans to file child pornography charges against three teenage girls who stored suggestive photos of themselves on their cell phones. 

Two of the three were wearing opaque bras in the photographs at issue, and the third was topless. None was engaging in sexual activity. The three were among twenty students in Pennsylvania’s Tunkhannock School District who were contacted by the prosecutor after school officials confiscated their cell phones, searched them, and found nude or revealing photos on them.

The prosecutor told the twenty students that they had a choice — they could sign up for an ongoing educational program on “what it means to be a girl in today’s society” and mandatory drug tests, or they could be charged with possession and distribution of child pornography, a felony.

Seventeen of the students signed up for the program. The other three sued. And yesterday a federal judge took their side.

The prosecutor, reached for comment yesterday, refused to say whether he would appeal the judge’s decision.

Here’s another great resource — the National Coalition Against Censorship.

We’ve linked to their blog in our sidebar, but feel free to poke around their main site, too. They’ve got lots of stuff going on, including various projects run through their Youth Free Expression Network.

Kristen Juras, the University of Montana law professor who has been campaigning to force the UM Kaimin to dump its sex advice column, appeared at a campus forum with the Kaimin‘s editor last night.

Juras called student activity fee support for the paper “government” funding, and described that funding as “a privilege.” She has in the past threatened to intervene with the university’s trustees or even the Montana state legislature to attempt to get that privilege withdrawn.

At last night’s forum, Juras said that any Kaimin sex column should be written by a “sexologist,” though she acknowledged, when pressed, that other student columns — such as those on religion — do not require such “expertise.”

Kaimin editor Bill Oram defended the column’s lighthearted tone. “We’ll stop talking casually about sex when students stop having sex casually,” he said. “We’ll stop talking about sex in a fun way when sex stops being fun.”

Juras took a less lighthearted stance. “I’m not opposed to sex,” she said at one point. “I’m happily married and it’s an important part of our relationship.”

We noted last week that University of Montana law prof Kristen Juras had called for censorship of the U of M student newspaper, saying that its sex advice column “affects my reputation as a member of the faculty.”

She was almost right. The sex advice column wasn’t having any effect on her reputation. Dozens of campus papers have such columns, and nobody holds tax law professors responsible for the content of a school’s student newspaper anyway. If she’d just tut-tutted to herself, her reputation would have been just fine.

But she didn’t, and it isn’t.

Juras’ name now appears in eight of the top ten Google hits for ” ‘University of Montana’ sex.” Most of the top hits for her name are references to this ugly story.

So Professor Juras needs help. And Patrick from Popehat (presently number four in a Google search on “Kristen Juras”) is willing to step in:

I’m gravely concerned about Professor Juras’s ignorance of First Amendment precedent such as Tinker v. Des Moines Independent Community School District, 393 U.S. 503, 89 S. Ct. 733, 21 L. Ed. 2d 731 (1969), which holds that speech by students in public schools may be infringed only on a showing that it will disrupt the orderly running of the school, or is indecent.  (Professor Juras does not make such a contention concerning Ms. Davis’s columns.)  I’m concerned that, to the extent that what Professor Juras really seeks is to have the University censor one student, she is asking for constitutionally prohibited viewpoint discrimination under the guise of sometimes permitted content discrimination.

Moreover, and this is what really concerns me, as far as Professor Juras’s reputation is concerned, I believe that any time someone writes, “I respect free speech, but…” and then goes on to ask for censorship, that person looks like an ass, a fool, and a hypocrite.

And so, in order to protect Kristen Juras’s reputation, I am asking to be appointed as an independent monitor at the University of Montana School of Law, with authority over the writings and speech of assistant professors who teach property, business, and tax, and a requirement that all such writings and speech be cleared with me, beforehand, to the extent that they touch on political or legal topics outside the subjects of property, business transactions, and tax.  (Because God, I don’t want to have to read that stuff.)

Since Kristen Juras, evidently, is unwilling to protect her own reputation, which is now that of a fool, someone else will have to do it.  For her own damned good.

He’s a giver, that Patrick.

Wednesday’s edition of USA Today reported the results of a study released by NASPA, the National Association of Student Personnel Administrators, that claimed to show that — in the paper’s words — “College Freshmen Study Booze More Than Books.”

It’s a story with understandable appeal, one that has garnered attention on radio, television, and many, many, many, many, many, many, many blogs.

Despite its apparent academic bona fides, however, the study was conducted by Outside the Classroom (OTC), a for-profit company that offers online alcohol education programs to colleges, and its methods and conclusions don’t stand up to close scrutiny. This isn’t serious academic work; it’s a thinly disguised ad for Outside the Classroom’s products.

Yesterday we looked at OTC and NASPA to answer the question of who produced the report and why, and today we’re going to take a close look at some of the flaws in the study itself.

Fourteen of them, to be exact.

1. The study isn’t scholarly research.

The documentation Outside the Classroom has released on this study consists of a two-page press release, a one-page writeup of methodology and findings, and an anemic page and a half of notes and references. It makes no attempt to establish statistical significance for its findings, and provides no defense of its methodology, no serious literature review, no text of the questions asked. This is a marketing device, not an academic document.

2. Its title misrepresents its conclusions.

The title of the report is “College Students Spend More Time Drinking Than Studying,” but the researchers found that only 34% of students surveyed did so, and concluded that in an average week the average student spends more time studying than drinking.

The group that the researchers found to have spent more total time drinking than studying consisted only of those first-years who reported having had at least one drink in the past two weeks. And even among those students, the study found that a majority spent more time studying than drinking.

3. A questionable definition of “students who used alcohol.”

The OTC survey asked students whether they had consumed any alcohol in the last year, and then asked those students to specify how many drinks they had consumed each day in the last two weeks. Students who said they hadn’t drunk in the previous year were thus excluded from the study at the outset.

When compiling their statistics, however, OTC defined “drinkers” as only those students who had consumed alcohol in the previous two weeks. Occasional drinkers — students who had consumed alcohol in the past year, but not in the past two weeks — were dropped from the study as non-drinkers, and thus excluded from averages of how much time students spend drinking. 

4. Overheated descriptions of time spent “drinking.”

The report refers to time spent drinking as time spent “downing alcohol,” “consuming alcohol,” and so on. If a student spends an hour and a half having dinner with friends and has two glasses of wine along the way, can we really characterize that as 90 minutes spent “downing alcohol”?

5. Inexplicably small sample size.

Outside the Classroom used students who took an online survey through its AlcoholEdu program as its research subjects. OTC’s website claims that more than half a million students on more than five hundred American campuses use AlcoholEdu every year, but the study incorporates a sample of just 30,183 students.

That’s just six percent of the total, and less than a quarter of the first-years one would expect to find in the pool. What happened to the rest of the data?
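For what it’s worth, here’s the back-of-the-envelope arithmetic behind those figures, in a quick Python sketch. The half-million figure is OTC’s own claim; treating first-years as roughly a quarter of AlcoholEdu’s users is our assumption for illustration, not anything stated in the report.

```python
# Sample-size arithmetic. The 500,000 figure is OTC's claimed annual
# AlcoholEdu user base; the quarter-of-the-pool estimate for first-years
# is an assumption for illustration, not a number from the report.
total_users = 500_000
sample_size = 30_183

print(sample_size / total_users)            # about 0.06, six percent of all users

estimated_first_years = total_users / 4     # rough assumption
print(sample_size / estimated_first_years)  # about 0.24, less than a quarter of them
```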

6. Questionable sample selection.

The title of the report describes it as a study of “college students,” and a quote from OTC’s founder, Brandon Busteed, claims it demonstrates that many students are “drinking their way through college.” So why is the sample limited to students in the first semester of freshman year?

Are the drinking habits of first-semester students representative of those of the student body as a whole? If not, why portray them that way? What motivated the decision to exclude all students other than first-semester first-years from the study?

7. Questionable methodology for computing time spent drinking.

Rather than asking students how much time they had spent drinking over the past two weeks, OTC chose to ask each of them about the length of just one of their drinking sessions, and calculate totals on that basis. This is a defensible approach, as students may be more able to accurately estimate the length of a single event than a cumulative figure. But the way OTC performed their calculations is really really weird.

Here’s how they did it:

They asked each student how many drinks he or she had consumed in each of the previous fourteen days. Then they selected the day on which the student had consumed the most drinks, and asked how long that drinking session had lasted. Then they averaged all those results from all students for each number of drinks, compiling an average length of a one-drink drinking session, a two-drink session, and so on. Then they applied those numbers to each student’s report of each of his or her drinking sessions. 

If a student reported having one drink on one day, two drinks on another, and five drinks on a third, in other words, OTC estimated the length of the one-drink session on the basis of an average of the lengths of all the sessions in which students reported that one drink was the most they had consumed in a single day.
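To make the procedure concrete, here’s a minimal sketch of the calculation as we read the one-page methodology writeup. OTC hasn’t published its data or code, so the records, field names, and numbers below are hypothetical.

```python
from collections import defaultdict

# Hypothetical records: drinks consumed on each of the previous fourteen
# days, plus the reported length (in hours) of the session on the day the
# student drank the most. Invented for illustration; OTC has not released
# its data or code.
students = [
    {"daily_drinks": [0, 1, 0, 2, 0, 0, 5, 0, 0, 0, 0, 0, 0, 0], "peak_session_hours": 4.0},
    {"daily_drinks": [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], "peak_session_hours": 1.5},
    {"daily_drinks": [2, 0, 0, 0, 3, 0, 0, 0, 0, 2, 0, 0, 0, 0], "peak_session_hours": 2.5},
]

# Step 1: for each drink count, average the reported lengths of the sessions
# in which that count was a student's two-week maximum.
durations_by_count = defaultdict(list)
for s in students:
    peak = max(s["daily_drinks"])
    if peak > 0:
        durations_by_count[peak].append(s["peak_session_hours"])
avg_duration = {n: sum(v) / len(v) for n, v in durations_by_count.items()}

# Step 2: apply those averages to every drinking day each student reported,
# peak day or not. (The .get default covers drink counts that happened never
# to be anyone's peak, a gap the report doesn't address.)
for s in students:
    s["est_hours_drinking"] = sum(
        avg_duration.get(n, 0) for n in s["daily_drinks"] if n > 0
    )
```

Note that under this scheme every student’s non-peak sessions are priced using the lengths of other students’ peak sessions, which is one reason the approach strikes us as odd.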

There is no explanation in the report as to why they collected the data in such an odd way.

8. Questionable methodology for computing time spent studying.

In 2007, OTC asked its survey respondents how much time they spent studying in the previous two weeks. But instead of asking that question again in 2008, or using the 2007 survey data for their study, or combining 2007 studying data with 2008 drinking data, they did something truly bizarre. They amalgamated survey results from an independent study, The American Freshman: National Norms for 2007, with their own 2007 data and numbers from an unpublished 2008 dissertation “using a mean estimation procedure” that they do not describe in their report.

They provide no rationale for this decision, which has the effect of rendering their data and methods completely opaque.

9. Questionable methodology for computing time spent on other pursuits.

The study’s estimates of time students spent working for pay and doing “online networking or playing video games” are likewise aggregated from multiple sources with no explicit methodology. Estimates of time spent “social networking” are taken from the unpublished dissertation mentioned above, and estimates of time spent exercising are taken from a journal article that studied the exercise habits of international students at five midwestern universities.

10. An unsubstantiated claim of primacy of drinking over other activities.

The report declares flatly that “no other activity occupies nearly as much” of a first-year student’s time “as drinking.” But as noted above, the study did not attempt to measure such activities as classroom attendance, socializing, commuting, television watching, or participation in clubs and organizations. 

11. An eyebrow-raising estimate of time spent online.

The report claims, on the basis of an unpublished 2008 dissertation, that the average American first-year spent just 2.5 hours a week on online social networking. Given that a major study conducted way back in 2002 — before MySpace, Facebook, Twitter, or the blog explosion — found that three-quarters of college students spent four or more hours online each week, we’re a little skeptical of that claim.

12. Unwarranted conclusions as to the difficulty of the college curriculum.

OTC founder Busteed is quoted as saying that the report “calls into question whether faculty are demanding enough hard work from their students.” As noted in point 8 above, however, the study presents no original research regarding the amount of time students spend studying.

13. Unwarranted extrapolation of “snapshot” drinking data to the entire first year.

The two-week snapshot of students’ drinking habits was obtained in a survey conducted during the fall semester. The report does not specify when in the semester the survey was conducted, or whether its timing was at the discretion of students or administrators. Either way, it seems highly likely that the survey took place during a slow stretch of the semester, when students would be most likely to have time to devote to the project.

Does it really make sense to estimate an entire year’s drinking behavior on the basis of figures gathered in the academic doldrums of September or October? Does OTC have reason to believe that students drink as much during the end-of-semester push as they do in weeks when they have time to fritter away on alcohol surveys?

14. Apples-to-oranges comparison of time spent drinking to other activities.

In estimating time spent drinking, as noted above, the study considers only those students who had consumed a drink in the previous two weeks. But in calculating time spent at work, exercising, and online, the study uses an average of all students, whether they engaged in those activities or not. This produces such results as the claim that the average student spent just 2.2 hours a week in paid work.
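To see how much that choice can matter, here’s a toy illustration with invented numbers (not OTC’s data): averaging one activity over only the students who engage in it, while averaging every other activity over the whole sample, inflates the first relative to the rest.

```python
# Ten hypothetical students: three drinkers who each spent ten hours
# drinking, four workers who each spent ten hours at a paid job.
hours_drinking = [10, 10, 10, 0, 0, 0, 0, 0, 0, 0]
hours_working  = [10, 10, 10, 10, 0, 0, 0, 0, 0, 0]

# Drinking averaged over drinkers only, as the report does:
drinkers = [h for h in hours_drinking if h > 0]
print(sum(drinkers) / len(drinkers))              # 10.0 hours

# Work averaged over all students, as the report does:
print(sum(hours_working) / len(hours_working))    # 4.0 hours

# An apples-to-apples comparison would use the same base for both,
# either participants only ...
workers = [h for h in hours_working if h > 0]
print(sum(workers) / len(workers))                # 10.0 hours
# ... or the whole sample.
print(sum(hours_drinking) / len(hours_drinking))  # 3.0 hours
```

Either consistent comparison would be fair. Mixing the two bases is part of what makes a figure like 2.2 hours a week of paid work look so small next to the drinking numbers.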

About This Blog

StudentActivism.net is the work of Angus Johnston, a historian and advocate of American student organizing.

To contact Angus, click here. For more about him, check out AngusJohnston.com.