You are currently browsing the category archive for the ‘Students’ category.
Students protesting planned tuition hike at the University of Guam.
Wisconsin’s Paul Soglin — sixties student activist turned hippie politico turned three-time Madison mayor — has had it with kids today. Get off his lawn!
Georgia students protest crackdown on undocumented enrollees.
Israeli students boycott nation’s largest dairy … over high prices.
SC student government prez survives impeachment vote after admitting authorship of asinine anonymous tweets.
Student government veep who ran on anti-drunk driving platform resigns after DWI arrest.
In CEO’s report on racial disparities in UW admissions, they highlight an extremely misleading statistical concept — that of “odds ratios” — to leave the false impression that black and Latino applicants to UW are hundreds of times more likely to win acceptance than whites. They also dump more than a thousand students of color out of their applicant sample, inflating admissions percentages for blacks and Latinos by excluding weak and unqualified applicants from that pool and distorting statistics on Asians by excluding all applicants of Southeast Asian origin from their study.
In addition to all that, they engage in a variety of petty manipulations of data, as when they scale their admissions rates chart to begin at 50% rather than 0%, thus dramatically enhancing the visual impact of the graph at the expense of accuracy and readability.
Strangely missing in all this statistical sleight-of-hand is any straightforward statement of the magnitude of the supposed advantage that black and Latino applicants have over whites. At no point in the report do they compare — for instance — the chances of admission of two students, each at the midpoint of the applicant pool, one white, one black. (Neither do they directly compare the chances of admissions of students by criteria other than race under which white applicants have a structural advantage — those of legacy admits vs. non-legacies, for instance.)
At one point they inch toward such a comparison, with a chart listing the number of students of various races who were rejected despite having SAT or ACT scores and class ranks higher than those of the median black admittee. But since that chart fails to list how many students in that category were accepted from each race, it's impossible to translate it into actual comparative data.
In fact, there is only one section of their report in which they offer a direct comparison of the chances of admission of two groups of students, and it’s a comparison whose terms have been cherry-picked to provide the impression that they are hoping to leave.
In the report’s section on “Probabilities of Admission” they provide a chart comparing the chances of admission for groups of white, black, Latino, and Asian students — one chart each for in-state and out-of-state applicants. So far so good.
But each chart compares only a small sliver of the actual applicant pool. Beyond the exclusions I mentioned in previous posts, these charts leave out female applicants, who represent well over half of total applicants. They leave out the substantial fraction who took the SAT rather than the ACT. They leave out all legacies, a mostly white group with significant advantages in the admissions process. And as in the previous chart they set the bar for comparison at the median ACT score for black admittees.
There’s a basic principle in statistics that the farther away from the middle you get, the weirder your numbers are going to turn out. If you compare the chances of two students near the middle of the pack, you’re going to get stats on their odds of admission that reflect the fact that they’re similarly situated. But if you go looking for outliers, things start to get wacky.
To understand how this works, let’s do a thought experiment. Imagine that only one student whose first and last names both begin with the letter Z was admitted to Wisconsin in a particular year, and that this student happened, by chance, to have the second-worst grades and test scores of the entire entering class. Of all those students whose numbers were worse, only one was admitted, while 2000 were turned down. And among those 2000, by coincidence, there was a second student with a ZZ name.
Among ZZ-named students with grades and test scores as bad as or worse than our admittee, then, one out of two was admitted, giving that group odds of admission of one in two, or 50%. Among non-ZZ students with similar grades and test scores, only one in 2000 was admitted, giving admission odds of 0.05%. ZZ-named students at that grade/score level, in other words, were one thousand times more likely to be admitted than non-ZZs.
And what does this tell us? Pretty much nothing. If that ZZ student happened to be 100th from the bottom rather than second, the exact same formula would show that ZZs had odds twenty times better than non-ZZs, instead of a thousand times better. One-hundredth from the bottom and second are damn near identical in terms of actual numbers, but we’re so far out on the statistical distribution tail that even a slight change in real-world data produces huge swings in the reported odds.
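The arithmetic behind the thought experiment takes only a few lines. (This is a sketch using the hypothetical counts from above; for the second scenario I'm assuming, as one plausible reading, that about 50 of the 2,000 weaker non-ZZ applicants were admitted, which is what yields the twenty-fold figure.)

```python
def rate(admitted, applicants):
    """Admission rate as a fraction of the applicant pool."""
    return admitted / applicants

# Scenario 1: the lone ZZ admittee is second-worst in the entering class.
zz = rate(1, 2)          # 1 of the 2 ZZ applicants at or below that level: 50%
non_zz = rate(1, 2000)   # 1 of 2,000 comparable non-ZZ applicants: 0.05%
print(round(zz / non_zz))    # 1000 -- "a thousand times more likely"

# Scenario 2: the same ZZ student sits 100th from the bottom instead, so
# (say) 50 of the 2,000 comparable non-ZZ applicants were admitted.
non_zz_2 = rate(50, 2000)    # 2.5%
print(round(zz / non_zz_2))  # 20 -- a tiny change in rank, a 50-fold swing in the ratio
```

A near-identical applicant pool, and the "ZZ advantage" drops from a thousand-fold to twenty-fold. That's what happens in the tails.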
The folks at CEO understand this. They understand that because the vast majority of UW’s applicants are white, and because black applicants tend to have somewhat lower test scores, choosing the black admittees’ median as your starting point will produce more dramatic contrasts than using the median of all applicants. They also understand that the smaller you make the pool, the more random variation you get. And so they made the pool small and unrepresentative.
To be clear, I don’t know what the numbers would look like if CEO were to crunch the data in a useful way. I don’t know how many times more likely to gain admission a black or Latino applicant with an application at the middle of the total pool would be than a white student with identical numbers. I suspect that such a student would have a considerable advantage.
But here’s the thing. CEO does know the answer to this question. They do have the data. They know what admissions rates look like if you compare students of different races from the middle of the pack, just as they know what the plain-language version of their misleading “odds ratio” claim would be.
They know all this stuff. They’re just choosing not to share.
Last night I posted an introduction to the Center for Equal Opportunity’s report on the use of race in assessing applicants to the University of Wisconsin at Madison, and attempted to untangle their preposterously misleading claim that black students have a 576-to-1 advantage in admissions. This morning I’ll be tackling CEO’s claims as to the rates at which student applicants of various racial groups are admitted to UW.
The CEO study’s author, Althea Nagai, claims on the report’s first page that in 2007 and 2008 “UW admitted more than 7 out of every 10 black applicants, and more than 8 out of 10 Hispanics, versus roughly 6 in 10 Asians and whites.” The university’s own public records for those years, however, show admissions rates of just 42% and 55% for black and Latino students, respectively, compared to a 55% rate of acceptance for white applicants and a 56% rate for Asians. According to UW, in other words, white, Asian, and Latino students were accepted at almost exactly the same rates in the period covered by the CEO report, while black students’ admission rates were considerably lower.
The CEO report does not attempt to explain this dramatic discrepancy. It does, however, provide a hint in a footnote, when it gives its raw numbers for total UW applicants and admissions in 2007 and 2008. By CEO’s reckoning, there were 38,476 applicants and 23,769 admittees to UW in those two years, while according to the university, 50,348 students applied for admission in 2007-08, of whom 27,415 were admitted.
CEO’s data set, in other words, is missing some 11,872 students, of whom only 3,646 were admitted. Nearly twelve thousand students — almost a quarter of the total applicant pool — were left out of CEO’s calculations. Why?
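The gap between the two sets of figures is simple arithmetic, using only the totals quoted above:

```python
# UW's public figures vs. CEO's reported totals for 2007 and 2008 combined.
uw_applicants, uw_admits = 50_348, 27_415
ceo_applicants, ceo_admits = 38_476, 23_769

missing_applicants = uw_applicants - ceo_applicants
missing_admits = uw_admits - ceo_admits
print(missing_applicants, missing_admits)               # 11872 3646
print(round(100 * missing_applicants / uw_applicants))  # 24 -- almost a quarter of the pool
```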
To start with, CEO’s sample set excludes “those cases for which race or ethnicity is listed as ‘Other,’ missing, or unknown,” as well as Southeast Asians and “American Indians and Native Hawaiians.” (A more appropriate approach would have been to combine these students into their own category, in order to preserve the integrity of the study’s data set, but we’ll let that pass.)
Aggregating UW’s figures for white, black, Latino, and “Other Asian” applicants for 2007 gives a total of 21,778, of whom 12,261 were admitted. CEO’s sample consists of 19,345 applicants, of whom 12,219 were admitted. That’s a further omission of 2,433 applicants, of whom only 42 were admitted.
CEO doesn’t provide a breakdown by race of the data pool it used, so it’s impossible to say with absolute precision what numbers they used. But it is possible to reverse-engineer their dataset by applying the percentage totals they arrived at for each race to the aggregate numbers they provided. When you do that, this is what you get:
Of 854 black applicants, CEO’s sample included approximately 503. Of 769 Latino applicants, they included approximately 600. Of 2,038 non-Southeast Asians, they included approximately 1,528. And of 18,117 whites, they included approximately 16,714.
What does this mean? It means that about 11% of UW applicants who fit CEO’s racial parameters were left out of their study’s sample, with those exclusions coming disproportionately from student of color applicant pools. Only 8% of white applicants were left out of the CEO report, as compared to 22% of Latinos, 25% of non-Southeast Asians, and a staggering 41% of blacks.
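Those exclusion rates follow directly from the counts above. (The CEO-sample figures here are the approximate, reverse-engineered ones, so the percentages are approximate too.)

```python
# UW applicant counts vs. CEO's (reverse-engineered, approximate) sample, by group.
uw_pool  = {"black": 854, "latino": 769, "non-SE asian": 2_038, "white": 18_117}
ceo_pool = {"black": 503, "latino": 600, "non-SE asian": 1_528, "white": 16_714}

for group, total in uw_pool.items():
    excluded = total - ceo_pool[group]
    print(f"{group}: {100 * excluded / total:.0f}% excluded")
    # black: 41%, latino: 22%, non-SE asian: 25%, white: 8%

overall = 1 - sum(ceo_pool.values()) / sum(uw_pool.values())
print(f"overall: {100 * overall:.0f}% excluded")  # overall: 11%
```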
So who were these students who were excluded from the sample?
CEO says that “cases with missing academic data were dropped from the statistical analyses,” as were applicants whose inclusion “might lead to the identification of an individual.” It’s clear that most of the applicants excluded from the CEO pool were students who were not admitted to UW, presumably because they submitted incomplete or otherwise unsatisfactory applications. But whatever the reasons for their rejections, they did apply, and CEO has simply erased them from the record as if they never existed.
Another erasure is the exclusion of Southeast Asians from the category “Asians,” noted above. Nearly a dozen times on the first page of the report, CEO compares statistics for Asians with those of other racial groups without once noting that their definition of “Asian” excludes more than fifteen percent of Asian applicants and admittees to the university. (CEO mentions the distinction between Southeast Asians and other Asians only twice in the report — both times in discussions of retention rates.)
CEO claims, as noted earlier, that UW’s admissions rates for black and Latino students are dramatically higher than those for whites and Asians. That claim rests on the exclusion of more than two thousand students of color from their applicant sample, an exclusion with major implications for CEO’s analysis.
Earlier this week a group called the Center for Equal Opportunity released a 21-page analysis of undergraduate admissions data from the University of Wisconsin at Madison, charging what they call “severe discrimination based on race and ethnicity.” Wisconsin students protested at a press conference announcing the findings, while one Republican state legislator is calling for a formal investigation of the university’s selection process.
Wisconsin is already a political tinderbox, of course, and this is likely to add fuel to the fire. It’s legal under binding Supreme Court precedent to consider race as a factor in college admissions, but CEO claims that UW has gone way overboard, admitting manifestly unqualified black and Latino students ahead of more-deserving whites.
I’ve spent a good chunk of the last two days examining the CEO study, and I’ve found that it’s riddled with serious flaws. UW admissions data don’t show what CEO’s report says they do, and the group’s most dramatic claims are its most poorly sourced. CEO is, to put it plainly, misrepresenting the Wisconsin admissions process in multiple serious ways.
At the top of both the CEO study of UW admissions and their press release touting it, the group makes a breathtaking claim about black and Latino students’ chances of admission to Wisconsin’s flagship campus. “The odds ratio favoring African Americans and Hispanics over whites” in UW Madison undergraduate admissions, they say, “was 576-to-1 and 504-to-1, respectively, using the SAT and class rank while controlling for other factors.”
576-to-1. Wow. Those are some pretty steep odds. But what does the claim actually mean?
Well, let’s start with what it doesn’t mean.
It doesn’t mean that the average black applicant to the University of Wisconsin is 576 times as likely to get in as the average white applicant. Using CEO’s own numbers, the actual figure is about 1.2-to-1. (And as we’ll see in my next post, even those numbers are highly problematic: UW’s own publicly available statistics show that black applicants actually have a significantly lower admission rate than whites.)
It also doesn’t mean that a black or Latino applicant to the University of Wisconsin with grades and test scores similar to the average UW applicant has a chance 576 or 504 times greater of winning admission than a white applicant with identical test scores.
The truth is that the CEO report never actually says what its authors intend to suggest by the 576/504 figures. The statistics’ meaning, they say, “may be difficult to grasp.” The pertinent equations, they say, “are complex and hard to explain.”
So if the meaning of an odds ratio is so obscure, why use it? Why make it the centerpiece of your media campaign?
It’s a good question. And it has a simple answer:
Because any more sensible way of constructing the question wouldn’t make UW’s black and Latino students look stupid.
The odds ratio is an arcane and obscure statistical concept. (I myself misstated it in the first version of this post, as a glance at the early comments shows.) Put as simply as possible, if P is the likelihood of one thing happening and Q is the likelihood of another thing happening, then P/Q is the way most of us would express the ratio of one thing happening versus the other. If P is 95% likely, and Q is 85% likely, then P/Q is 1.12, meaning that P is 1.12 times as likely to happen as Q. That’s what most of us think of when we think of odds, and it’s what most of us think of when we think of an odds ratio.
But it’s not what the term “odds ratio” means to a statistician.
To a statistician, the odds ratio of P to Q is represented by the following equation:
[P × (1 − Q)] / [Q × (1 − P)]
To put that in slightly plainer English, the odds ratio of P to Q is P multiplied by (1 minus Q), divided by Q multiplied by (1 minus P). I am told that this is a useful concept for statisticians.
But however useful it may be for statisticians, it’s not useful for us laypeople, because it means something wholly different from what we expect it to mean. Let’s see what happens when we plug the numbers from my original example into this new formula.
(.95*.15)/(.85*.05) = 3.35
So the chances of P happening are 1.12 times greater than the chances of Q happening, but the odds ratio of P and Q is 3.35. And that gap isn’t consistent between samples — in some situations the two statistics are quite similar, while in others they’re very different. Change P to 99% while leaving Q at 85% and the relative chance of P inches up to 1.16 times the chance of Q while the odds ratio of P and Q soars to 17.47.
I want to underscore that. When P has a 99% chance of happening, and Q has an 85% chance of happening, the odds ratio of P to Q is 17.47. Obviously P isn’t seventeen times as likely to happen — P isn’t even anywhere near twice as likely to happen. (Twice as likely as 85% is 170%, and when you’re talking likelihoods, 170% is a meaningless concept.) So if I tell you that the odds ratio between P and Q is 17.47 to 1, and you’re not a statistician, you’re not going to be more informed than you were before. You’re going to be less informed. You’re going to be misinformed.
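For readers who want to check the numbers, here’s a minimal sketch of the two calculations side by side: the everyday likelihood ratio and the statistician’s odds ratio.

```python
def likelihood_ratio(p, q):
    """The everyday reading: how many times as likely is P as Q?"""
    return p / q

def odds_ratio(p, q):
    """The statistician's odds ratio: [P x (1 - Q)] / [Q x (1 - P)]."""
    return (p * (1 - q)) / (q * (1 - p))

print(round(likelihood_ratio(0.95, 0.85), 2))  # 1.12
print(round(odds_ratio(0.95, 0.85), 2))        # 3.35

print(round(likelihood_ratio(0.99, 0.85), 2))  # 1.16
print(round(odds_ratio(0.99, 0.85), 2))        # 17.47
```

The same pair of inputs that differ by a factor of 1.16 in plain-likelihood terms produces an odds ratio of 17.47. That gap is the whole problem.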
And that’s exactly what CEO is counting on.
Do a Google search for “odds ratio misleading” and you’ll find scholarly articles, journalists’ websites, statistical papers, all sorts of documents all saying the same thing — it’s scholarly malpractice to highlight odds ratios in materials intended for public consumption, because the risk of confusion is so high.
And it’s not only the public who gets confused. Look how Linda Chavez, the Chairman of CEO, summarized the group’s odds ratio findings in the Wisconsin Daily Cardinal this morning:
“The studies show that a black or Hispanic undergraduate applicant was more than 500 times likelier to be admitted to Wisconsin-Madison than a similarly qualified white or Asian applicant.”
See that? “More than 500 times likelier.” This isn’t true. It isn’t what CEO claims. An odds ratio is NOT an expression of the relative likelihood of two events. But here’s the head of CEO pretending otherwise in the student newspaper of the very university under discussion.
Ten years ago yesterday I was at the same place I was twenty years ago — on the Binghamton University campus in upstate New York. (In 1991 I was a student, in 2001 I was advising a statewide student organization.)
I woke up in Albany on the morning of September 11, and drove on empty highways to Binghamton for a scheduled meeting, listening to reports of the attacks on the radio. A few days later I wrote this summary of what I found when I arrived:
Binghamton was surprisingly subdued — much calmer than I’d seen it when the Gulf War started in January 1991. Lots more people have cable in their dorms now than did then, though, so I expect most of the students who were really worried were in their rooms by the phone.
In 1991, if you wanted to keep up with a breaking news story on a college campus, you usually had to go to the student union and gather around a communal television. In 2001, if you wanted to keep in touch with family, you needed to stay in your dorm room.
Ten years ago, twenty years ago. No Facebook, no Twitter. Today you can sit on a couch in the union surrounded by dozens of your fellow students while you hear your parents’ voices from a hundred miles away and read what your friends are doing on their couches in their unions all over the country. All at the same time. You don’t have to choose between connecting with a global experience and your local community and your far-flung networks of loved ones. You used to have to choose, but you don’t anymore.
I wrote a few weeks ago about how impoverished the Beloit College “mindset list” is, how trivial and how silly. But it’s not just in matters of educational policy and campus politics that the list missed the mark. The American campus, and the American student experience, is changing in all sorts of ways, in ways it’s easy for both students and faculty to miss.
Technology doesn’t shatter community, it transforms it.
