At Oxford University in England, admissions criteria are clear. What matters is an applicant's potential to succeed in the subject she wants to study. A student wanting to study mathematics, say, must nail the math entrance exam, and in an interview show the potential to be an outstanding mathematician. Whether or not she is a concert violinist, the first in her family to go on to higher education, or the only female applicant in mathematics is irrelevant.
Oxford students believe in this system. They feel the university is not responsible for making British society equal. But that isn't how things have worked in the United States for nearly 100 years - and perhaps not ever.
The U.S. Supreme Court ruled Monday in a college admissions case whose outcome could lead to significant changes. The lawsuit was brought by Abigail Fisher, a white student denied admission to the University of Texas, who claims she was a victim of racial discrimination. The justices ruled that before turning to affirmative-action programs, schools must show that no race-neutral alternatives would produce the desired diversity on campus.
But it's important to understand the broader context of U.S. admissions policies when considering affirmative action's future.
Although we would like to believe universities have always admitted students based on some objective definition of merit, admissions criteria have in fact changed dramatically over time. Until the 1920s, elite universities in the United States administered their own entrance exams to identify the "best" students. But even then, the exams had their biases - for example, they usually contained material such as Latin that was taught only in elite schools.
The exam system began to change after Ivy League schools became alarmed at the number of Jewish students applying and scoring well on the tests. At that point, elite schools began to shift their definitions of "merit." Columbia University led the way by introducing "character" as something to be considered in admissions decisions. This amorphous quality was said to include personality traits such as manliness and leadership ability. In order to judge character, the universities asked for photos and letters of recommendation and, in some cases, conducted interviews with applicants.
During the 1960s, this flexible understanding of merit shifted again. No longer a tool for excluding Jews, it began to be used to address the notable underrepresentation of black students on campus.
Today, selective universities in the United States consider a range of attributes beyond academics in making admissions decisions. These include an applicant's extracurricular activities, athletic prowess, hardships overcome, legacy status and race. In my research at elite American universities today, I have found that undergraduates are quite comfortable with these flexible notions of merit. They express a belief that racially diverse campuses are necessary to their training as future citizens and leaders in our globalized world. And they also support other examples of flexibility in admissions, including athletic recruiting and preferences for the children of alumni. Athletes, they argue, contribute to a fun campus life, while legacy admissions bring funds to the university that can potentially contribute to scholarships for more disadvantaged students.
University admissions policies in the United States have been highly subjective for decades. Race-based affirmative action is a part of the picture, and it symbolizes a commitment on the part of colleges and universities to the pursuit of racial justice in a country plagued by extreme racial inequality. It's hard to argue the programs are no longer necessary. Black Americans are still more than twice as likely to be poor as white Americans; black children are more likely to attend underperforming, segregated schools than white children; and whites with a criminal record are more likely to receive a callback on job applications than blacks with no criminal record.
More than 50 years ago, British sociologist Michael Young coined the term "meritocracy." He intended it to have negative connotations, referring to a dystopia in which elites use notions of merit to justify and maintain their status across generations. He portrayed a future in which promotion, pay and school admissions would reward elites for their class-based cultural know-how rather than for qualities attainable by anyone in society.
If the Supreme Court ruling in the Fisher case bans the consideration of factors that promote racial equality and justice in admissions decisions but allows universities to continue considering other kinds of non-academic "merit" that increase inequality, we will be one step closer to the kind of dysfunctional "meritocracy" Young envisioned.
Natasha Kumar Warikoo is an assistant professor at Harvard Graduate School of Education and the author of "Balancing Acts: Youth Culture in the Global City." She wrote this for the Los Angeles Times.