Russell Nieli is a Senior Preceptor in Princeton University's James Madison Program in American Ideals and Institutions, as well as a Lecturer in Princeton's Politics Department; [email protected]. His most recent article for Academic Questions, "Prepare the Child for the Road, Not the Road for the Child," reviewed Lukianoff and Haidt's The Coddling of the American Mind and appeared in our spring 2019 issue.
La carrière est ouverte aux talents—the career open to the talents—was an ideal popularized by Napoleon in the early decades of the nineteenth century. It meant that positions in the expanding French civil service and military would be filled by those who were most qualified to carry out the tasks required of them. In the feudal society from which France was rapidly emerging, in which hereditary position and privileges of class were often the rule, the idea was revolutionary, and had great appeal to talented, ambitious young men—men like Napoleon himself—who sought to rise up in the world through ability and hard work. In the middle of the following century the idea was given a name by the British sociologist Michael Young—"meritocracy." Unlike Napoleon, however, Young didn't think a society that became stratified on the basis of talent and such innate traits as intelligence (as ascertained by formal testing) was much better than a fixed-status, feudal order stratified by the privileges of birth. Both ran counter to his socialist and communal ideals.[1]
But in France and other European countries, including Britain, the meritocratic ideal often gave birth to the use of high-stakes selection examinations, both for admission to advanced education and for jobs in government service. A similar meritocratic system had produced the great cultural outpouring of the Tang Dynasty in China from the seventh to the tenth century of the Christian era. In the United States, however, developments in a meritocratic direction, at least in terms of government employment, were hampered by the growth of the so-called "spoils system," whereby government jobs were handed to partisan political loyalists and members of the political party that had most recently won an election. This was true at the national level and even more so in the big cities, where patronage-based political machines often dominated the landscape well into the second half of the twentieth century.
Following the assassination of President Garfield by a failed office seeker, the U.S. Congress passed the Pendleton Act in 1883, which did much to tie the distribution of government jobs to those who had passed job-specific civil service exams. Ideologically at least, the principle of merit-based selection had clearly triumphed at the national level in the decades following the Civil War, even if legal implementation was often slow to catch up.
Feudal societies dominated by titled nobilities had fixed-status orders based on hereditary classes, while caste-based societies, like those in the Jim Crow South and Hindu-dominated India, had fixed-status orders based on hierarchies of race, clan, and tribe. In all such societies social status was generally fixed at birth, and considerable pressure was placed on individuals to "keep to their place" lest those above or below them take offense. The "upstarts" suffered the malicious envy and Evil Eye of those below them, while those born to higher castes or classes were contemptuous of such nouveau riche and parvenu interlopers. It was often a wise strategy for those who had come upon good fortune in such societies to conceal their change in wealth or status, and certainly not to advertise or boast about it.[2]
While black slaves and members of the Native American tribes were not included in this ideal, the belief in the desirability of upward mobility through talent and grit was memorialized early on in America by Benjamin Franklin in his Autobiography and in Poor Richard's Almanac. "Poor boy makes good" became an energizing ideal for countless American immigrants, and both Abraham Lincoln and Frederick Douglass became leading advocates for an ethic of merit-based achievement, which foreign visitors from Tocqueville's time to the present have seen as an enduring feature of American culture. This ideal was reinforced in the later nineteenth and early twentieth centuries by the enormously popular McGuffey Readers and by Horatio Alger's novels for adolescent boys, in which poor but honest youths escape poverty through diligence and hard work and eventually rise to middle-class respectability.
Although not adopted as an American slogan, "careers open to talents" was a motto that comported well with the nineteenth-century American ideal of the "self-made man." The latter ideal, as defined by Frederick Douglass in a famous essay of that title, held that any man (women would later be included), regardless of rank or station, could rise up in the world and improve his social and economic position through his own determination, intelligence, focus, and hard work. "Self-made men," Douglass wrote,
are the men who, under peculiar difficulties and without the ordinary help of favoring circumstances, have attained knowledge, usefulness, power and position and have learned from themselves the best uses to which life can be put in this world . . . Such men as these, whether found in one position or another; whether in the college or in the factory; whether professors or plowmen; whether Caucasian or Indian; whether Anglo-Saxon or Anglo-African, are self-made men and are entitled to a certain measure of respect for their success and for proving to the world the grandest possibilities of human nature, of whatever variety of race or color . . . Every instance of such success is an example and a help to humanity.[3]
Writing as an ex-slave and the embodiment of the self-made man, Douglass recognized the great opportunities that post-Civil War America, at least outside the Deep South, offered even to those of African descent to advance in its open-market economy. And a society that judges people on their accomplishments, Douglass believed, would be one in which less emphasis would be placed on race, ethnicity, or family pedigree. Thomas Sowell would echo Douglass's insights more than a century later when he explained that "in American society, achievement is what ultimately brings respect, including self-respect."[4] And in a similar vein, social scientist Peter Salins explained that "a society devoted to judging people mainly by their accomplishments is a society that, of necessity, places less stock on judging them by their [racial, ethnic, or religious] backgrounds."[5] The more meritocratic a society, Salins held, the greater the potential for transcending tribal parochialism and acknowledging excellence wherever it is found.
While the ideal of "careers open to the talents" clashed with certain powerful forces pushing in the direction of ethnocentrism, xenophobia, nepotistic favoritism, and racial exclusion, great strides were made in counteracting these forces after the Second World War. The national unity the war evoked and the struggle against Japanese militarism and Nazi race theory buttressed the meritocratic ideal, and did so nowhere more than in professional sports. In what we might call "the Jackie Robinson moment," the principle of "careers open to the talents" triumphed in professional baseball, known in its day as "America's pastime." Supported by strong leadership from Brooklyn Dodgers president and general manager Branch Rickey and National League president Ford Frick,[6] in combination with the changes that German and Japanese racial chauvinism had wrought in American public opinion, the racial integration of baseball and other sports was the most successful example of such integration in American history.
And to this day professional sports remain the model of meritocratic excellence and one to which those of us who strongly oppose the non-meritocratic features of today’s college admissions processes continue to appeal. “Professional sports remain one of the few areas of our national life and culture relatively uncontaminated by the erosion of standards,” writes former Commentary editor Norman Podhoretz. “There is still a fairly clear notion of what constitutes excellence in sports,” he explains, “and there isn’t much argument about the nature of the criteria.” Podhoretz continues:
What you’ve got here is a kind of passion for a world in which the standards are clear, excellence is relatively uncontroversial as a judgment of performance, and in which the major preoccupation is to do something supremely well, in which everyone has agreed that this is the major objective and in which the better the person or team does something, the more honor and the more and greater riches are likely to accrue to him. We want that kind of world, we see it in less and less pure form in other areas of our national life. The need and the hunger for such a world is what accounts for the passion that so many people, including me, feel about professional sports.[7]
Podhoretz himself attended Columbia as restrictions on the number of Jews admitted to Ivy League schools began to ease. But pure meritocracy long reigned at the City College of New York, whose admissions standards were based entirely on how well city residents had performed in the city's high schools. Not surprisingly, in its heyday, before its academic standing was devastated by a change to a non-competitive, open enrollment system in the late 1960s, no fewer than nine Nobel Prize winners were graduates of CCNY—all of them Jews.
Unlike CCNY, the Ivy League institutions imposed informal Jewish ceiling quotas when a huge surge of outstandingly well-qualified Jewish applicants in the 1920s caused a backlash among their WASP alumni. The attitude of many of these alumni is probably well represented by the following complaint of a turn-of-the-century Harvard graduate who visited campus in the mid-1920s for his twenty-fifth class reunion:
Naturally, after twenty-five years, one expects to find many changes [on the Harvard campus]. But to find that one’s University had become so Hebrewized was a fearful shock. There were Jews to the right of me, Jews to the left of me, in fact they were so obviously everywhere that instead of leaving the [Harvard campus] with pleasant memories of the past, I left with a feeling of utter disgust of the present and grave doubts about the future of my Alma Mater.[8]
The powers that be at Harvard responded very quickly to such complaints and made changes to the admissions criteria covertly designed to keep the proportion of incoming Jews below what it had risen to by the mid-1920s. Legacy preferences, special geographic preferences for those living outside the Northeast (where few Jews lived), preferences for athletes, and a new emphasis on "character" dramatically reduced the number of Jews at Harvard, thus enabling it to retain its dominant WASP character and its alumni financial support.[9] Similar and more overt ceiling quotas were imposed at several medical schools, including some outside the Ivy League, where potential Jewish dominance was seen as an even greater threat. At Boston University medical school, for instance, where high-achieving Jews constituted almost half of the entering class in the 1929-1930 academic year, Jewish enrollment was reduced to only an eighth (12.5 percent) of the entering class in the 1934-1935 academic year. In the 1930s, the blunt and caustic dean of Yale's medical school did not beat around the bush when instructing his admissions office to limit members of less desirable racial and ethnic groups: "Never admit more than five Jews, take only two Italian Catholics, and take no blacks at all."[10]
Cornell medical school, located in the state with by far the largest number of Jewish residents, followed a similar path. Jews were restricted to just ten of its eighty entering slots, despite the fact that more than half of the applicants were Jewish, many with outstanding qualifications. One in seven non-Jewish applicants was accepted, versus only one in seventy Jews. As sociologist Nathan Glazer explains, "boys seeking entry to medical schools took as a fact of life that bright Jews would be rejected in favor of much less bright non-Jews," and this was true "even when both were undergraduates at Cornell and knew perfectly well how one another stood in class."[11]
The Ivies had been ignoring the principle of academic merit that had a venerable pedigree in America going as far back as Thomas Jefferson's proposal for state-supported higher education in Virginia. They were, rather, continuing in a European tradition, one started in the nineteenth century in Czarist Russia and other parts of Eastern Europe, of numerically restricting the proportion of high-achieving Jews accepted to state institutions of higher learning. Most Jews in America were aware of the numerus clausus restrictions imposed on their forebears in Russia and dreaded the importation of the practice into America. But while the ideal of meritocratic selection took a severe hit in relation to the Jews in the 1920s, in the period immediately after the Second World War meritocratic admissions in American colleges and universities mostly followed the Jackie Robinson pattern. Selection to America's top colleges and universities became more open, more competitive, and more clearly based on academic talent rather than WASP pedigree or prestigious prep-school background.
In 1952 the average freshman at Harvard College had SAT scores in the high 500s, not much above the national average. Just eight years later, buoyed by rising numbers of applicants from around the country, the average SAT score of entering freshmen soared to 678 on the verbal and 695 on the math, more than one-and-a-half standard deviations above the national average. As the authors of The Bell Curve explain, “the average Harvard freshman in 1952 would have placed in the bottom 10 percent of the incoming class by 1960.” Indeed, Harvard “was transformed . . . into a school populated by the brightest of the bright, drawn from all over the country.”[12] And the same was mostly true for the other Ivy League institutions. Meritocracy was on the march.
The 1950s and 60s also saw increased numbers of African Americans admitted to Ivy League institutions (including such later Harvard superstars as Thomas Sowell). And by the late 1960s all the Ivy League institutions and most of the elite private colleges had opened their doors to women. The merit-only ideal reached its apogee in an executive order issued by President Lyndon Johnson in 1967 outlawing discrimination in federal employment and federal contracting on the basis of race, color, religion, sex, or national origin: "It is the policy of the United States Government to provide equal opportunity in federal employment and in employment by federal contractors on the basis of merit and without discrimination because of race, color, religion, sex or national origin."[13]
It was one of the grim ironies of the late 1960s, however, that it was in this very period that the "color-blind" ideal was turned on its head. "Without discrimination" came to be interpreted as "with discrimination," and an endless variety of programs in employment, university admissions, and government hiring introduced racial, ethnic, and gender quotas that guaranteed merit would be jettisoned in favor of goals that placed increasing importance on group identities and other non-meritocratic criteria.
As Nathan Glazer explained, with the passage of the 1964 Civil Rights Act, the nation declared that “no account should be taken of race, color, national origin, or religion in the spheres of voting, jobs, and education—in 1968 we added housing.” Yet no sooner had we made this national assertion, Glazer continues,
then we entered into an unexampled enterprise of recording the color, race, and national origin of every individual in every significant sphere of his life. Larger and larger areas of employment came under increasingly stringent controls so that each offer of a job, each promotion, each dismissal had to be considered in the light of its effects on group ratios in employment. Inevitably, this meant the ethnic group of each individual began to affect and, in many cases, to dominate consideration of whether that individual would be hired, promoted, or dismissed.[14]
And similar considerations went into whether an applicant to competitive colleges, universities, and professional schools would be accepted or rejected. It was advantageous to be able to check off "Hispanic" on one's application form, much better to be able to check off "Black," but a disadvantage to have to check off "White."
Asians experienced a reversal of fortunes. In the late 1960s and early 1970s many college admissions departments considered Asians an ethno-racial minority, and they sometimes received preferential treatment similar to that given blacks and Latinos. As their numbers began to increase as a result of changes in U.S. immigration laws, and as groups such as the Chinese, Koreans, and Asian Indians began their meteoric advance in academic achievement in the nation's high schools, they increasingly came to be treated as whites for purposes of college admissions. And by the 1990s, their advance had been so spectacular that the Ivy League schools surreptitiously instituted ceiling quotas much as they had done against the Jews a half century earlier.[15] While Asians came to represent 40 percent of the entering class at the California Institute of Technology—the lone merit-only holdout among elite institutions—all of the Ivy League colleges, despite being flooded with outstanding Asian applicants, kept the Asian share of incoming students to less than half that percentage. Like the Jews before them, Asians were often held to higher standards than members of other ethno-racial groups.
The spectacular Asian advance has also provoked opposition to the meritocratic ideal at some of our nation's elite public high schools, including Stuyvesant High School in New York City and Lowell High School in San Francisco. These schools were established specifically for the most intellectually gifted and the highest academic achievers, with admission determined by a standardized test and, in the case of Lowell, by junior high school grades and a written essay. At Stuyvesant and seven other New York City exam schools, admission by state law must be based "solely and exclusively by taking a competitive, objective, and scholastic achievement exam" known as the Specialized High Schools Admissions Test (SHSAT). Asians in recent years have constituted between 60 and 74 percent of the Stuyvesant student body, though Asians constitute only about 12 percent of the New York City public school population. There is, in the case of Asians in New York City today, almost an exact replication of Jewish overrepresentation in entrance to CCNY in the 1930s and 1940s, though Asians constitute a substantially smaller portion of New York City's total population than the Jews did when they dominated CCNY. Like the earlier Jewish population, many of the Asians are offspring of recent immigrants who do not speak English in the home and who come from very modest or even impoverished circumstances.
There have been strong protests against the admission procedures at Stuyvesant and other New York City specialized high schools largely, it seems, because Asians do so well on the SHSAT, while Hispanics and blacks, who together constitute almost 70 percent of the city's public school population, do very poorly in comparison. The test itself is a fairly standard assessment of verbal and mathematical achievement, but only a tiny proportion of Hispanics and an even smaller proportion of blacks attain scores comparable to those of the highest-achieving Asians. In 2019, Stuyvesant sent out 895 offers of admission to those with the very highest scores citywide on the SHSAT. Only 33 of these (3.7 percent) went to Latinos, and only seven (0.8 percent) went to blacks. Asians, in contrast, received 587 admission offers (65.6 percent) and whites 194 (21.7 percent).
Despite the great achievement of New York City's exam schools in turning out many high achievers—four Stuyvesant High School graduates have won Nobel Prizes, and countless other graduates of Stuyvesant and the other specialized exam schools gain entrance each year to Ivy League and other top colleges—New York City's mayor Bill de Blasio finds the skewed ethno-racial results intolerable. He has proposed scrapping the standardized test for entry to these schools entirely and replacing it with a "percent plan" under which the top seven percent of graduates from every middle school in the city would be guaranteed entry to one of the city's exam schools. Neighborhood ethnic patterns produce many schools dominated almost entirely by members of one ethno-racial group or another, and many of the schools dominated by Latinos and blacks have few students with academic records comparable to those of even the middle-to-high-scoring Asians and whites in certain other neighborhood schools. Implementation of such a "percent plan" would therefore guarantee that much larger numbers of blacks and Latinos gained entry to the elite exam schools at the expense of much more academically accomplished whites and Asians. Estimates place the reduction in Asian enrollment at over half. There would also, of course, be much lower overall academic standards at the schools.
Needless to say, the de Blasio plan has drawn fierce opposition from spokesmen for Asian groups as well as from members of the alumni organizations of the individual exam schools. Just as CCNY ceased to be an institution turning out superlative students once it scrapped its high standards for admission, Stuyvesant and the other specialized exam schools face a similar decline as institutions for educating New York City's most gifted should the de Blasio "percent plan" be adopted.
Despite the opposition of some groups to using academic talent and achievement as the sole or overwhelmingly dominant basis for admission to the nation's most prestigious educational institutions, public support for such meritocratic policies remains strong. Concern over "diversity" and ethnic representation principles, by contrast, is very weak. Many on the left are shocked by the polling data, which show substantial support for merit-only policies even among Latino and black respondents. A well-crafted Gallup survey asked the following question: "Which comes closer to your view about evaluating students for admission into a college or university: a) applicants should be admitted solely on the basis of merit, even if that results in few minority students being admitted; or b) an applicant's racial and ethnic background should be considered to help promote diversity on college campuses, even if that means admitting some minority students who otherwise would not be admitted?" In a national sample of adults taken in 2016, 70 percent chose the "solely on merit" option, including 76 percent of whites, 61 percent of Hispanics, and 50 percent of blacks. A nearly identical pattern appeared the three previous times Gallup asked the question (2001, 2007, and 2013).[16]
A more recent poll by the Pew Research Center (February 25, 2019) showed similar results. Among all American adults, 73 percent said that race and ethnicity "should not be a factor in college admissions decisions," including 78 percent of whites, 65 percent of Hispanics, and 62 percent of blacks. Only 4 percent of whites, 11 percent of Hispanics, and 18 percent of blacks said that race and ethnicity should be "a major factor" in admissions decisions. (The rest of those who supported the use of racial and ethnic considerations said they should be only "a minor factor.") The Republican/Democratic split was very different from what many would suppose, with Democrats supporting the "not a factor" choice by a substantial majority (63 percent) and Republicans supporting it overwhelmingly (85 percent).[17]
There is, of course, an enormous split on this issue between the opinions of ordinary citizens and those of certain elites, especially college presidents and many CEOs of large corporations. According to political theorist Michael Walzer, it is in part because of this disconnect that policies of racial preference in employment and education are so often shrouded in secrecy and lies. "In our culture," Walzer writes,
careers are supposed to be open to talents; and people chosen for office will want to be assured that they were chosen because they really do possess, to a greater degree than other candidates, the talents that the search committee thinks necessary to the office. The other candidates will want to be assured that their talents were seriously considered. And all the rest of us will want to know that both assurances are true. That’s why reserved offices in the United States [i.e., ethnic quotas] have been the subject not only of controversy but also of deception. Self-esteem and self-respect, mutual confidence and trust, are at stake as well as social and economic status.[18]
The meritocratic ideal remains strong among large segments of the general public, and the colleges and universities that trade in racial preferences know this—which is why they do all they can to conceal what they are up to. They can't acknowledge that it may be a huge advantage to be black when applying for a job or a place in a university, a huge disadvantage to be white, and often, in the college admissions context, a still greater disadvantage to be Asian.
Several years ago I wrote a book arguing that the continued existence of policies of racial and ethnic preference in university admissions and employment constituted "wounds that will not heal."[19] Nothing has changed since then, and it can be argued that racial and ethnic polarization in America has become even greater in the Obama and Trump eras.
Racial and ethnic preferences, and the difference between what might be called "outcome-of-the-game equality" and "entry-rights-to-the-game equality," are a major part of this continuing polarization. It was this second kind of guarantee—equal game-entry rights—that Branch Rickey introduced into major league baseball when he called up Jackie Robinson from the old Negro Leagues to join the Brooklyn Dodgers. Given that—for a host of reasons—(a) groups in a free society will always differ in their average outcomes and performance levels, and (b) many of the difference-generating factors are ones governments cannot or should not try to change, Charles Murray offers perhaps the fairest selection principle, one with deep resonance among the American public: "The best and indeed only answer to the problem of group differences," Murray writes, "is an energetic and uncompromising recommitment" to a policy of judging everyone "on his or her own merits."[20] It's the Jackie Robinson Principle, which the Brooklyn Dodgers organization pioneered in baseball almost three-quarters of a century ago. Robinson himself articulated it in the simplest terms in a 1972 interview with Dick Cavett: baseball players, Robinson said, "should be judged solely on their abilities and race shouldn't have anything to do with it."[21]
Murray calls it the principle of "individualism," and he says that at one time it was "thought to be un-American" to reject it. He was referring here to the "color-blind" period of the Civil Rights Era and its precursors in the thought of people like Frederick Douglass and Supreme Court Justice John Marshall Harlan. Murray urges that we return to what was previously considered the morally and socially advanced view on this matter. If recent polls are any guide, a substantial majority of Americans, if not their governing elites, have never wavered in their support of "individualism" as Murray understands it. The Jackie Robinson Principle was sound in 1947 and it is just as sound today. Careers should be open to the most talented, most accomplished, and most promising—nothing more and nothing less. Race shouldn't have anything to do with it. Everything else is commentary.
[1] Michael Young, The Rise of the Meritocracy (New Brunswick, NJ: Transaction Publishers, 1958/1994).
[2] When Mario Puzo told his mother of the enormous sum of money he had received for the sale of the manuscript to The Godfather, his mother, in her broken English, imparted to him the folk wisdom she had learned growing up in the semi-feudal conditions of Southern Italy: “Don’t tell nobody!”
[3] Frederick Douglass, "Self-Made Men," published in 1872, www.monadnock.net/douglass/self-made-men.
[4] Thomas Sowell, Black Rednecks and White Liberals (San Francisco, CA: Encounter Books, 2005), 63.
[5] Peter Salins, Assimilation, American Style (New York, NY: Basic Books, 1997), 53, 61.
[6] From a 1972 interview with Jackie Robinson on the Dick Cavett show, https://www.youtube.com/watch?v=YCr0RAzf8ds. Confronting a potential strike among players opposed to an African-American in the Major Leagues, Frick declared: "I do not care if half the league strikes . . . This is the United States of America and one citizen has as much right to play [professional baseball] as another." Cited in Jonathan Eig, Opening Day: The Story of Jackie Robinson's First Season (New York, NY: Simon and Schuster, 2007), 95.
[7] Norman Podhoretz, cited in Michael Novak, The Joy of Sports (New York, NY: Basic Books, 1976), 175-176.
[8] Jerome Karabel, The Chosen: The Hidden History of Admission and Exclusion at Harvard, Yale, and Princeton (New York, NY: Mariner Books, 2006), 97.
[9] See on this Jerome Karabel, The Chosen: The Hidden History of Admission and Exclusion at Harvard, Yale, and Princeton (New York, NY: Mariner Books, 2006).
[10] See the Wikipedia entry under “Jewish Quota-United States.” The medical school dean in question, Milton Winternitz, was himself a Jew and, one suspects, needed to prove that he was fully supportive of the wishes of Yale’s mainly WASP faculty, students, and alumni.
[11] Nathan Glazer and Daniel Patrick Moynihan, Beyond the Melting Pot, Second Edition (Cambridge, MA: M.I.T. Press, 1970), 156.
[12] Richard Herrnstein and Charles Murray, The Bell Curve (New York, NY: The Free Press, 1994), 30.
[13] Executive Order 11375 (1967) issued by President Lyndon Johnson.
[14] Nathan Glazer, Affirmative Discrimination (Cambridge, MA: Harvard University Press, 1975/1987), 31.
[15] See on this Ron Unz’s article “The Myth of American Meritocracy,” The American Conservative, November 28, 2012. Pay special attention to the graph of the proportion of Asians at the eight Ivy League schools and at CalTech over the time period from 1990-2011.
[16] Frank Newport, "Most in U.S. Oppose Colleges Considering Race in Admissions," Gallup, July 8, 2016.
[17] Nikki Graf, Pew Research Center, February 25, 2019, survey of 6,637 adults, www.pewresearch.org.
[18] Michael Walzer, Spheres of Justice (New York: Basic Books, 1983), 152-153.
[19] Russell Nieli, Wounds That Will Not Heal (New York: Encounter, 2012).
[20] Charles Murray and Richard Herrnstein, "Race, Genes and I.Q.—An Apologia," The New Republic, October 31, 1994.
[21] See the YouTube presentation under the title “Jackie Robinson on the Dick Cavett Show, 1972,” cited in footnote 6.
Image: Tyler Nix, Public Domain