Association of American Law Schools Annual Meeting
In 1932, an economist, a political scientist, and a lawyer walked into the New York governor’s mansion …
I promise this isn’t the start of a bad joke.
The governor, in this case, was Franklin Delano Roosevelt, and the three visitors were Columbia professors Adolf Berle, Rexford Tugwell, and Raymond Moley. With the nation reeling from the catastrophic fallout of the Great Depression, Roosevelt convened these men in the hope that their collective academic expertise might aid him in articulating a set of interventions that would serve the country and, of course, help him win the presidency.
The New York Times called the group Roosevelt’s “brains trust.” Others called it a joke.
The idea that college professors had anything to contribute to sound policy development struck many Americans at the time as utterly implausible. Roosevelt disagreed, memorably quipping: “The use of brains in the national government seems to me to be a pretty good practice.”
Time proved him right.
The “brains trust” went on to play a critical role in devising the architecture of the New Deal, and set the stage for the extensive post-war partnership between universities and the federal government.
It was a turning point—one of many in the history of American higher education, which began with a handful of fledgling colonial colleges and has expanded into the vast network of institutions that today stand as the envy of the world. And, as I will argue this morning, as that network has evolved, American higher education has accumulated a constellation of functions that have nourished and enriched liberal democracy.
In classrooms, our colleges and universities train democratic citizens.
In admissions and financial aid offices, they promote social mobility.
On their campuses, they create microcosms of pluralist society.
And in labs and libraries, faculty discover and disseminate new knowledge into the public sphere.
Through these functions, the university has come to stand alongside other core institutions like the free press, civic organizations, and an independent judiciary as one of the bulwarks of liberal democracy. The university did not give birth to liberal democracy, but it has become indispensable to its flourishing. And today, as this room knows well, liberal democracy is confronting a series of challenges that many regard as profound, destructive, perhaps even existential.
The rising tide of anti-intellectualism and populism across the globe has fueled support for demagogues and authoritarians. In our own country, popular support for “army rule” is at its highest point in decades. And, for the first time in more than 70 years, the number of countries veering into autocracy now outnumbers those that are democratizing.
This backsliding is threatening not only individual nations but the post-war international order that liberal democracy forged, which has—even if imperfectly and fitfully—helped to usher in a healthier, more equitable, and more peaceful world.
Yet the allure of authoritarians and populists is not enough to explain liberal democracy’s vulnerabilities. The causes are many and varied. Plunging trust in democratic institutions. Rising partisanship. An alarming retreat into identity-based resentment and fear. Extensive disinformation campaigns that have put the reliability of basic facts into question. And a yawning and growing gap between rich and poor.
These trends cannot help but call into question the effectiveness of our core institutions in protecting the vitality of our democratic experiment. And clearly there has been no shortage of ink spilled by our colleagues on the frailties and failures of our bedrock institutions—the elected branches of government, media, and the judiciary—in securing liberal democracy’s promise.
But what about the academy itself? If we are, in fact, an indispensable institution for—and in—modern democracy, how well have we discharged our solemn responsibilities? Can we confidently claim a high ground in this moment of growing democratic malaise? I fear not.
So today, in the spirit of searching, dispassionate scrutiny that is the hallmark of our enterprise, I want to focus on four critical ways—civic education, pluralism and debate, social mobility, and knowledge creation—in which the university ought to foster liberal democracy, to identify the lapses in our performance, and to reflect on possible correctives.
In offering these remarks, my frame is the university writ large. But given that I suffer from the severe limitation of being a former law professor and law dean who is charged with leading a university that (sorely) lacks a law school, I am hoping that my co-panelists will be able to amplify these themes, or identify ones I have missed, that have particular salience for legal education.
So, let me begin with the four key ways in which universities promote liberal democracy. The first is in the training and education of citizens.
Since the founding of this country, there have been calls for higher education to play a role in the formation of democratic citizens. In fact, George Washington was so committed to this idea that he devoted a substantial portion of his will—and a handsome sum of money—toward the founding of a university that would train young people in “the principles of politics and good government.”
American higher education has at times strived to achieve the ideal that Washington and others voiced. But it has done so imperfectly, tending to privilege certain aspects of citizenship at the expense of others.
For most of the nineteenth century, colleges sought to develop students’ moral faculties through a curriculum that culminated in a capstone course in moral philosophy. In the late nineteenth and early twentieth centuries, newly established research universities championed scientific reasoning as the cure for society’s ills and disciplines like political science emerged to teach citizens and train civil servants.
During World War I, government-mandated “War Issues” courses tried to balance free inquiry with patriotic drum-beating. These were short-lived, but they opened the door for the ambitious general education programs that flourished after World War II, which sought to inculcate democratic values and civic knowledge in students through a common set of texts and ideas.
Since the 1980s, however, the dominant paradigm for civic education at colleges and universities has shifted to service learning. This refers, broadly, to the array of opportunities for students to participate in direct volunteer service to local communities and causes. In my estimation, this movement has been a truly important one, and has done much to strengthen connections between students and the communities of which they are a part.
But as the source of a civic education, it is incomplete. It leaves untouched knowledge of democratic history and political institutions, as well as many of the skills needed to engage those institutions effectively and create lasting change, limitations that even proponents of service learning concede.
Law schools are one of the few places in the university firmament that have long done the work of training students rigorously in the finer points of civic knowledge and democratic practice, while also forging connections with communities through the clinic model. Yet since only a small fraction of undergraduates attend law school, the full burden of fostering the aptitudes of good democratic citizenship in college students has rested with our undergraduate programs. And in this charge, we are falling short.
Pluralism & Debate
A second way in which colleges and universities promote liberal democracy is by creating an environment that is a microcosm of pluralistic democracy and then encouraging students to interact with one another across differences.
Enshrined in the very idea of liberal democracy are, of course, the twin principles of expressive freedom and the protection of the rights of minorities. These ensure, in theory, a minimum guarantee of peaceful co-existence.
But genuine pluralism is more than mere co-existence. It requires interaction, dialogue, and vigorous contestation of values and ideas across a vast spectrum of experiences to forge democratic compromise, consensus, and will.
Historically, colleges and universities have been among the first institutions to offer citizens the opportunity to leave the communities in which they grew up and interact with others from different racial, religious, regional, socioeconomic, and political backgrounds. And over the decades, we have made undeniable strides in expanding access and addressing inequities, although that progress has admittedly come too slowly and remains unfinished.
Our campuses in 2020 are far more diverse than in past eras. And yet, we do not fully or adequately encourage among students the interactions and exchanges across differences that are at the foundation of healthy democracies.
Indeed, universities in this moment are doing less than ever to draw students together.
We allow students to choose their own living arrangements, their own dining options, and their own classes. We have essentially given them a pass to opt out of encounters with people dissimilar from themselves. And even when those encounters do occur, there is a growing sense that they are superficial and fleeting, presenting little opportunity for self-reflection and reasonable, substantive disagreement.
This is representation without meaningful engagement. And it hasn’t always been the case. In the nineteenth century, the extra-curriculum at colleges was defined by literary and debate societies, which taught students the arts of dialogue and debate, arts largely missing from our colleges today.
By asking ourselves how we create robust pluralistic communities on our campuses we can, I believe, go beyond and beneath the hot-button topics of whether campus culture is chilling speech or whether it’s permissible to shut down speakers.
This pluralistic frame encourages us instead to consider how we structure programs, campuses, and classrooms to promote what Harvard president Charles Eliot over a century ago described as a “collision of views … that teaches young people … candor, moral courage, and independence of thought.”
Moving on to my third point: Universities promote liberal democracy through social mobility.
Liberal democracy is premised on the notion that all people, regardless of the station of their birth, should have the opportunity to climb the social and economic ladder on the basis of their innate aptitudes and efforts.
In the 1830s, Alexis de Tocqueville celebrated mobility as essential to the democratic experiment. He rhapsodized that the United States was a place where the constant rise and fall of individual fortunes introduced “a host of ideas, notions, and desires to people that they would not have had if ranks had been fixed.” A century later, the historian James Truslow Adams codified this idea in the indelible term, the “American Dream.”
But we know that the dream of equal opportunity is more elusive than ever for many in contemporary America. Not only does the United States lag behind many of its peer nations in rates of mobility, but it has also experienced a notable decline in mobility at the upper and lower extremes of the income spectrum.
For most of their history, colleges and universities were moving toward becoming the “practical equalizers of society,” as the famed minister Lyman Beecher called them in 1836. Up until the 1980s, access to college for people of all socioeconomic backgrounds was gradually expanding. This was achieved through the creation of public university systems and community colleges, through visionary legislation like the GI Bill and the Higher Education Act, and through the efforts of universities themselves to invest in financial aid.
But in the last 30 to 40 years, states have scaled back financial support for higher education; federal funding has stagnated and lost focus; and universities have embraced admissions practices that too often advantage wealthy students and disadvantage poor ones. These trends have unfortunately accelerated the stratification of higher education. To take one sobering statistic: most of the top universities in the country enroll more students from the top 1 percent of the income spectrum than from the bottom 60 percent.
Among university admissions policies, one of the most pernicious is legacy preferences, the bump given to the children of alumni at most selective colleges and universities in the United States.
Although academic institutions deeply value the contributions and commitments of their alumni, the practice of legacy preferences has had serious ramifications for our ability to launch people up the social ladder. More than just a nudge, these preferences can, at some institutions, be the equivalent of a 160-point boost on the SAT (out of 1600). At several elite universities, legacy applicants are, on average, three to six times as likely to be admitted as their non-legacy peers.
And this preference is often conferred on children who typically have had every possible benefit before attending college—stable homes, safe neighborhoods to grow up in, excellent education, and ample extra-curricular experiences. Nor are legacy students a trivial fraction of the student body at these schools. At some, legacies comprise up to a quarter of incoming students each year.
Along with events like last year’s admissions scandal, the persistence of legacy preferences is fueling the perception that colleges are bastions of entrenched privilege, and putting our role as the “practical equalizers of society” further at risk.
Facts & Checking
My fourth and final point is this: Universities promote liberal democracy through the discovery, interpretation, and dissemination of facts.
Liberal democracy depends on the availability of reliable knowledge and a shared sense of truth. It is how citizens make informed decisions as voters and community members; how legislators develop rational public policy; and how checking institutions like the free press hold leaders and governments to account.
Facts, in short, are essential to a well-functioning democracy. (And we can thank legal scholars for giving us the notion of the modern fact in the first place: the concept of indisputable “matters of fact” first emerged in sixteenth-century British jurisprudence before the sciences caught hold of the idea.)
Since the founding of our first research universities in the 1870s, American higher education has been among the most important institutions for credentialing expertise; for conducting advanced research; and for unearthing, preserving, and disseminating facts.
In time, democratic societies—despite periods of regress, as during the McCarthy era—came to embrace universities as beacons of truth. And over the decades, government support of research across the natural sciences, social sciences, and humanities has unleashed countless discoveries and strengthened the university’s role as an anchor for democratic life.
Yet, here, too, this relationship has frayed. Questions have surfaced from within and without the university about the objectivity, legitimacy, and accuracy of the academy as a locus of truth and facts.
Today, we are confronting ingrained public suspicions about the supposed partisan biases of faculty, frequent reports of scientific experiments that cannot be reproduced, and floods of questionable research proliferating through predatory journals and bogus articles. And all this at a moment when the public is inundated with information and suspicious of expertise.
Through each of the four functions I have just outlined—civic education, pluralism & debate, social mobility, and knowledge creation—the university has secured its position as indispensable to the modern liberal democratic experiment. But these are also the very sites where it has stumbled.
If the university is to continue to vindicate its role as a core institution in liberal democracy, it must renew and reimagine these commitments.
For some of the areas I have discussed, the path ahead is, if not clear, then at least visible.
To be engines of social mobility, for example, universities should begin by critically examining admissions policies that disadvantage low-income students—starting with the elimination of legacy preferences. This is not an impossible task. Over the past several years, my own institution, Johns Hopkins, has already ended them.
Likewise, if universities are to reclaim their status as trusted arbiters of truth and fact, they should commit themselves to greater transparency in the research methods that faculty employ and the data they use, as well as more rigorous standards of publication review.
For the other areas I’ve discussed today—citizen formation and pluralistic interaction—the solutions are less apparent. But that should not deter us from this work of repair. Ours is a moment that calls for experimentation, research, and new thinking. And green shoots are already beginning to emerge.
On the civic education front, for instance, the University of Virginia’s School of Arts and Sciences this year approved a new general education curriculum that uses interdisciplinary coursework to foster 21st-century citizenship. And in a step towards encouraging more interaction among students from different backgrounds, Duke University ended its policy of allowing first-year students to choose their own roommates, and will be working with psychologists and other researchers to track this new policy’s effect on students.
Of course, in order to develop meaningful and lasting solutions to these issues, institutions must draw insights and experience from every facet of the university. Our law schools, in particular, must play a critical role in this process: not only have you grappled directly with many of these problems, but you also possess the disciplinary expertise, the profound understanding of the challenges we face, and the vision and will to make real institutional change happen.
But one last question hangs in the air: Should the university be committed to the democratic project at all?
In 1899, William Rainey Harper, first president of the University of Chicago, made the provocative claim that the university is an institution “born of the democratic spirit” that has an innate responsibility to the democratic project.
I passionately believe that this claim still holds.
The university, as it has developed in this country over the past two centuries, enriches and is enriched by democracy; it is intertwined with democracy’s values and its ends.
Throughout its history the university has, at its best, stood in support of this project, responding to the exigencies of the world with vigor and vision. Now is a time when it ought to lean into its capacities to strengthen liberal democracy, and not shy away from them.
Declaring that the university has a responsibility to support democracy is not to say that it should take positions on particular causes, align itself with partisan politics, or step outside its academic mission.
Rather, from the way we manage admissions and aid, to the way we build the curriculum, foster the culture on our campuses, and conduct and disseminate research, we must acknowledge that universities are either contributing to or failing to contribute to liberal democracy, which has fueled so many of the most precious advances of modernity.
Our universities and, within them, our law schools, must act deliberately, with mission and, most importantly, with urgency.
The stakes are simply too high to do otherwise.