January-February 2003

The Radicalism of the Liberal Arts Tradition

Can liberal education survive in a university increasingly committed to the ideals of the market, the corporation, and the entrepreneur?


For some time now, critics of American higher education have depicted it as caught up in a cultural war between politically correct leftists inside the university and neoconservative curmudgeons outside it. According to this account, the curmudgeons argue that the pursuit of free intellectual inquiry—the traditional mission of the university—is under unprecedented attack from prissy speech codes and politicized professors unconcerned with older standards of objectivity. The leftists respond that curricula are more diverse, open, and vital than ever before.

This argument obscures more than it clarifies. Contrary to received opinion, the chief threat to intellectual freedom in the academy is not political correctness, though the tyranny of various ideological fashions (right and left) is real, and can be oppressive. The main menace is market-driven managerial influence: the impulse to subject universities to quantitative standards of efficiency and productivity, to turn knowledge into a commodity, to transform open sites of inquiry into corporate research laboratories and job-training centers.

The attempt to turn universities into businesses challenges the conservative understanding of the humanities. If the liberal arts tradition is understood as a worldview, rather than a collection of courses, it poses a radical challenge to the managerial impulse—far more radical than self-proclaimed traditionalists like former secretary of education William Bennett realize. If we want to sustain and revitalize our concept of "what the university is for," we need to recognize the radicalism of the liberal arts tradition.

In the early decades of the last century, the great American philosopher William James penned a capacious definition of the liberal arts tradition in his essay "The Social Value of the College-Bred." The gentility of the essay's title belies the radicalism of its implications. This is part of what James had to say:

You can give humanistic value to almost anything by teaching it historically. Geology, economics, mechanics are humanities when taught with reference to the successive achievements of the geniuses to which these sciences owe their being. Not taught thus, literature remains grammar, art a catalogue, history a list of dates, and natural science a sheet of formulas and weights and measures.

The sifting of human creations!—nothing less than this is what we ought to mean by the humanities . . . . Studying in this way, we learn what types of activity have stood the test of time; we acquire standards of the excellent and durable. All our arts and sciences and institutions are but so many quests of perfection . . . and when we see how diverse the types of excellence may be, how various the tests, how flexible the adaptations, we gain a richer sense of what the terms "better" and "worse" may signify in general. Our critical sensibilities grow both more acute and less fanatical . . . . What the colleges . . . should at least try to give us, is a general sense of what, under various disguises, superiority has always signified and may still signify. The feeling for a good human job anywhere, the admiration of the really admirable, the disesteem of what is cheap and trashy and impermanent—this is what we call the critical sense, the sense for ideal values. It is the better part of what men know as wisdom.

A Frame of Mind

James was not talking about creating what we now call "intellectuals," though he uses that word in his essay. He was talking about people who aimed to sustain the capacity for independent thought. Nor was James striking the familiar stance of the threatened humanist in a world of machines. His was not a backward-looking vision. "We must shake the old double reefs out of the canvas," he said, "into the wind and sunshine, and let in every modern subject, sure that any subject will prove humanistic, if its setting be kept only wide enough." The liberal arts tradition, in other words, was not a specific set of subjects but a habit of thought, a frame of mind.

This frame of mind can most accurately be characterized in phrases that have come to seem threadbare: the effort to cultivate discriminating sympathy, to combine a capacity for appreciation with the critical spirit. This is what makes teaching a subversive activity and the university a shelter for intellectual freedom. The platitudinous language of the college catalog refers to a worthy and fragile ideal—one that has rarely been in more danger than it is right now, in the can-do corporate culture of the United States.

Our current plight can be traced back past the 1960s to the early twentieth century, when universities began to embrace the Prussian ideal of productive scholarship. The production model required that the completion of scholarly tasks be certified by certain documents, called "degrees." Prussian productivism melded with American vocationalism and anti-intellectualism—the love of the practical, the demand for cash value. The result was the accentuation of a fundamental conflict in the university's mission between furthering the pursuit of truth and serving the needs of established power. The modern American university was to continue to preserve a place for the free play of ideas, but also to provide technical expertise for government and business elites.

The marriage of Prussian productivism and American vocationalism produced a monstrous spawn. James called it "the Ph.D. octopus." He meant the proliferation of credentials increasingly required for admission to the managerial professional class, credentials that all too often became substitutes for substantive thought. Many graduate degree programs were characterized from the outset by a robust anti-intellectualism. This was especially marked in graduate business education. When it was suggested to the dean of a midwestern graduate business school in 1907 that he offer courses on the problems of trade unionism, he said, "We don't want our students to pay any attention to anything that might raise questions about management or business policy in their minds." Political correctness is nothing new on American campuses.

James knew that even management theory could be taught as one of the humanities; it did not have to be shackled to a narrow utilitarianism. But that is in fact what happened in the first half of the twentieth century, as soft and hard science alike acquired unprecedented prestige in the academy. One reason for their phenomenal growth was that they could serve as handmaidens to corporate and state power, as sources of data for the sorting and categorizing institutions that were increasingly channeling "human resources" into productive purposes. Indeed, the very term "human resources" (like "human capital") was a coinage of the utilitarian consensus. The problem with this utilitarianism was its frequent detachment from any overarching aims and purposes; a focus on technique distracted attention from larger goals. Means obscured ends. The nuclear arms race was the classic illustration of the technocratic madness that the novelist Herman Melville captured in summarizing his character Ahab's self-knowledge: "All my means are sane; my motive and my object are mad."

The liberal arts began, in this atmosphere, to seem dilettantish by comparison to the hard-nosed managerial disciplines, with their problem-solving ethos and their convincing simulation of a scientific spirit. World War II and the Cold War hastened the triumph of the managerial outlook. Reading itself became a weapon of the Cold War, as Reader's Digest explained in an essay titled "Why Johnny Can't Read and Ivan Can." The reason for this alarming state of affairs, according to most critics, was a bland agenda of "life adjustment," wherein students received credit for solving adolescent social problems instead of learning basic skills. The Soviet launching of the Sputnik satellite in 1957, presidential candidate John Kennedy's shrill warnings about a (nonexistent) "missile gap," and President Kennedy's insistence that the United States would beat the Soviets to the moon—all of these developments intensified anxiety about the state of American education and ensured that reform efforts would take a narrowly technocratic direction. The national security state combined with corporate behemoths to promote an impoverished, utilitarian conception of the university's social role. For many educators, the university became a "knowledge factory," the phrase that University of California, Berkeley, chancellor Clark Kerr used in a 1959 commencement speech to describe his campus and other comparable research universities.

Language of Resistance

Maybe it was no accident that some of the most raucous and best-documented student protests of the subsequent decade occurred at Kerr's own university. A lot of anti-intellectual nonsense was involved in the protests, at Berkeley and elsewhere, but the anti-intellectualism was really a sideshow. The main events were the civil rights and antiwar movements and the counterculture they energized—and these movements demonstrated the radical strength of the liberal arts tradition. They showed its uses as a resource for resistance to illegitimate power. The slogan of the free speech movement, which was the initial phase of student protest at Berkeley, underscored the relationship between education and politics: "I am a human being. Please do not bend, fold, or mutilate." Those words, worn on protest buttons, were adapted from the instructions on the cards that students used to register for courses back in those days of primitive mainframe computers. They signified the dawning awareness of the connections between a foreign policy dictated by technocratic imperatives and an educational policy dedicated to batch-processing students. As the awareness spread beyond Berkeley, it developed into a forceful, complex critique of bureaucratic rationality. That critical perspective remains the most enduring legacy of the 1960s.

Contemporary critics of the university have joined the national ritual of the last twenty years: trashing the sixties, reducing a multifaceted cultural movement to the self-indulgent gropings of overprivileged kids at elite universities. Let me suggest a different perspective, even if I have to risk the confessional mode to do it.

I went to a conservative, southern school, the University of Virginia. I joined the Naval Reserve Officer Training Corps. I never went near a Students for a Democratic Society meeting. And yet I was profoundly affected by the antiwar counterculture. For me at least, the second half of the 1960s was a great time to study the humanities. It was probably also a great time to teach the humanities. Enrollments were soaring in philosophy, literature, and history. The Vietnam War made us confront urgent ethical dilemmas. It made us ask ultimate questions about meaning and purpose in our lives. Sometimes we asked those questions sophomorically. (Many of us, after all, were sophomores at the time.) Yet we challenged the implicit denial of meaning and purpose that was embedded in the managerial ethos of our policy makers. We read canonical authors, who our professors assured us were "major": Melville, Faulkner, Shakespeare. They helped us understand what we were up against: the proud man's contumely, the insolence of office. They helped us challenge that insolence; they gave us the language of resistance. "Poetry makes nothing happen," the poet W. H. Auden said. He was mistaken. In my own mind and those of my contemporaries, poetry made something happen. Tradition proved it had a radical edge.

Now that edge is considerably duller, worn away in part by the big lies of Bennett and other pseudotraditionalists. Humanities enrollments are down. Who wants to study a collection of stodgy, unchanging masterpieces preserved in amber? Who wants to study dead white males who do nothing but shore up the status quo, who teach us we're living in the greatest country in the world—no, the greatest country in the history of the world? That is not what the liberal arts tradition does, despite the mistaken assumptions animating Bennett's discussion of the Federalist Papers, or former National Endowment for the Humanities chair Lynne Cheney's vision of the American history curriculum. Anyone who bothers to investigate will find that the humanities curriculum is more capacious and vital than it was in the 1960s, because it includes nonwhite, nonmale, and non-Western texts. But on most campuses consumer demand is elsewhere.

Market Pressures

The reasons for this change are complex. After the Vietnam War and especially after the collapse of the Soviet Union, the link between national security and the knowledge factory loosened a little. The private sector and its needs came to the fore. In such documents as A Nation at Risk, the much-cited report issued by the National Commission on Excellence in Education in 1983, the university was charged with failing to produce and prepare the labor force of the twenty-first century. By the end of the 1980s, the market had acquired the stature accorded to God in medieval theology—the Primum Mobile, the First Cause, the Unmoved Mover, the oracle to which all questions had to be referred. And universities fell into line with the resurgent managerial ethos. They began to behave more like corporations.

The evidence for this change is all around us. Some of the most egregious examples involve "cybermania," the current fascination with the virtual classroom. No one can deny the value of Internet databases, course Web sites, and the like. Out of the technological ferment much good will come. There will also be a lot of waste. At every major university in the country, one can still see enormous closed-circuit television sets looming over lecture halls, gathering dust, never used, relics of an earlier period of technophilia. Much of the expensive hardware we are now purchasing will one day meet the same fate (indeed, some of it already has).

But the problem with cybermania is subtler than the squandering of resources that might better be used elsewhere. It involves, as well, an attempt to substitute technology for the human interchange that happens (or should happen) in the classroom. Any use of computers that undermines face-to-face contact is potentially destructive to education. Distance learning is to learning as phone sex is to sex: it may be better than no learning at all, but you wouldn't want to confuse it with the real thing.

It is important to see administrators' enthusiasm for distance learning as part of a long-term strategy for containing labor costs in the academy. As early as 1994, Educause, a consortium of 1,600 academic institutions and 150 corporations, produced a "national learning infrastructure" initiative, according to "The Kept University," an article by Eyal Press and Jennifer Washburn that appeared in the Atlantic Monthly in 2000. The initiative was a detailed study of what professors do, breaking the job into discrete functions that could be automated or outsourced for "productivity enhancement." In the university, as in the corporation, information technology is a means of controlling budgets (provided that professors don't fight back by demanding copyright over their lectures and classroom materials).

The drive to enhance productivity is even more apparent in universities' turn to part-time and temporary employees who do not require expensive benefits packages. The adoption of this familiar corporate tactic has transformed the status of the faculty. Since 1975, the number of non-tenure-track jobs has increased by 88 percent, while tenure-track positions have declined by 9 percent. The epidemic of insecurity in the academic job market has intellectual as well as economic consequences. Consider the recent comments of the distinguished anthropologist Clifford Geertz, in an address to the American Council of Learned Societies. His own academic work, he reminisced, had been "mercurial, various, free . . . and not all that badly paid." But now that kind of "errant career" seems less and less possible:

There does seem to be a fair amount of malaise about, a sense that things are tight and growing tighter . . . and that it is probably not altogether wise just now to take unnecessary chances, strike new directions, or offend the powers. Tenure is harder to get . . . and the process has become so extended as to exhaust the energies and dampen the ambitions of those caught up in it . . . . All I know is that, up until just a few years ago, I used to tell students and younger colleagues that they should stay loose, take risks, resist the cleared path, avoid careerism, go their own way, and that if they did so, if they kept at it and remained alert, optimistic, and loyal to the truth, my experience was that they could . . . have a valuable life, and nonetheless prosper. I don't do that any more.

Contrary to contemporary business mythology, economic insecurity does not promote intellectual creativity. Absence of tenure does not necessarily help people keep their edge, in or outside the academy.

Shortsightedness

But the managerial mentality is not really concerned with the quality of intellectual life. Dependence on short-term faculty is a symptom of a short-term managerial perspective, an emphasis on quick payoffs, which has the same impact on the university that it does on other workplaces: the erosion of loyalty and of long-term commitments. Temporary faculty have no incentive to develop long-term relationships with students or to acquaint themselves with the enduring goals of the university—if there are any enduring goals left.

The short-term perspective has an equally corrosive impact on research, even in laboratory science. Paul Berg, a Nobel Prize-winning biochemist at Stanford, laid the crucial foundations for splicing DNA to make hybrid molecules—research that led to the creation of the first recombinant DNA clones. This is the sort of work that underlies many partnerships between universities and biotechnology firms. But, according to Press and Washburn, Berg observes that

[T]he biotech revolution itself would not have happened had the whole thing been left up to industry. Venture-capital people steered clear of anything that didn't have obvious commercial value or short-term impact. They didn't fund the basic research that made biotechnology possible.

In the past, freedom from market pressures allowed the university to fund the kind of research that had no immediate commercial application but ultimately led to unexpected rewards. Can this sort of open-ended curiosity survive in a cultural climate that encourages professors to think like entrepreneurs? It can survive only if universities refuse to tailor their research agendas to the needs of industry and reassert the core value of the liberal arts tradition: the pursuit of truth for its own sake.

Market constraints threaten the curriculum as well. Reports from Framingham State College in Massachusetts and George Mason University in Virginia show just how serious the situation has become. Framingham State dissolved its chemistry and philosophy departments. George Mason identified degree programs in German, Russian, classics, and other humanities departments as targets for extinction. The justification for these draconian measures is simple: consumer demand. Other disciplines (or pseudodisciplines) have burgeoning enrollments: computer science, mass communications, and management training are packing them in. The humanities are simply not maintaining a market share.

There are two problematic assumptions underlying this market-based pedagogy. The first is that students are sovereign consumers. This is absurd. Students are no more sovereign than any other consumer in an oligopolistic economy. In fact, as most professors have observed, students are often passive and nearly inert, bumping aimlessly from one requirement to another, fearful and confused about the future, anything but avid consumers. The second assumption is that the faculty have nothing of their own to offer—no independent authority or disciplinary tradition. They are merely employees in a bureaucratic service economy, according to the corporate mantra, training the labor force of the twenty-first century—unless, of course, they manage to become Internet entrepreneurs, marketing Shakespeare online to retirees in Palm Beach. To do that they have to embrace the market model themselves, to fight for intellectual copyright and transform the notion of a "university community" into an entrepreneurial war of all against all.

Academic Crisis

So this is what we are up against in the fight to preserve and vivify the life of the mind in the university—not a handful of old elitists, as leftist academics charge, but a mob of middle-aged managers. And both sides in the culture wars ignore them. Meanwhile, as the character Willy Loman in Death of a Salesman said, "The woods are burning." Faculties are debating whether to add another course in women's studies, another course in African American studies, while, at some institutions at least, whole departments and disciplines, with long and rich intellectual traditions, are being eliminated. The situation is serious. What is to be done?

We humanities faculty need to examine our own sacred cows and not just ritually defend them. And that includes the sacred cow of tenure. I am a product and beneficiary of tenure; I support it wholeheartedly. But we need to think about how the original rationale for tenure has been undermined by the managerial ethos. Tenure was meant to protect professors with unpopular ideas from troglodyte legislators and pea-brained administrators. But if we are not engaged in sustaining, criticizing, and debating intellectual traditions, if we are merely providing vocational training on consumer demand, the original rationale for tenure becomes problematic, and we have to formulate a new one or find some honest way to exhume the old one.

The contemporary academic crisis is not about job security any more than it is about how many classes are online or which departments get the most resources. It is about the attitudes we take to our most important audience, a nonacademic audience. Professors are constantly berating themselves and being berated for withdrawing into the insular world of scholarship, for not connecting with the real world. The real world is right in front of us, in the classroom; it is composed of students, 99 percent of whom have no intention of entering the academy themselves. They are a nonacademic audience, and they require us, however implicitly and imperfectly, to become public intellectuals.

In thinking about how to approach the challenge they pose, we might consider some observations made by the Kentucky farmer and writer Wendell Berry in his essay "Discipline and Hope." He distinguished between field crops and tree crops and pondered how that distinction might inform an educational philosophy:

An index of the health of a rural community—and, of course, of the urban community, its blood kin—might be found in the relative acreages of field crops and tree crops. By tree crops I mean not just those orchard trees of comparatively early bearing and short life, but also the fruit and nut and timber trees that bear late and live long. It is characteristic of an unsettled and anxious farm population—a population that feels itself, because of economic threat or the degradation of cultural value, to be ephemeral—that it farms almost exclusively with field crops, within economic and biological cycles that are complete in one year. This has been the dominant pattern in American agriculture. Stable, settled populations, assured both of an economic sufficiency in return for their work and of the cultural value of their work, tend to have methods and attitudes of a much longer range. Though they have generally also farmed with field crops, established farm populations have always been planters of trees.

Good teaching is an investment in the minds of the young, as obscure in result, as remote from immediate proof as planting a chestnut seedling. But we have come to prefer ends that are entirely foreseeable, even though that requires us to shorten our vision. Education is coming to be, not a long-term investment in young minds and in the life of the community, but a short-term investment in the economy. We want to be able to tell how many dollars an education is worth and how soon it will begin to pay.

Calculating the payoff may be possible for field crops, but not for tree crops. And students are a kind of tree crop. In a managerial age of short-term perspectives, of techno-fixes and five-year plans, this approach to teaching and learning will always (I hope) be necessary and honorable, regenerative and even radical. This is what we need to remember if we want the liberal arts to remain a living tradition, and not just a dream some of us had.

Jackson Lears is Board of Governors Professor of History at Rutgers University and editor of Raritan. A version of this essay was first published in the fall 2000 issue of the Hedgehog Review.