The Humanities on Life Support

By Ellen Schrecker

The Last Professors: The Corporate University and the Fate of the Humanities. Frank Donoghue. New York: Fordham University Press, 2008.

Higher Education? How Colleges Are Wasting Our Money and Failing Our Kids—And What We Can Do About It. Andrew Hacker and Claudia Dreifus. New York: Times Books, 2010.

The Humanities and the Dream of America. Geoffrey Galt Harpham. Chicago: University of Chicago Press, 2011.

The Marketplace of Ideas: Reform and Resistance in the American University. Louis Menand. New York: Norton, 2010.

Not for Profit: Why Democracy Needs the Humanities. Martha C. Nussbaum. Princeton, NJ: Princeton University Press, 2010.

Higher education is in big trouble. Otherwise, why would so many energetic, intelligent, and concerned academics desert their traditional disciplinary pursuits to publish jeremiads about the pitiful state of the humanities and the institutions that purvey them? A recent review article in The Nation surveyed ten such volumes; this review deals with five, mainly different, ones; dozens more are available on Amazon, if not at your local bookstore, and, no doubt, still more are in the works. It has become something of a mass-production industry—especially, I should note, among people in the field of literary studies.

Invoked with routine nonchalance by so many academic Cassandras, the term crisis has lost its meaning. As Frank Donoghue points out in The Last Professors: The Corporate University and the Fate of the Humanities, what we are seeing is no crisis at all but “an ongoing set of problems” that, at least for the men and women who entered graduate school since the 1970s, has become business as usual. Yet the term resonates in the present situation. If the academy is not currently on life support, many worry that, unless something changes radically, the post–World War II model of mass higher education, with its emphasis on the liberal arts, is on the verge of extinction or, at best, marginalization.

No single explanation will suffice. Still, as our authors insist, much of the trouble comes from a serious disconnect between the professoriate and the rest of the American public. That disconnect has eaten away at the overall support for higher education in general and the humanities in particular. Not only do ordinary citizens—not to mention politicians, pundits, and boards of trustees—entertain unrealistic notions about what college and university teachers do, but the public has come to see these faculty members as the prime source of the system’s defect du jour, whatever that may be.

American higher education has carried a heavy burden of expectations ever since it replaced the frontier as the nation’s main vehicle of economic mobility and became the fraying social safety net for an imperiled middle class. Nonetheless, it can still, if barely, sustain the meritocratic myths that undergird our increasingly inegalitarian society, as evidenced by our own president’s ascent through the web of elite institutions from a fancy (and free) Honolulu prep school to Columbia University, Harvard Law School, and beyond. But, in most people’s lives, the dream of an excellent and affordable college education has come to seem increasingly out of reach. Ever-steeper tuition and the burden of student loans are undermining the once uniquely American promise of near-universal access to higher education. And so we search for scapegoats.

The main culprit is the neoliberal political climate that for the past forty years has glorified the market while nibbling away at the public sector by turning “tax” into a dirty word. Increasingly starved for revenue, state legislatures have steadily reduced their share of public college and university budgets, thus forcing these institutions to rely increasingly on their students’ tuition. At the same time, in keeping with its ideological inclinations, the federal government has transformed its support for higher education from institutions to individuals, encouraging the recipients of its grants and loans to enter the now-commercialized academic marketplace.

The result: an increasingly competitive system of higher education permeated by the corporate values and inequities that pervade the rest of American society. Students are now consumers, courted by colleges and universities that market and even “brand” themselves by engaging in what Andrew Hacker and Claudia Dreifus in Higher Education? How Colleges Are Wasting Our Money and Failing Our Kids—And What We Can Do About It call an “amenities arms race.” They build ever-fancier fitness centers and residence halls while offering ever more job-centered academic programs. They struggle to attract faculty stars as well as students with high SAT scores in order to improve their rankings in the all-important US News & World Report survey.

In the process, the stratification that has always characterized the nation’s colleges and universities worsens. The gap between the schools that Donoghue calls the “brand-name universities” and the “mass-provider universities” is growing. The haves still offer the kind of traditional liberal arts undergraduate education that most Americans identify with college, while the have-nots, mostly second- and third-tier public institutions, as well as the growing for-profit sector, purvey more pragmatic programs specifically tailored to the job market. The wealthier institutions—the top-tier colleges and universities that Hacker and Dreifus label “the Golden Dozen”—amass the largest endowments, pay the highest salaries, and boast the most selective admissions.

Unsurprisingly, that selectivity, though ostensibly meritocratic, favors the well-off and well-connected, the young men and women from private prep schools and affluent suburbs who can pay up to $40,000 for a specialized college admissions “boot camp” or else earn a varsity letter in what Hacker and Dreifus call the “‘white’ . . . sports like ice hockey, golf, and crew.” Significantly, because of the economic security that their degrees presumably confer, the graduates of the elite institutions can still afford the luxury of a humanities major instead of a vocational one, thus obtaining intellectual, as well as economic, advantages from their elite educations. All our authors, in one way or another, deplore this situation. “No system of education is doing a good job,” Martha Nussbaum warns us in Not for Profit: Why Democracy Needs the Humanities, “if its benefits reach only wealthy elites.”

Unfortunately, too many critics both on and off the campus view the privileged world of the elite universities and colleges as the norm. After all, as Louis Menand notes in The Marketplace of Ideas: Reform and Resistance in the American University, these schools “have had the resources to innovate and the visibility to set standards for the system as a whole.” These standards have had a deleterious effect. Not only do they allow the public to view the professoriate as underworked and overpaid, but they have also inspired the rest of the academy to emulate them by emphasizing research. “Prestige gained by productivity in research,” Donoghue explains, “is the currency in which universities trade and is a concept that allows higher education and the corporate world to make sense of each other.” And, as in the corporate world, the competition is intense. Administrators at second- and third-tier colleges and universities, eager to raise their institutions’ status, press their faculties to obtain grants and publish at the same rate as their peers from Harvard and Stanford—without, of course, the same resources. The publications accumulate, but few get widely read.

In the process, teaching, until recently the main activity on most American campuses, has fallen by the wayside. Or, to be more accurate, it has fallen increasingly onto the shoulders of the teaching assistants, adjuncts, and non-tenure-track instructors who now handle roughly 75 percent of the undergraduate teaching load. The readers of Academe need no reminder about the execrable working conditions and inadequate remuneration of the men and (mostly) women with contingent appointments. For Hacker and Dreifus, outrage is the only response: “It is immoral and unseemly to have a person teaching exactly the same class as an ensconced faculty member, but for one-sixth the pay.” (Emphasis in original.)

But it’s not the “ensconced” faculty member’s fault. As the casualization of the academy increases, part-timers and temporary instructors are replacing the tenured professors whom Hacker and Dreifus inexplicably blame for academia’s sorry state. Because this is such a recent phenomenon, it has received little attention. Instead, Frank Donoghue notes, there is a “curious disconnection between the debate over tenure and the current realities of academic labor. . . . Tenure, however useful it might be as a hook for conversations about higher education, is becoming a mirage.” Yet it remains too tempting a target for the academy’s critics to ignore. As a result, for much of the public, as for Hacker and Dreifus, tenure represents everything that is wrong with the academic profession. Its holders, they contend, have a uniquely privileged sinecure that offers a guaranteed job for life, while requiring little more than a few hours of teaching a week and some kind of esoteric research, if that.

Though their hostility to the traditional professoriate distinguishes Hacker and Dreifus from the other authors whose books are under review, they are not alone in recognizing that some academics have blood on their hands. Focused on their own drive for status within their disciplines, few senior professors raised an alarm about their institutions’ growing reliance on adjuncts and non-tenure-track instructors. Whether they had the power to avert that situation is an open question. Still, Hacker and Dreifus’s contention notwithstanding, the tenured faculty did not cause it either.

Actually, as Donoghue explains, nobody did. “The abundant use of part-time, adjunct labor is a fairly recent phenomenon” that “was never part of a grand plan.” In fact, as two high administrators from Michigan admit in a recent book, despite their own responsibility for “academic appointment policy,” the use of part-time and temporary faculty members was a decision they “never consciously decided to make.” As they write, “the expanding role of non-tenure-track instructors was taking place under our noses but without our being fully aware of it.” It just happened—at different rates at different institutions. Rationalized as providing increased “flexibility,” contingent appointments became for academic administrators a way not only to save money but also to make an end run around the faculty. Nothing unique to academe, the casualization of the workforce is, Donoghue points out, “a familiar drama everywhere on the American labor scene.”

Another, and perhaps more legitimate, charge against the faculty concerns the content of scholarship. Our authors focus at considerable length on this issue. They believe that professional academics have become so specialized they can no longer communicate with a general audience. The “link between professors and public has been severed,” Hacker and Dreifus contend, “largely due to the constraints imposed by academic disciplines, indeed the divorce of academic knowledge from everyday understanding.” This problem is especially acute for scholars in the humanities, who, according to Geoffrey Harpham in The Humanities and the Dream of America, are “conflicted and confused about their mission, [and] suffer from an inability to convey to those on the outside and even to some on the inside the specific value they offer to public culture.” While this emphasis on the communications problems of the humanities underestimates the systemic elements in the academy’s current unpopularity, not to mention the political agenda of the university’s conservative critics, the charge does deserve some consideration.

To a certain extent, these authors—Harpham, Menand, and Donoghue in particular—are questioning the professionalization of their own field of literary criticism. Originally a somewhat aristocratic pursuit devoted to good taste and moral elevation, the study of literature, Harpham explains, was nonetheless “founded on the egalitarian presumption that the value of literature was available in principle to all competent readers without recourse to specialist training.” With the expansion of higher education in the mid-twentieth century, however, came pressure to do research and, thus, make the discipline more scientific and theoretical, more worthy of its practitioners’ place in the ivory tower.

First came the “New Criticism,” which, according to Harpham (citing Menand), “reinforced the sense that the study of literature was a discipline like others, and since it sought to be rigorous, precise, and empirical in its methods . . . had the effect of accommodating literary criticism to the ethos of science.” Then, from 1970 to 1990, followed a preoccupation with theory that Menand describes as “the intellectual and institutional equivalent of a revolution” that “helped to make the rest of the academic world alive to issues surrounding objectivity and interpretation, and to the significance of racial and gender difference.” Harpham’s take is more negative; he views those twenty years as “the reign of antihumanistic high theory in the humanities.”

Replete with neologisms and deliberately opaque language, this scholarship not only baffled—and often irritated—the general public but also put off many potential academics. Harpham expresses the consensus among most of our authors that the theoretical turn “exacerbated an already problematic structural division within literature studies between the undergraduate and graduate programs, and ha[s], as an unintended consequence, weakened the discipline as a whole.” Students initially drawn to the English Department by their love of reading found themselves alienated by the narrowness of its members’ research, by what Hacker and Dreifus describe as the “knowledge professors create for other professors.” In the process, public approval of the humanities has dwindled. People cannot support what they cannot understand.

Of course, theory was hardly the only change the 1970s brought to literary studies. Our authors overlook its concurrent transformation in response to the social movements of the period—the flowering of feminist criticism, African American studies, gay and lesbian studies, and all the other areas in which the field opened itself up and became more relevant to academia’s new nonelite population. Not surprisingly, those changes brought opposition from traditionalists within the discipline as well as from political conservatives outside it who hoped to roll back the sixties. As scholars in my own field of American history can attest, English departments were not the only targets in this well-funded and highly politicized attack on modern scholarship. In the eyes of a conservative critic like Lynne Cheney, historians who do not extol the wisdom of the Founding Fathers can be labeled unpatriotic or worse.

For these reasons, the panaceas most of the books under review suggest seem slightly off the mark. Hacker and Dreifus believe that the best way for the professoriate to regain the public’s trust is to abandon most research. Menand, Donoghue, and Harpham reject such a drastic solution, but they do believe that something is amiss when graduate students in literary studies are trained to produce arcane scholarship they will never use in their teaching—if they are lucky enough to find a job. Accordingly, these authors call for a reorientation of the field, both to integrate its concerns with those of other disciplines and to restore the sense of pleasure and wonder that the humanities can evoke. For Menand, such a reorientation might attract different types of graduate students, more worldly perhaps and more able to bridge the gap between academe and the rest of American society. Good luck, but I remain skeptical.

Here we come to the main challenge confronting those of us worried about the fate of American higher education: how to justify—and thus win support for—the liberal arts. Platitudes abound. Hacker and Dreifus explicitly eschew language about “critical thinking” or “moral reasoning,” and yet they are as emphatic as any literary scholar about the need for today’s undergraduates to become broadly educated, not just narrowly trained. “College should be a cultural journey,” they insist, “an intellectual expedition, a voyage confronting new ideas and information, together expanding and deepening our understanding of ourselves and the world.” They—and our other authors as well—want all students to get a grounding in the liberal arts.

The humanities have never been easy to sell. Before higher education democratized in the mid-twentieth century, the humanities were treated as a cultural adornment, something that served as a marker of one’s class position. Once the United States embraced the project of mass higher education, however, the arts and sciences needed a stronger rationale than an ability to enliven a dinner party. The academy, it was assumed, had a civic obligation to cultivate, in Harpham’s words, “moral citizenship.” Thus, the midcentury academics who developed that period’s general education programs believed that a common core of great books and works of art would create what Menand calls the “social glue” needed to unify the newly disparate student body. Not only that, but the process of exposing eighteen- and nineteen-year-old Americans to that curriculum would somehow assist in what Harpham describes as “the formation of a nation fitted to the task of ruling the world.”

Now, nearly seventy years later, such ambition seems naive, if not dangerous, and possibly even, to quote Harpham, “a mere provincial prejudice.” Later iterations, that the humanities have some connection to the nation’s security or economic well-being, ring equally false. And yet we remain convinced that studying Austen, Plato, or the French Revolution is more valuable—to the individual and to society—than learning about “Quantity Food Production” or “Beverage Management,” to cite two classes Hacker and Dreifus discovered in the hotel, motel, and resort management program of a second-tier university in New Mexico.

Since the evidence for the practical benefits of such job training is mixed, we must shift the argument to another terrain if we are to convince students and their parents to eschew pure vocationalism. We must make a stronger case that, as the title of Nussbaum’s slender volume puts it, the university is “not for profit”; the campus is not a marketplace. Economic calculations cannot, must not, supersede other values when it comes to higher education. It is not necessary to take an either-or position here. Though college graduates must earn a living, they must also live in a human—and humane—society.

The humanities, Harpham asserts, offer the knowledge that such an undertaking requires: “an awakened understanding of oneself as a member of the human species, a heightened alertness to the possibilities of being human.” Along with that self-awareness, he notes, the humanities also create “the capacity to sympathize, empathize, or otherwise inhabit the experience of others.” Nussbaum agrees; the sympathetic “ability to imagine . . . the predicament of another person, along with the ability to think for oneself,” are the “skills that are needed to keep democracies alive.” Those capacities are, of course, even more essential in today’s globalized world. These are the strongest arguments around, but they may not be enough.

We may be asking the humanities to bear a heavier burden than they can carry. Especially at a time of financial distress, when so many schools are shedding their departments of philosophy, classics, and foreign languages, the future of the humanities looks grim indeed. And not just in the United States. Most poignantly for people in English departments, the British government has recently decided that all academic units must justify their research in economic terms. Worse yet, in 2009 the British Arts and Humanities Research Council, which administers this decree, was shifted into the government’s Department for Business, Innovation and Skills.

There are a few rays of hope, including some that might gladden the heart of a US chauvinist. For American higher education, despite its flaws, is one of the few systems that exposes its students to the humanities. Universities elsewhere usually offer vocational tracks from the start. But they are beginning to change. Educators in places like China and Singapore, where most higher education consists of narrowly focused technical training, are considering adding the humanities and other liberal arts disciplines to their curriculum in order to encourage their students to become more creative. It would be ironic, indeed, if just at the moment when other societies embrace a more humanities-oriented higher education, the United States abandons it.

Ellen Schrecker teaches history at Yeshiva University. A former editor of Academe, she has written extensively on academic issues and McCarthyism. Her most recent book is The Lost Soul of Higher Education: Corporatization, the Assault on Academic Freedom, and the End of the American University.
