An academic plan is a business plan disguised in the regalia we don for significant public ceremonies—black cap and gown, colorful hood, and, of course, gold tassel. In 2009 and 2010, responding to the Great Recession, Mike Hogan, then president of the University of Connecticut, often announced: “An academic plan is useful during good times, but it’s especially useful now, because it helps [us] to determine where to invest.”
“The University of Connecticut is a bargain.” The chief financial officer, the president, and the chair of the board of trustees make a version of this statement every time they raise tuition. This year, in-state undergraduate expenses (including tuition of $8,250) are a mere $21,720 and out-of-state costs (including tuition of $25,000) are $38,600. That’s lower than any other public research university in the region and less expensive than most of the private universities “by leaps and bounds,” the chief financial officer is proud to announce.
It may be a bargain, but tuition has been going up—5.87 percent for academic year 2009–10, 5.36 percent for 2010–11, and just 2.44 percent for 2011–12. (This year, the governor and legislature made their views about tuition clear: it is not to rise faster than inflation.) Nonetheless, two years ago—for the first time ever—tuition funds provided more of the general budget than the state did. The state provides 29 percent of this year’s $1.04 billion operating budget, while tuition provides 33.9 percent. (In fiscal year 2009, the state provided 36 percent of the general fund and tuition contributed 30 percent.) Instead of increasing funding, the state has been calling back some of the university’s reserves and depositing them in the state’s general fund—more than $36 million in the last two years.
As of this writing, with the fall 2011 semester to start in a mere two weeks, UConn officials still do not know the projected size of the budget for the current fiscal year. That depends on whether the unions approve a deal that would balance the state’s budget. All that anyone will say is that there will be a deficit, probably somewhere between $10 million and $40 million—although the provost did announce that the size of classes and the number of contingent faculty will probably increase. The outside consulting firm will continue to look for cuts in the “nonacademic” sectors of the university. The state may sweep up more reserves. Departments may experience yet another round of rescissions. The hope is that salaries will at least remain the same. Officially, wages have been frozen for the past two years, but actually, they have decreased, since last year the state imposed seven furlough days and increased a variety of benefit costs, such as medical copays and the health-insurance contributions required of employees with less than ten years of state service. As the university’s new president announced in June 2011, “We are in better shape than most public research universities.”
The Academic Plan
Several years ago, the University of Connecticut started to plan for the economic disaster that was at the time so obviously in the future of higher education institutions. A formal “sunset policy” for departments was part of the academic plan enacted by UConn’s board of trustees in 2008. The plan—which circulated in drafts in 2007, before the official start of the recession—recognized “that maintaining . . . excellence in today’s highly competitive and fiscally challenging environment requires focused investment in units, programs, and activities that address key needs of the state and the students we serve.” These included a “sustained high level of demand from students,” addressing the state’s “workforce needs,” “an upward trajectory,” and articulation with the plan’s priorities.
A close look at the university’s preparations to shutter academic programs reveals the continuities between past and present. It also reveals how much “corporate planning” has penetrated academe. Once upon a time, not so long ago, presidents, provosts, and deans at least maintained that the professoriate participated in decisions to eliminate academic programs. Today, at many institutions, administrators set up so many constraints on the committee members they appoint that these committees essentially ratify decisions the administrators have already made.
Indeed, rather than encourage faculty participation in governance, administrators increasingly undercut faculty authority by implementing change from the top down. A former dean of the graduate school captured the “top-down spirit” when she confided to me, “I’ve made a discovery. I don’t have to argue with the Graduate [Faculty] Council about a course of action. I can control what they decide by choosing the information I tell them.”
The bureaucratic version of this practice is more formal, seems more objective, and embodies administrative control. Specifically, the procedures to measure each department’s contribution to the institution’s academic plan are a splendid example of controlling the committee’s decisions about outcomes by limiting the permissible inputs. A formalization of past practices that had recognized the faculty’s professional status, these bureaucratic rules also explicitly shift power to administrators. Additionally, they exemplify a politics of surveillance, control, and market management that disguises itself as the scientific administration of individuals and organizations.
Evaluating Departments Then and Now
Less than ten years ago, UConn’s deans and central administrators guided academic affairs (and assessed departments) more subtly than they do now. In the name of “improving” the quality of departments, deans and provosts worked with departmental administrators to conduct external reviews. Based on a professional model rather than a business plan, these reviews emphasized the participation of both a department’s members and respected colleagues at other institutions. To some extent, the process was a pretense that enabled administrations to do what they wanted, for everyone involved had learned to game the system. All parties hoped they could manipulate the rules to produce the result they desired. For example, everyone involved in the process knew that the choice of the peer departments with which a department was compared could shape some of the inferences that an external review committee might draw; moreover, a dean could shape conclusions by selecting some reviewers but not others from lists supplied by the department head. Sometimes professors even tried to game one another: they battled their departmental colleagues to decide the names on the list presented to the dean.
All this effort rarely resulted in much change unless a dean or provost wanted to alter a department significantly. Sometimes, the external review provided a rationale for action by a dean, but other criteria were also relevant. Thus, in 2004, the dean of UConn’s College of Liberal Arts and Sciences informed the board of trustees that he did not think it wise to continue to invest in a particular department because of tight budgets, declining student interest, and a poor external review. These same factors are among the criteria that college and university administrators are currently invoking to close departments. Before the Great Recession, administrators could state that a department had participated in the decision to eliminate itself, since it had provided a self-study to the external review team and had recommended potential team members to the dean. Now professors participate in panels that evaluate departments, but the pretense of shared governance has mostly disappeared. Today’s review process often is the application of measures in the service of a business plan, not an assessment of quality conducted by professional peers.
In February 2009, the provost charged a committee composed of respected senior professors known for their constructive contributions to university governance to assess the quality of each of the university’s sixty-five graduate and professional programs. “Quality” referred to “productivity”; that is, how well these programs were meeting the goals enumerated in the academic plan. Despite its pretentious name, the Committee for Excellence in Graduate and Professional Programs was a retrenchment committee dedicated to working within the limits the provost had established.
The table below presents some of the metrics used to express the productivity goals of UConn’s academic plan. Aimed at providing what the university calls “continuous improvement,” these metrics are essentially an effort to speed up work, with different goals pertinent to different departments. (Raising undergraduate credit hours taught per faculty member from a baseline of 422 to 470, for example, asks each professor to teach roughly 11 percent more.) They are also an invitation to internecine warfare. Departments, colleges, and schools squabbled over which units contributed the most to the university and were therefore deserving of better funding. In these treacherous financial times, many departments refused to recognize their mutual dependence.
Metric | Baseline | Five-Year Goal
Undergraduate credit hours per faculty member | 422 | 470
Doctoral degrees awarded per one hundred faculty members | 19 | 23
Graduate and professional credit hours per faculty member | 80 | 90
Postdoctoral appointees per one hundred faculty members | 14 | 18
Number of graduate and professional programs ranked in top twenty-five among public institutions | 9 | 14
Number of entering students holding national fellowships or scholarships | 6 | 15
Number of federally funded training programs | 2 | 6
Median time to degree: MA | 3.0 years | 2.0 years
Median time to degree: PhD (assumes no MA) | 6.0 years | 5.5 years
External research expenditures per faculty member | $90,000 | $100,000
Extramural research awards | $186,000,000 | $220,000,000
Number of faculty members serving as fellows in national and international learned societies and academies | 139 | 150
Number of faculty articles in refereed journals | 2,154 | 2,400
Number of faculty books published | 183 | 200
Number of exhibits curated or juried by faculty members | 26 | 35
Number of artistic and creative products by faculty members | 770 | 850
Number of annual patent applications by faculty members | 23 | 30
Number of annual commercial development agreements (options, licenses) | 15 | 19
The composition of the committee supposedly limited interdepartmental and intercollegial squabbling. With the approval of the deans (who had also suggested professors who might serve on the committee), the provost appointed a prenegotiated number of members from specified schools and colleges and, within those parameters, from different sorts of academic inquiry. The committee also passed internal rules designed to bolster objectivity. For example, committee members could not participate in the evaluation of their own departments.
The provost gave the committee proprietary datasets to use in its evaluations without telling the committee why the administration had selected these specific “products” instead of others. Each dataset was known to be flawed, and roughly 10 percent of the committee’s report was devoted to listing their limitations. The National Research Council had not yet released its own highly controversial assessment of the nation’s graduate departments.
Taken together, these datasets were geared toward measuring the speedup. For each program, they provided information on the goals listed in the table above, such as the total number of student credit hours taught per full-time-equivalent tenured and tenure-track faculty member, as well as totals of publications, citations, grant dollars, and national and international awards. They also addressed doctoral student metrics contained in the academic plan, such as the number of applicants relative to program size, the GRE scores of incoming students, and time to degree completion. The committee was not to question those goals, which had been formulated by the president and provost without faculty input. When several of the committee members who worked in the College of Liberal Arts and Sciences summarized the report at a college meeting, colleagues asked them about the quality of the datasets. Several committee members seemed defensive about their use of poor data. One suggested that data are never perfect. Another mumbled about triangulation. A third offered the definitive statement on why the committee based its work on the faculty data: “We were charged to do so.”
After the retrenchment committee endured almost a year of distressing work, a dean observed that its report told people what everyone already knew—which departments were better and which worse. Using the tools provided by the provost, the committee had replicated common knowledge. In doing so, it had provided bureaucratic justification for the provost and president to eliminate programs when the budgetary crisis associated with the Great Recession next demanded severe cuts.
One dean’s response to the report of the retrenchment committee illustrated the power of administrators. He felt that some of the supposedly “weak” programs in his college were vital to academic inquiry; but unlike the retrenchment committee, he had the power to affect future results. He instructed those programs to restructure themselves by combining resources and so maximize their pooled achievements. With much at stake, the faculty in those programs put in a great deal of work. The revisions had to be formulated in terms of specific administrative requirements and entered on a series of forms, as the requests for structural reorganization worked their way up a complex bureaucracy—through departmental committees, college committees, the board of trustees’ academic affairs committee, the board itself, and ultimately the state board of higher education. Faculty of the revamped departments happily reported that the anticipated outcome was worth the effort. The next time their achievements were measured against the speedup goals of the academic plan, an identical publication record would yield a better score on the datasets that the retrenchment committee had been told to use.
The dean’s decision to rescue these small but important programs was admirable; he found wiggle room when too many administrators (and professors) simply do as they are told. But the retrenchment committee could never have tried to protect these programs. It did not even have the leeway to say that some of these fields are vital to academic inquiry.
Responsibility, Realism, and Accountability
Flexibility is central to shared governance. Without it, the faculty resembles a youngster preparing to meet the anticipated reprimand of a stern parent. (“Explain yourself, young man.”) However, unlike the response of the child, the faculty’s account involves not moral responsibility but rather bureaucratic accountability. It is about bookkeeping, where administrators decide the labels on the columns and rows and the faculty get to fill in the blanks.
Administrators increasingly demand data with which to judge individual professors; they require academic departments to invent new procedures intended to please a slew of auditors. The processes associated with student outcomes assessment are an apt example. Though they often feel besieged, professors seem to believe that they must comply with these bureaucratic demands. “If we don’t do what the administration has asked us to do, they will do it for us,” the mantra runs—an attitude that Mary Evans of the London School of Economics has labeled as acquiescence to “coercive realism.” These highly educated women and men forget that, as feminist scholar Audre Lorde put it, “The master’s tools will never dismantle the master’s house.” Like the members of the retrenchment committee, they accede to the accountability regime—the politics of surveillance, control, and market management disguised as scientific administration.
Exacerbated by the Great Recession, the accountability regime speeds up the transformation of the academic world.
An earlier version of this essay, “The Metric Meets the Great Recession: How Universities Are Reviewing Academic Programs,” was presented at the Instituto de la Comunicación e Imagen of the University of Chile in Santiago.
Gaye Tuchman is professor of sociology at the University of Connecticut. Her publications include Wannabe U: Inside the Corporate University and Making News: A Study in the Construction of Reality.