Media, Think Tanks, and Educational Research

Advocacy-oriented think-tank studies get a disproportionate amount of media attention.
By Holly Yettick

The Bunkum Awards are a sort of beauty contest for ugly people. Bestowed by the National Education Policy Center housed at the University of Colorado at Boulder, they reward the most “nonsensical, confusing, and disingenuous” studies of education published each year. Categories include “Damned Lies for Statistical Subterfuge,” “Inferential Longjump,” and “Truthiness in Education.” Contestants are drawn from reports critiqued by the Think Tank Review Project, a five-year-old initiative that provides peer reviews of think-tank reports about education. Though any think tank is eligible, the Bunkum Awards are annually dominated by one subsector: advocacy-oriented think tanks that often value ideology over methodology.

They might be tongue-in-cheek, but these awards are also symbolic of a serious concern among many academics who study K–12 and higher education. Education professors worry that think-tank reports are more widely distributed than peer-reviewed research through the mainstream media and, as a result, have an outsized influence on policy and practice.

Yet surprisingly little research exists on the nexus between the news media and schools and universities, despite the fact that education is among our nation’s most significant public interests and expenses. So, in 2009, the National Education Policy Center commissioned me to look into who was actually conducting the educational research mentioned in the news media.

Educational Research in the Media

For those who feared the Bunkumization of news media accounts of educational research, the study’s results contained both good news and bad. I examined three media outlets (Education Week, the New York Times, and the Washington Post), chosen because of their influence among academics, philanthropists, journalists, and others knowledgeable about education policy. While think tanks by no means dominated the educational research mentioned in those outlets, any given advocacy-oriented think-tank study had a substantially higher probability of being mentioned than any given academic study.

It is, of course, possible that think-tank studies were more likely to be mentioned because they were more likely than academic research to focus on issues of immediate public concern. My dissertation further explores this issue by examining why journalists would choose to cite one type of educational research over another.

When it came to counting raw numbers of citations rather than the probability of citing a particular study, academic research, not think-tank research, was most frequently mentioned in one of the three outlets studied: the trade publication Education Week. Of the 946 educational research citations in 399 Education Week articles published in the first six months of 2008, 28 percent were to academic research, with government research second most common (22 percent). In the two other outlets studied, the New York Times and the Washington Post, government research was most frequently mentioned: of the 945 research citations in the 465 articles that cited educational research in 2007, 29 percent were to government studies, with academic research second most common (22 percent). Given that several mass-media researchers have found that traditional journalism depends upon and revolves around governmental sources and operations, the prominence of government studies was unsurprising.

Think-tank citations were third most common both in Education Week (15 percent) and in the two daily newspapers (11 percent). Other research affiliations mentioned included university-based policy centers such as the National Education Policy Center itself, associations such as the National School Boards Association, and original research sponsored by media outlets (for example, college rankings by U.S. News & World Report). So, for those who believe that academic and government studies are generally more rigorous than advocacy research, the news was good.

Think-tank educational research did not dominate the publications studied. In fact, academics were getting more ink. Yet, for those troubled by Bunkumization, matters for concern remained, especially when think tanks were disaggregated by their orientation toward advocacy. Nonadvocacy think tanks, such as the RAND Corporation and the American Institutes for Research, are often engaged in evaluation or contract-based work outsourced by other organizations and tend to produce work that is higher quality and much more useful for policy making. The National Education Policy Center’s reviews of reports from nonadvocacy think tanks have identified relatively few methodological concerns. Yet my study found that such think tanks produced just 30 percent of the 139 think-tank research studies cited in Education Week and 17 percent of the 102 think-tank studies cited in the New York Times and the Washington Post. The remaining think-tank citations were to studies produced by advocacy-oriented organizations such as the Manhattan Institute and the Progressive Policy Institute.

In addition to comparing raw numbers of think-tank and academic citations, I compared the probability that the media would mention university research with the probability that they would mention advocacy-oriented think-tank research by calculating the total amount of research produced by each type of organization and the total number of media citations of each type of organization. For think tanks, I counted the number of publications that appeared on each of 104 organizations’ websites in 2007. For academic studies, I both counted publications in all education-related, English-language peer-reviewed journals in 2007 and compiled presentations at the annual conference of the American Educational Research Association.

I found that any given advocacy-oriented think-tank study was more likely to be mentioned in the news sources I examined than any given academic study. In 2007, academics produced fourteen to sixteen times more studies than did advocacy-oriented think tanks. Yet academic research was only twice as likely as advocacy-oriented think-tank research to be mentioned in the three outlets studied.
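The arithmetic behind this per-study comparison can be sketched as follows. The counts below are hypothetical stand-ins chosen only to preserve the ratios reported above (roughly fifteen times more academic studies, but only twice as many total citations); they are not the study’s actual figures.

```python
# Hypothetical counts preserving the reported ratios (NOT the study's data):
academic_studies = 15_000   # ~15x the advocacy think-tank output
advocacy_studies = 1_000

academic_citations = 200    # ~2x the advocacy citation total
advocacy_citations = 100

# Per-study citation probability for each producer type.
p_academic = academic_citations / academic_studies   # 0.0133...
p_advocacy = advocacy_citations / advocacy_studies   # 0.1

# How much more likely any given advocacy study is to be mentioned.
ratio = p_advocacy / p_academic
print(f"An advocacy study is {ratio:.1f}x more likely to be cited.")
# -> An advocacy study is 7.5x more likely to be cited.
```

With these ratios, a single advocacy-oriented report is roughly seven to eight times more likely to be mentioned than a single academic study, even though academic citations outnumber advocacy citations in raw totals.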

In addition to examining advocacy-oriented think tanks as a group, I broke down the data based upon each organization’s political stance. Here, the research got murkier: it is not always easy to classify think tanks. While an organization such as the Heritage Foundation is clearly conservative and one such as the Economic Policy Institute definitely leans left, other groups are harder to pin down. Particularly difficult to classify are neoliberal organizations such as Education Sector that support many of the same free-market-style educational reforms as libertarians and conservatives yet strongly associate themselves with Democratic Party goals of increasing educational equity. Further, because of the small numbers of think tanks in particular political categories (for example, only four think tanks were classified as “center-left”), a single think tank could have an outsized influence on its category. For this reason, I did not draw any firm conclusions about whether the outlets studied were more likely to represent educational research from left- or right-leaning advocacy-oriented think tanks.

However, my findings did suggest one conclusion: all three outlets were more likely to mention reports by think tanks classified as either left or right leaning than studies by advocacy organizations without a discernible orientation on the traditional political spectrum. One centrist advocacy-oriented organization included in my study, for example, was the Education Trust, which advocates for closing the achievement gap between more and less advantaged students. Such nonclassifiable or centrist organizations produced nearly half of the total advocacy-oriented research, yet their studies accounted for just under a third of the mentions of advocacy-oriented think-tank research in the three media outlets I studied.

The Public Profile of Academic Research

Like any other single study, mine is only one piece of the puzzle. Knowing who produces the educational research mentioned in the news media tells us only so much. Studies published in academic forums, such as journals, generally undergo quality control in the form of double-blind peer review. The Think Tank Review Project has repeatedly highlighted troubling flaws in reports by advocacy-oriented think tanks. However, such flaws do not mean that any given example of think-tank research is necessarily shoddy or that academic research is automatically of a high standard. A common response by think tanks to the negative reviews from the National Education Policy Center is to argue that the center’s own research is suspect because the center receives some funding from the Great Lakes Center for Education Research and Practice, which is itself supported in part by the National Education Association, the nation’s largest teachers’ union (other funders include the Ford Foundation).

We do not know why journalists at the outlets studied seem to be more likely to cite any given advocacy-oriented think-tank study over any given academic study. Perhaps it is because advocacy-oriented organizations have the greatest incentive to make and maintain contacts with journalists, since a high media profile may enhance their ability to influence policy and attract funding. Certainly, the tenure structure of academia rewards those who publish in peer-reviewed journals without necessarily encouraging the dissemination of research to the general public. Further, anecdotal evidence suggests that, unlike their science-writer peers, education reporters rarely, if ever, consult peer-reviewed journals.
 
For example, my dissertation research examines educational research in print and online-only media outlets. Though I have so far sorted through nearly forty thousand articles in hundreds of publications, I have yet to come across a single mention of any of the six peer-reviewed education journals published by the American Educational Research Association, the world’s largest academic organization devoted to the study of education.

Additionally, some academic studies focus on technical subjects of little interest to the general public or even the school officials who read Education Week. Think tanks, by contrast, aim to influence policy by focusing on matters of public concern, such as tuition hikes or teacher evaluation. However, the topics of think-tank reports often become matters of public concern because think tanks are savvy about getting their pet subjects into print. As part of my dissertation research, I am interviewing journalists about their articles to understand better how and why some educational studies get mentioned and others ignored. Unfortunately, little current research explores how journalists perceive educational research in particular or social science research in general.

Previous research does, however, suggest that there may be ways to raise the public profile of peer-reviewed research. Here are some recommendations for professors who are interested in increasing mentions of peer-reviewed research in the news media:

  1. Write an op-ed or a letter to the editor. In his 2008 book on the use of charter school research in public debates, Columbia University professor Jeffrey Henig found that newspaper reporters repeatedly quoted the same small group of “experts,” 46 percent of whom were affiliated with universities and 19 percent with think tanks. Of the twelve most frequently quoted experts, nine had written an op-ed or a letter to the editor before they were first quoted in a news article. By reaching out to the media, they both publicized their research and signaled that they were willing to be interviewed.
  2. Serve as an educator-expert. Journalists generally lack an educational background in science, social science, or scientific methods. According to the most recent comprehensive survey of US journalists, just 11 percent of respondents majored in the social sciences, with more than half of those majoring in political science or government. An additional 3 percent majored in the physical or biological sciences. Not surprisingly, the most common majors were in the communications field. Offer to comment on the methodological soundness of non-peer-reviewed research reports in your field. Often, such reports are embargoed and sent to journalists in advance, which gives you a small window of time to review the study. If you feel qualified to do so, you can also offer to serve as an “off-the-record” resource or even an occasional personal statistics tutor willing to answer questions about research or statistical reports.
  3. Don’t be afraid of the news media. A 2006 survey of epidemiologists and stem-cell researchers found contact between journalists and researchers was more common than previously believed: 70 percent of respondents had engaged in at least one media contact in the past three years, with 30 percent engaging in five or more contacts. A plurality (46 percent) indicated that the contacts enhanced professional advancement, with American and German scientists rating their interactions more highly than their counterparts in the other countries studied (the UK, Japan, and France). The top reasons for initiating contact? Achieving a more positive public attitude toward research and raising the educational level of the general public. In other words, professors are already working toward de-Bunkuming the idea that there is a disconnect between the media and peer-reviewed research.

Holly Yettick is a former education reporter and a doctoral candidate at the University of Colorado at Boulder. This article is based on her 2009 report, The Research That Reaches the Public: Who Produced the Educational Research Mentioned in the News Media? Her e-mail address is [email protected].