Educational Technology and the Entrenchment of “Business as Usual”

Higher education’s problematic relationship with massive learning analytics vendors.
By Catherine McGowan, Britt Paris, and Rebecca Reynolds

As campuses closed at the onset of the COVID-19 pandemic in spring 2020, educators and students urgently adapted to remote learning. In the swift transition to online learning, colleges and universities turned to corporate learning-platform vendors but took little time to scrutinize their privacy policies and data-mining practices, further entrenching the problematic relationship between these companies and higher education (and K–12 education). A public health crisis might have driven higher education online in spring 2020, but we need not let this crisis define our future. The botched responses to COVID-19 and related ongoing crises have forced open public debates about how systemic inequity is perpetuated in organizations and institutions. In higher education, this phenomenon is nowhere more apparent than in the uncritical use of online learning platforms. In what follows, we analyze contracted online learning platforms in use at Rutgers University, where we teach, and provide recommendations to guide university decision-makers away from a single-minded focus on the bottom line and harmful perpetuation of the status quo, and toward the project of rebuilding the university to provide a more equitable, just, and effective educational environment.

How We Got Here

In the last twenty years, college and university administrations have come to rely heavily on a range of online web-services platforms to manage integral functions such as student enrollment, student records and transcripts, and tuition and payment status as well as teaching and assessment. Prior to 2010, institutions typically purchased corporate learning management systems (LMSs) as licensed software. Dedicated in-house information technology (IT) staff installed and managed these LMSs on university servers and received tech support as needed from vendors. However, colleges and universities now more commonly purchase licenses under a “software-as-a-service” (SaaS) model that allows corporate vendors to host all content within course modules off site. This model enables vendors to charge higher licensing fees while offering added services such as real-time system upgrades, outsourced troubleshooting and customer service support, and management of many other IT support tasks formerly done in house by university employees.

While such platforms might facilitate innovations in administration, teaching, and learning, their frequently uncritical deployment carries substantial risks. For example, the SaaS model raises significant intellectual property concerns about who owns the content and curricula developed by instructors as well as privacy concerns for students and instructors, including concerns about compliance with the Family Educational Rights and Privacy Act (FERPA).

Nearly four years after the COVID-19 crisis began, the ravenous desire for digital technologies to supplant the labor of educators continues, particularly as corporate entities and educators alike scramble in the wake of the hype surrounding generative artificial intelligence. In July 2023, the online learning platform Instructure and online course provider Khan Academy announced that they had collaborated to develop Khanmigo, a GPT-4-powered tutoring and teaching-assistant tool that can be embedded in Canvas, Instructure’s LMS, to take a “cognitive burden off teachers.” While Instructure’s claims that Khanmigo is capable of “actually teaching” and is a “teaching aid” remain unproven, the true aim of the tool is the further “dashboardization” of education. Learning analytics are touted as tools for making student learning visible, providing evidence of the process of learning, and making possible “personalized learning,” which Instructure claims is the “technology-powered pot of gold under the rainbow that education leaders and designers and teachers have been chasing for decades.” Meanwhile, Instructure includes a small disclaimer that Khanmigo and its GPT-4 integration, in the words of Khan Academy founder Sal Khan, “can still ‘hallucinate,’ which is the term the industry uses for making stuff up”—concerning for a tool that is marketed as a teaching assistant with the capability to reduce a teacher’s “cognitive burden.”

Administrators negotiate most vendor contracts without any involvement from faculty members, students, or parents and with little, if any, accountability to those core constituents. Concern over these dynamics parallels concerns in social- and human-service sectors, such as health care, behavioral health, and the gig economy, that rely on online platforms that are little more than data-harvesting technologies; these technologies undermine worker skills and labor power, skirt end-user protections, and are largely ineffective at achieving their stated goals but remain profitable. Similarly, the economic imperatives of learning-platform providers (and higher education institutions) often supplant the interests of end users—learners, their parents, instructors, and administrators. While commercialization of data, including user-generated content and intellectual property, occurs behind the scenes unregulated, senior administrators who license and purchase such systems are often complicit in profit-driven contracts that heavily advantage corporate vendors and force end users to agree to all terms in their entirety to use the service. For a student, not using the service would mean not enrolling in a given class, because use of these platforms is often compulsory. Indeed, at Rutgers, as at many colleges and universities, students and instructors are never asked to give consent for their data to be used in LMS platforms. The university does so for us, on our behalf, without making this evident or explicit to us.

In a report we prepared for the Rutgers AAUP-AFT, we presented an overview of such issues and an analysis of ten different LMSs with which Rutgers is contracted under license agreement, taking into consideration the following categories of analysis: efficacy, profits, privacy, security, and FERPA. Some of the problems we noted at Rutgers are common across higher education, including

  • improper use of dashboards and diagnostic data generated by the LMS to make poor assessments and engage in misguided “data-driven decision-making”;
  • opacity around vendors’ use of data, including curricula, student data and work, and user site metrics;
  • playing fast and loose with FERPA compliance—instructor and student consent are never obtained, and company compliance is never verified;
  • terms of service and privacy policies that release the learning platform company from all liability;
  • questionable archiving of and ownership terms for intellectual property, such as student data, student work, and instructor course materials;
  • permissions policies that in some cases may allow the reuse of instructor course materials by the company or other instructors;
  • removal of the instructor’s agency in determining whether to use university-licensed platforms in their own classes, which platforms to use, and how to use them;
  • inequitable terms in licensing contracts that allow companies to extract value from colleges and universities in exchange for largely unproven educational interventions; and
  • further centralization of the power of university administrators and corporate vendors at the expense of students and instructors, who have no say in these contracts.

Our analysis focused on teaching and instruction, assessment, and classroom-management platforms, concluding with a set of analyses and recommendations for the Rutgers AAUP-AFT.

Learning Analytics

“Learning analytics” means different things to different people in different contexts, which can cause miscommunication and confusion. We approached our analysis from the perspective of scholarly learning-analytics research, which contributes a substantial base of scientific methods and techniques for discourse analysis, social-network analysis, sentiment analysis, predictive modeling, and audiovisual and textual content analysis, all centered on enhancing learning. But most of these design-based research innovations remain necessarily small-scale, and they stand in stark contrast to the basic e-learning course shells used in generic mass online learning in higher education.

Those in the field of learning analytics posit that e-learning systems, at minimum, should improve upon traditional in-person teaching. Small-scale, evidence-based learning-analytics systems are most often designed to support learning in one given problem context in one specific subject domain for one population of learners. For example, a customized learning technology in a scientific field like physics might be built to assist undergraduates in solving a physics problem. The system might include simulations, gaming elements, or data-mining features to train algorithms to help students with automated prompts or pop-ups that appear at the student’s pace, thereby “scaffolding” the student’s cognition at appropriate moments in problem-solving. The system’s efficacy should be tested and improved. This type of student-centered, rigorous innovation is grounded in scientific learning analytics and offers significant potential for providing insight into vital educational practices. Learning-analytics approaches in P–12 and higher education are already radically changing teaching and learning, and there are positive, evidence-based gains to be made. However, improving learning is not the primary goal of some of the most widely used learning platforms. We must all become more discerning consumers of these online learning products, examining their scholarly grounding and constantly assessing them according to our student-centered educational goals.
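The pace-sensitive scaffolding described above can be sketched in a few lines. Everything here is hypothetical (the hints, thresholds, and function name are illustrative, not drawn from any actual platform), but it shows the basic rule: a prompt appears only after the student has stalled, and each successive hint requires a longer stall.

```python
from typing import Optional

# Hypothetical escalating hints for one step of an inclined-plane problem.
HINTS = [
    "Hint: start by drawing a free-body diagram.",
    "Hint: which forces act along the incline?",
    "Worked step: net force along the incline = m*g*sin(theta) - friction.",
]

def scaffold(seconds_stalled: float, hints_shown: int,
             stall_threshold: float = 60.0) -> Optional[str]:
    """Return the next hint if the student has stalled long enough,
    or None if no prompt should appear yet (or hints are exhausted)."""
    if hints_shown >= len(HINTS):
        return None
    # Each successive hint requires a longer stall, so prompts appear
    # at the student's pace rather than on a fixed timer.
    if seconds_stalled >= stall_threshold * (hints_shown + 1):
        return HINTS[hints_shown]
    return None
```

In a real system the stall signal would come from interaction logs, and the rule itself would be validated against learning outcomes in the target population, which is exactly the efficacy testing the paragraph above calls for.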

Research from George Siemens in 2012 showed how large-scale learning-analytics speculators motivated by profit are generally unmoored from the scholarly research base and are rapidly implementing subpar, unvalidated solutions—for example, by incorporating untested basic analytics practices, models, and algorithms from business intelligence and emergent “data science” fields. The rapid and uncritical application of untested, methodologically invalid analytics systems (such as the large corporate LMS vendor platforms) in higher education can magnify inequities and compromise the quality of education, adversely affecting students’ educational trajectories and career preparation.

Rutgers as a Case Study

Even as Rutgers rushed to move courses online during the COVID-19 pandemic, the predominantly corporate e-learning technologies in place were inadequate. Moreover, research from Rutgers communications scholars Amy Jordan and Vikki Katz found that more than “two-thirds [of students] . . . hit their smartphone data cap as courses went remote and just under half had a laptop broken for 10+ days” in the first few months after going online. This suggests that Rutgers’s assumptions about student access to the internet and technology (if indeed much thought went into them at all) ended up exacerbating existing inequity. “Roughly two-thirds of respondents,” Jordan and Katz reported, had “trouble keeping track of deadlines or clearly understanding what is expected of them with remote learning.” This finding points to an unprecedented crisis for which Rutgers should have had a contingency plan. The instructors were not at fault; the problem existed primarily because the large-scale, one-size-fits-all learning technologies offered by corporate vendors were not designed to be functional in the learning scenarios where Rutgers urgently needed (and still needs) them to be.

In this cultural moment that has forced open discussions about entrenched surveillance and control systems and structural inequities, it is important to examine how these structures embed themselves in higher education. Users of mass-market, large-scale e-learning-analytics systems are continuously tracked and surveilled in a way that is particularly problematic for students of color, student athletes, and certain other groups of students. These systems may also magnify and reward social isolation and atomization in ways that are desirable to university management but disadvantageous for student learning. As students become aware, however vaguely, that they are being tracked, that awareness may have a chilling effect on certain learning processes and advantageous learning outcomes. In this way, the university’s use of learning analytics may exert undue influence over students’ lives and participation in education.

Issues of surveillance are tied to problems of consent and the right to privacy. Institutional interests are often not aligned with student or instructor interests. Instead of working tirelessly to protect the rights of these constituents, the university places the burden on instructors, with little guidance or oversight, to address student privacy issues concerning LMSs in their courses.


FERPA—the federal law that affords parents the right to access their children’s educational records, the right to seek to have the records amended, and the right to have some control over the disclosure of personally identifiable information from the educational records—is at the heart of the privacy concerns raised by the use of educational technology. When a student turns eighteen years old or enters a postsecondary institution at any age, FERPA rights transfer from the parents to the eligible student. FERPA defines student educational records as including “a range of information about a student that is maintained in schools in any recorded way, such as handwriting, print, computer media, video or audio tape, film, microfilm, and microfiche.”

Under FERPA, a college or university can disclose “directory information”—name, address, telephone number, date and place of birth, and dates of attendance—to outside parties without written consent. In this scenario, parents and adult students must be made aware of the type of information about them that is available to outside parties. Two other classes of information, personally identifiable information and educational information, do require written consent before disclosure, even though they can seem nearly indistinguishable from “directory information.”

At Rutgers, when students enroll, they and their parents are given an optional form to fill out that allows them to maintain confidentiality and removes the university’s ability to disclose any directory information to third parties, effectively preventing the release of information to outsiders through online directories, commencement materials, transcript requests, and so forth. However, in that form and in accompanying materials, there is no mention of e-learning vendors as a type of third party. Thus, it does not appear that the student confidentiality form prevents third-party e-learning vendors from accessing student information.

Furthermore, while Rutgers provides students with the ability to opt out of allowing their personally identifiable information to be shared with third parties, it does not appear to offer students any way to avoid “opting in” to providing vast amounts of their data, their work products, and their intellectual property to LMS companies. Instructors have no say whatsoever in this matter. If students opt not to fill out the confidentiality form—and most students don’t—their information can be released.

University administrators likely regard e-learning vendors as outside the purview of the FERPA considerations the university is obligated to uphold, because of exception criteria—the “school official exception”—in the FERPA law. Under this exception, schools and districts may disclose personally identifiable information from a student’s educational record to an educational service provider such as an LMS as long as the provider

  1. performs an institutional service or function for which the school or district would otherwise use its own employees;
  2. has been determined to meet the criteria set forth in the school’s or district’s annual notification of FERPA rights for being a school official with a legitimate educational interest in the educational records;
  3. is under the direct control of the school or district with regard to the use and maintenance of educational records; and
  4. uses educational records only for authorized purposes and does not redisclose personally identifiable information from educational records to other parties (unless the provider has specific authorization from the school or district to do so and it is otherwise permitted by FERPA).

Student consent is not needed for institutions to disclose student information to e-learning vendors under this exception; hence, at Rutgers neither instructors nor students must check any “terms of service” boxes to use the e-learning platforms. However, the lack of clarity regarding whether the e-learning vendors are holding up their side of the exception-criteria bargain raises serious concerns. The fourth criterion listed above stipulates that e-learning vendors will use student records and data only for “authorized purposes,” which can be interpreted broadly.

In our report to the Rutgers AAUP-AFT, and in a subsequent article in the Journal of the Association for Information Science and Technology, we recommended that the union review university contracts with these vendors with a focus on the language pertaining to this fourth criterion. At minimum, it would be reasonable to expect the university to demand specificity in these contracts about constraints on the use of student and instructor data. However, in these scenarios, as Kyle M. L. Jones notes in a 2019 article in the International Journal of Educational Technology in Higher Education, “it may be that institutions are withholding information about [e-learning vendor] data practices to keep student privacy concerns at bay, concerns that could potentially derail beneficial contracts with vendors.”

For the most part, the university appears to have separated FERPA compliance from LMS platform terms and uses altogether. Instructors are never trained on FERPA, nor have we given consent for our data to be used, been asked to sign any documents having to do with our uses of the platforms, or been required to do more than a cursory review of platforms’ terms of service or privacy policies in order to get access. Instructors seem to be nonentities in these legal contractual dynamics.

Overall, while Rutgers and other universities are supposed to protect students’ personally identifiable information from disclosure to third parties under FERPA, senior administrators seem to regard privacy protection as an “all-or-nothing” endeavor, defining as “third parties” only those requesting “directory data”; administrators seem not to view vendors that store and host all of our sensitive data, and that may be harvesting data behind the scenes in ways they don’t transparently disclose, as “third parties.” By neglecting to take a more nuanced, transparent, and democratic approach to privacy disclosure and protection, administrations at Rutgers and elsewhere assert their power to maintain these contracts. One cannot participate in higher education without opting in, but we are never told exactly what we are opting in to. It is altogether unclear how platform providers’ data practices account for and uphold FERPA.

Industry guidelines for maintaining compliance with FERPA do exist, but they are guidelines only and appear to have been influenced by lobbying interests from the corporate e-learning sector. Organizations like Common Sense have created frameworks and rubrics for evaluating the terms of e-learning providers on privacy counts. These systems are helpful to the public but are not automated, require meticulous scrutiny, and must be frequently revisited as corporate platforms shift their terms of service and privacy policies to be more favorable for their bottom line.
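To make concrete what applying such a rubric involves, here is a minimal sketch. The criteria and weights below are invented for illustration and are not Common Sense’s actual rubric; an analyst still reads a vendor’s terms by hand and records a finding for each criterion, and the code only tallies a weighted score, which must be recomputed whenever the vendor revises its policies.

```python
# Illustrative privacy rubric. The True/False findings must be filled in
# by a human analyst who has read the vendor's terms of service and
# privacy policy; nothing here automates that scrutiny.
CRITERIA = {
    "discloses all third-party data sharing": 3,
    "lets students opt out of data collection": 3,
    "deletes data after contract termination": 2,
    "claims no IP rights over user content": 2,
    "accepts liability for privacy breaches": 2,
}

def score_vendor(findings: dict) -> float:
    """Weighted share of rubric criteria the vendor satisfies (0.0 to 1.0)."""
    total = sum(CRITERIA.values())
    met = sum(weight for name, weight in CRITERIA.items()
              if findings.get(name, False))
    return met / total
```

Even this toy version makes the maintenance burden visible: a single changed clause in a privacy policy can flip a finding and the resulting score, which is why such evaluations must be revisited regularly.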


Our analysis of Rutgers’s vendor relationships has yielded five main themes:

  • Policies on data sharing and data management are opaque.
  • FERPA compliance is inconsistent and sometimes ignored, with institutions heavily relying on the school-official-exception loophole and failing to request or collect meaningful consent.
  • Vendors relieve themselves of liability for privacy breaches through their terms of service and privacy policy agreements.
  • Educator and student rights to intellectual property are highly questionable, and the terms of service frequently claim that reuse of data and content on the platform falls under fair use.
  • Surveillance is insidious and constant, ranging from collection of biometric data through facial recognition and scans of students’ knuckles to tracking of student activity through browser and keystroke monitoring.

To orient Rutgers and other universities toward more thoughtful, evidence-based, equitable, and sustainable use of learning analytics, we need robust intellectual property protections for students and instructors. Institutions should be more transparent in all their dealings with vendors, especially when it comes to student data. Instructors should receive guidance and support from the university in developing and embedding student privacy policies in their syllabi, which should link to existing institutional policies and other relevant privacy resources (such as federal laws like FERPA). Research suggests that we should strive toward solutions that will facilitate student and instructor agency, such as establishing participatory governance systems involving user groups—staff, students, and instructors. The resultant independent data privacy and protection boards can take an adversarial position when scrutinizing corporate contracts; adopt opt-in, informed consent for all learning-analytics platforms; treat FERPA as a “floor, not a ceiling” when it comes to protecting privacy; and take a proactive and courageous stance toward ending state and corporate surveillance of students. First, however, all parties involved in accepting and ratifying the terms of contracts must be made aware of the problems discussed in the scholarly literature on e-learning privacy.

Catherine McGowan is a PhD candidate who teaches and critically researches information and data governance in Rutgers University’s School of Communication and Information. McGowan is a member of the Rutgers AAUP-AFT. Britt Paris is assistant professor of library and information science at Rutgers and an elected member of the Rutgers AAUP-AFT Executive Council. Rebecca Reynolds is an associate professor of library and information science at Rutgers whose research focuses on learning sciences and technologies. All of the authors are Rutgers AAUP-AFT members.