Angwin: You have written that ed tech companies are engaged in a “structural hijacking of education.” What do you mean by this?
Marachi: There has been a slow and steady capture of our educational systems by ed tech firms over the past two decades. The companies have attempted to replace many different practices that we have in education. So, initially, it might have been with curriculum, say a reading or math program, but has grown over the years into wider attempts to extract social, emotional, behavioral, health, and assessment data from students.
What I find troubling is that there hasn’t been more scrutiny of many of the ed tech companies and their data practices. What we have right now can be called “pinky promise” privacy policies that are not going to protect us. We’re getting into dangerous areas where many of the tech firms are being afforded increased access to the merging of different kinds of data and are actively engaged in the use of “predictive analytics” to try to gauge children’s futures.
Angwin: Can you talk more about the harmful consequences this type of data exploitation could have?
Marachi: As an example in education, several data platforms market their products as providing “early warning systems” to support students in need, yet these same systems can also set students up for hyper-surveillance and racial profiling.
One of the catalysts of my inquiry into data harms happened a few years ago when I was using my university’s learning management system. When reviewing my roster, I hovered the cursor over the name of one of my doctoral students and saw that the platform had marked her with one out of three stars, in effect labeling her as being in the “lowest third” of the class in engagement. This was both puzzling and disturbing because it was such a false depiction—she was consistently highly engaged and active, both in class and in correspondence. But the platform’s metric of page views as engagement made her appear otherwise.
Many tech platforms don’t allow instructors or students to delete such labels or to untether at all from algorithms set to compare students with these rank-based metrics. We need to consider what consequences will result when digital labels follow students throughout their educational paths, what longitudinal data capture will mean for the next generation, and how best to systemically prevent emerging, invisible data harms.
One of the key principles of data privacy is the “right to be forgotten”—the right to have one’s data deleted. Among the most troubling of the emerging technologies I’ve seen in education are blockchain digital ID systems that never allow data on an individual’s digital ledger to be deleted.
Angwin: There is a law that is supposed to protect student privacy, the Family Educational Rights and Privacy Act (FERPA). Is it providing any protection?
Marachi: FERPA is intended to protect student data, but unfortunately it’s toothless. While schools that refuse to address FERPA violations may have federal funding withheld by the Department of Education, in practice this has never happened.
The other problem is that with tech platforms as the current backbone of the education system, in order for students to participate in formal education, they are in effect required to relinquish many aspects of their privacy rights. The current situation appears designed to allow ed tech programs to be in “technical compliance” with FERPA by effectively bypassing its intended protections and allowing vast access to student data.
Angwin: What do you think should be done to mitigate existing risks?
Marachi: There needs to be greater awareness that these data vulnerabilities exist, and we should work collectively to prevent data harms. What might this look like? Algorithmic audits and stronger legislative protections. Beyond these strategies, we also need greater scrutiny of the programs that come knocking on education’s door. One of the challenges is that many of these companies have excellent marketing teams that pitch their products with promises to close achievement gaps, support students’ mental health, improve school climate, strengthen social and emotional learning, support workforce readiness, and more. They’ll use the language of equity, access, and student success, issues that as educational leaders, we care about.
Many of these pitches in the end turn out to be what I call equity doublespeak, or the Theranos-ing of education, meaning there’s a lot of hype without the corresponding delivery on promises. The Hechinger Report has documented numerous examples of high-profile ed tech programs making dubious claims about the efficacy of their products in the K-12 system. We need to engage in ongoing, independent audits of the efficacy, data privacy, and analytic practices of these programs to better serve the students in our care.
Angwin: You’ve argued that, at the very least, companies implementing new technologies should follow IRB guidelines for working with human subjects. Could you expand on that?
Marachi: Yes, Institutional Review Boards (IRBs) review research to ensure ethical protections for human subjects. Academic researchers are required to provide participants with full informed consent about the risks and benefits of the research they’d be involved in and to offer the opportunity to opt out at any time without negative consequences. Corporate researchers, it appears, are given free rein to conduct behavioral research without any formal disclosure to students or guardians of the potential risks or harms of their interventions, what data they may be collecting, or how they would be using students’ data. We know of numerous risks and harms documented with the use of online remote proctoring systems, virtual reality, facial recognition, and other emerging technologies, but rarely if ever do we see disclosure of these risks in the implementation of these systems.
If corporate researchers in ed tech firms were contractually required by partnering public institutions to adhere to basic ethical protections for the human participants involved in their research, it would be a step in the right direction toward data justice.