Allison Butler and Nolan Higdon discuss the “ed-tech” systems eroding minors’ privacy rights and discriminating against students of color.
Any technology created by the U.S. military industrial complex and adopted by the general public was always bound to come with a caveat.
To most, the internet, GPS, touch screens and other ubiquitous technologies are ordinary tools of the modern world. Yet in reality, these technologies serve “dual uses”: while they offer convenience to ordinary people, they also enable the mass coercion, surveillance and control of those very same people at the hands of the corporate and military state.
Nolan Higdon and Allison Butler, authors of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools, join host Chris Hedges on this episode of The Chris Hedges Report. They explore the software and technology systems employed in K-12 schools and higher education institutions that surveil students, erode minors’ privacy rights and, in the process, discriminate against students of color.
The use of this technology, Higdon explains, is predicated on treating humans as products through surveillance capitalism.
“You extract data and information about humans from all these smart technologies, and then you’re able to make determinations about their behavior, how they might react to something. And there’s a lot of industries that are interested in this,” Higdon tells Hedges.
Butler explains that students, often with no choice in the matter, are subjected to the use of this technology that inherently exploits their data. Because there is an implied consent for it to be used, “The very limited amount of protections that there are to keep minors’ data secure is gone once you have a technology that is put into their classroom,” Butler says. “There’s a passive acceptance of this technology.”
Higdon points to changes made by the Obama administration in 2012 to the Family Educational Rights and Privacy Act (FERPA) as a key factor. These changes allowed for student data to be shared with private companies that serve as educational partners.
“Effectively, all of that data that the students’ rights movement worked to make sure was private was allowed to be distributed to these companies,” Higdon says.
The authors stress the deep impact these technologies have on the fundamental processes of learning in the classroom.
“It curtails curiosity, which is essential to the education process,” Higdon says. “The mental trauma and difficulty of closing one of the few spaces where they’re able to explore, I think it just speaks to the problem with surveillance and the education process.”
Host: Chris Hedges
Producer: Max Jones
Intro: Diego Ramos
Crew: Diego Ramos, Sofia Menemenlis and Thomas Hedges
Transcript: Diego Ramos
Chris Hedges: Surveillance tools have become ubiquitous in schools and universities. Technologies promising greater safety and enhanced academic performance have allowed Gaggle, Securly, Bark and others to collect detailed data on students. These technologies, however, have not only failed to deliver on their promises, but have eviscerated student privacy.
This is especially true in poor communities, where there is little check on wholesale surveillance. This data is often turned against students, especially the poor and students of color, accelerating the school-to-prison pipeline. When students and teachers know they are being watched and monitored it stifles intellectual debate, any challenging of the dominant narrative and inquiry into abuses of power.
But more ominously, it allows corporations and government agencies to stigmatize and criminalize students. These digital platforms can target the young with propaganda, and use social engineering and trend analysis to shape behavior. Joining me to discuss the mass surveillance of students is Nolan Higdon, author, with Allison Butler, of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools.
So Allison, let’s begin with you, and as you do in the book, give us a portrait of how intrusive this surveillance is — which I didn’t know until I read your book — and how it works.
Allison Butler: Sure. Thank you so much for having us. I would say that the shortest answer to that question is it’s incredibly, wholly, fully intrusive. And to expand on that we live in a world right now of surveillance technologies. There’s pretty much no place you and I can go as individuals where we aren’t in some capacity being surveilled. Much of that is, to a certain degree, by our own choice. For example, if we get into our car and use our GPS, we are agreeing to those terms.
What our concern is, in particular with this text, is that we have an overwhelming amount of surveillance in our K through 12 and higher education schools. Specifically for K through 12 schools, that surveillance is happening for minors, for children under the age of 18, without their active consent. We have kind of been accustomed, we’ve been sort of groomed for surveillance technologies by some of those soft technologies, such as our GPS, such as QR codes that help us make looking at restaurant menus a little bit easier, such as some of the apps on our phone that make our lives feel a bit more convenient.
The cost of that is that when harder and more intrusive surveillance technologies come into our worlds, and for our concern in particular, come into our classrooms, we’ve kind of already had the foundation laid where we’re primed for that, and we might not necessarily question those technologies to the extent that they deserve.
Chris Hedges: Well, those technologies, as you point out in the book, are not marketed as surveillance systems. They’re marketed as enhancements to education, enhancements to security. And just give me a picture of some of those, like Bark and these other digital surveillance tools, give me a picture of what they are and how they work.
Nolan Higdon: Yeah. Thank you so much for having me, Chris. Allison and I are happy to be here. And I think the easiest way to understand it is, much like the rest of Silicon Valley, these education technology or ed tech companies tend to overpromise and underdeliver. So a lot of the justifications for adding them to classrooms are things people would typically agree with, right? These tools are going to make us more secure or safe. They’re going to improve learning. They’re going to prepare students for the 21st century market. Some of them even advertise themselves as being more inclusive, that is, more DEI [diversity, equity and inclusion] compliant, because they take out the human bias, or human element, is what they claim. But we notice in all these cases, they really mask more, I would argue, pernicious motives.
There’s a lot of money to be made by having these tools in the classroom and being able to collect data. So that’s one issue. The other issue is, in addition to masking the real motive, which is making profit, they don’t really deliver on a lot of these promises. We talk about in the book how, even though they say that these tools are built to promote safety, they often fall short in that. There’s a dearth of evidence to say that they actually improve learning, and then there’s a lot of good evidence that they actually work against the goals of DEI. So in a lot of these cases, it seems like the reasons given for having these ed tech tools in schools are very different from what they actually do or the real purpose of adding them to a school.
Chris Hedges: Allison, can you explain specifically, like, for instance, Bark or pick one or two of these programs and explain to us what they, of course they’re collecting data, but what do they do within the school setting? What is their function?
Allison Butler: Sure. So one example that’s used in a lot of K through 12 classrooms is a technology called GoGuardian. GoGuardian is put onto computers, classroom laptops. So if you have, for argument’s sake, a classroom where every student has an assigned laptop, it could be their own assigned device that they keep for the entire school year, or it could be classroom laptops, and it just happens to be where the student is sitting.
GoGuardian monitors their screens, it monitors all of what they’re doing, and then the teacher at the front of the classroom, or wherever they happen to be, can be looking at the student screens during class. One argument GoGuardian makes is that this helps teachers keep students on track, and therefore helps students stay on track. So it’s presented as behavioral. It’s presented as sort of a tool of focus, because teachers can be looking at student screens, and then there’s kind of a carrot-and-stick element to it, which is a teacher can say, hey, you need to get off that website. That’s not what we’re working on.
Or [the teacher] can look directly at a student’s work and comment about what they’re doing well or what might need to be adapted. It’s presented as this sort of community technology for classrooms. Here are some of the problems we find with GoGuardian: Teachers are often told that it’s theirs to be reviewing, and in fact, many of the teachers that we interviewed for the book said that they believed that they were the ones doing the surveillance, that they were a little bit uncomfortable with that, but they really saw the power of surveillance in their computers. What they aren’t being told, or what isn’t being made clear, is that it’s actually the corporation that’s doing all the surveillance. And if we’re thinking particularly of K through 12 classrooms, as I noted before, this is minors’ data.
So the very limited amount of protections that there are, digital protections, to keep minors’ data secure is gone once you have a technology that is put into their classroom. There’s a passive acceptance of this technology. The students themselves didn’t give an active acceptance of having their data gathered. It’s an implied consent. And in fact, that’s the language that’s often used: by using the technology, it’s implied consent, and there isn’t necessarily an opt-out. So we have a classroom of confusion, where the teacher believes that they’re in charge and that they’re making a particular ethical decision, when in fact, they’re not the ones in charge.
Once something like GoGuardian grabs student data and grabs their information, it has it. There is no kind of off switch to this, and if at some point a student plugs their personal cell phone into that device to charge it, GoGuardian now has all of that data as well because of the digital connection between these devices.
One teacher that we interviewed told a story about becoming a bit uncomfortable with GoGuardian, in part because a student was home sick and the teacher could still see what was happening on the screen. Even when the student was home, out of school for a legitimate reason, the teacher could see the student watching YouTube videos. And that was when she kind of thought, oh, this isn’t what I thought it was. This isn’t directly connected to the classroom, right? I think sometimes the conveniences of our digital technologies invite us to forget that our digital technologies can be monitored everywhere we are, not just in our classrooms.
I think another example, something that’s used both in K 12 and higher education, would be Turnitin. Turnitin is a program where teachers can set it up so that students submit their written work via this platform, and it sells itself, it presents itself as plagiarism detection, which I suppose, on some level, is true. The other insidious thing is that a lot of these technologies and these corporations don’t ever actually lie to you. They just kind of don’t tell you the whole truth, and they leave out some really important parts. So Turnitin, yes, is a plagiarism detection software. And also, Turnitin does at least two things.
One, it’s teaching AI, right? So the students who are submitting their papers are giving more and more information to the development of generative AI, and then Turnitin also sells that information to advertisers and marketers, so that young people’s language is being analyzed and then sort of used in advertising and marketing language and kind of sold back to them. So our young people are, to some extent, working for this corporation. They are doing a lot of the labor, and they aren’t being compensated in any way. So I’d say those are sort of two really big examples that show kind of both how insidious these technologies are, how invasive they are, and how confusing they can be for those who are encouraged to use them.
“… young people’s language is being analyzed and then sort of used in advertising and marketing language and sold back to them. So our young people are, to some extent, working for this corporation. They are doing a lot of the labor, and they aren’t being compensated in any way.”
Chris Hedges: Nolan, let’s talk about how these technologies are used to police students, especially in poor neighborhoods, which disproportionately affects students of color.
Nolan Higdon: Yeah, one of the things we notice with these tools is that they, again, make these huge promises, right? So they promise things like we’re able to predict if a student will engage in criminality, or we’re able to predict if a student is having mental health issues that need to be addressed.
But the devil is in the details. What these tools do is collect a lot of data, and they code algorithms to analyze that data to make determinations about someone’s mental health or potential criminality. And this is really where we see a huge problem with over-relying on these tools. These algorithms that are interpreting the data are coded with the same biases as their creators, and we see over and over again how these algorithms draw racist or transphobic conclusions. What I mean by that is these algorithms will disproportionately categorize students of color as being more likely to commit a crime. As a result, they get monitored more by the school, and this, again, normalizes the monitoring of Black bodies.
Ditto with the mental health detectors: they disproportionately flag kids like trans kids for mental health issues, which doesn’t just mean they’re going through mental-health challenges, which is a part of life for some folks, but also means that they get watched for things like school shootings or suicide or self-harm.
“… these algorithms will disproportionately categorize students of color as being more likely to commit a crime.”
And so you get the over-policing of these individuals as well. And so one of the myths, I think, that Silicon Valley in general has sold, and these ed tech tools in particular, is that they have these objective algorithms that are free of the bias of humans and thus can draw more accurate conclusions. But the research says no, that’s not the case. If anything, these tools actually complicate or make worse a lot of the problems we’ve had with issues such as racism or transphobia.
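To make the algorithmic-bias point concrete, here is a minimal toy sketch in Python, not drawn from the book or from any vendor’s actual system: a “risk” score fit to historical discipline records that were themselves skewed ends up rating one group as several times riskier, even though both groups’ behavior is drawn from the same distribution by construction. The group names, thresholds and numbers are invented purely for illustration.

```python
# Toy illustration (hypothetical data, not any real ed tech product):
# a model trained on biased historical discipline records learns the bias
# of the record-keeping, not the behavior of the students.
import random

random.seed(0)

def simulate_student(group):
    """Behavior is drawn from the same distribution for both groups."""
    behavior = random.gauss(0, 1)
    # Historical referrals used a lower threshold for group "B",
    # i.e. the labels in the training data are biased, not the students.
    threshold = 1.5 if group == "A" else 0.5
    past_referral = behavior > threshold
    return group, behavior, past_referral

students = [simulate_student(g) for g in ("A", "B") for _ in range(10_000)]

def learned_risk_prior(group):
    """A naive 'model' that uses each group's historical referral rate as its risk prior."""
    rows = [s for s in students if s[0] == group]
    return sum(s[2] for s in rows) / len(rows)

print("learned risk prior, group A:", round(learned_risk_prior("A"), 3))  # roughly 0.07
print("learned risk prior, group B:", round(learned_risk_prior("B"), 3))  # roughly 0.31
# Group B comes out several times "riskier" although behavior was identical;
# the model has learned the over-policing baked into its training data.
```

Any real product would be far more elaborate, but the failure mode is the same: whatever bias sits in the historical labels is reproduced, and then laundered, as an “objective” score.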
Chris Hedges: Allison, I want to talk about how this data, first of all, ends up in the hands of potential employers. It’s a multibillion-dollar-a-year industry selling our personal data, everything we’ve ever done, every traffic violation we’ve ever had, because it essentially allows employers, perhaps even universities who are looking at high school kids, to have information that should be private and, of course, could be used against those students or potential employees.
Allison Butler: So I would kind of quibble with one word that you said, which is the selling of our data. I think the thing that we might need to pay more attention to, right, to Nolan’s point about the devil being in the details, and to kind of what I said earlier about how they don’t actually lie to us, they just don’t necessarily tell us everything.
So many of these technologies today will say we don’t sell your data, and it’s sort of a lot of exclamation points, right? And that’s something we’re supposed to respond to with, “Oh, okay, good. My data is safe.” Absolutely not. First of all, your data is not safe, because breaches happen so often that they’re not even news anymore. At one point in our research, we were trying to kind of categorize or catalog all the different breaches, and we just kind of realized that pointing this out in these microdetails isn’t going to be helpful, because these happen all the time. We’re just so used to it.
But what we really need to consider is that so many of these corporations share our data, right? So what happens is we have what you and I might think of as different corporations that don’t have a connection to each other, and they’ve partnered. They’ve purchased into each other’s business models. They’re connected in some capacity. Then they call themselves educational corporations or educational partners. That means they don’t actually have to sell our data. They can share our data. So we can be reassured on some level. But in fact, it’s this other level that we might need to be thinking more carefully about.
So when we’re talking about employers, or when we’re talking about colleges, or even maybe if we’re talking about private schools, we have so many educational partners that already have access to the data, that can already do some analysis of it, that they are allowed to share it. I think we used to talk several years ago about, particularly with K through 12 students, employers are going to look at your social media. College admissions offices are going to look at your social media. We actually don’t really need to direct our young people to be concerned about these particular lanes anymore. These folks don’t have to do a ton of detective work. It’s already there for them. So whether they’re paying close attention to it or not, I’m not exactly sure, but the data is already there. It’s simply right in front of them.
Chris Hedges: Nolan, can you talk about how they’re proactive, the social engineering, how it’s not just collecting data but using data to shape and mold behavior.
Nolan Higdon: Yeah, and to add a layer to that, to pick up where Allison left off as well. And I’ve even said this a lot today too, these are ed tech companies, but that’s kind of misleading. A lot of the companies that run or own these ed tech platforms, like the ones we talked about, Gaggle, Bark, there’s Canvas, others, they’re generally owned by equity firms. This really started over the last 10 years: these equity firms bought up these ed tech tools, ostensibly because it was a way to get into the classroom. There’s a whole industry here that in the scholarly world they call surveillance capitalism, which is predicated on the idea of treating humans like products. So you extract data and information about humans from all these smart technologies, and then you’re able to make determinations about their behavior, how they might react to something. And there are a lot of industries that are interested in this, right? Advertising industries would like to know how to create the perfect advertisement to get you to make a purchase of something. Insurance companies would like to know how to set your premiums, maybe based on your health or your driving patterns, etc. So data can be quite lucrative to industries for that.
But in addition to predicting behavior, there are also entities that are interested in nudging your behavior. So what can I do? What situation can I put you in? What information can I give you so you’ll behave like this? And there’s a big belief in the industry that if we collect enough data, we can nudge people’s behavior in the right direction. Particularly here we are in an election year, and a lot of campaigns, that’s what they’re trying to do with this data: How can we use this data to get these people out to vote, or maybe get these people not to vote, depending on things like that? So there’s a lot of potential in the industry if you’re able to collect multiple data points, and that’s why schools are so attractive, right?
They’re one of the few places that have been protected as a public space, and so private companies have long wanted to get in there, and under the altruistic guise of providing ed tech tools, this has been an opportunity for them to get in the classroom and get access to that lucrative data.
“[Schools] are one of the few places that have been protected as a public space, and so private companies have long wanted to get in there and … get access to lucrative data.”
And just to put a fine point on that, some of these big firms don’t just own ed tech tools. They own things like Baby Connect, which tells parents to use these tools to monitor their baby. They also own platforms that look at people’s work patterns after graduation, and they also get data from social media and from schooling. The goal is to make what they call a psychographic profile of individuals from the cradle to the grave, and schools are an important part of that process.
Chris Hedges: And I want to be clear, you point this out in the book, this is a massive industry. EdTech, you say, is a $123.4 billion global industry. We’re talking about big, big money. Allison, I want to talk about, we’ve watched over the summer as universities and schools have imposed all sorts of new restrictions and rules to shut down protests against the genocide in Palestine, and it’s been coordinated across the country: no flyering, no events, no tables, no encampments, et cetera. To what extent do tools like these aid universities and schools in shutting down dissent or controversy, particularly around Palestine?
Allison Butler: I think to a great extent, and I think this is a place where it’s indicative of our universities’ larger fears of, well, probably to be a little bit flippant, our fears of young people and what they’re doing with their own technologies, but also fears about the state of academic freedom, fears of what dissent means, of what disobedience means. I think we spend so much time in our classrooms praising historical acts of dissent, historical acts of disobedience, but when we’re confronted with it in the present tense, it is somehow terrifying.
If I’m going to give them the benefit of the doubt, it is administrators looking for some way to keep their campuses safe, to keep students, faculty and staff who have differing and conflicting views safe. Unfortunately, I think the word safe is often used as a please don’t sue me, right? It’s a euphemism for please don’t sue me. So I think it comes out of a cultivated sense of fear, and the surveillance technologies do a really good job of capitalizing on fear, right? I mean, to shift it a little bit, when we think about the start of Covid, it was capitalizing on the fear of what it meant for us to be together, right? How dangerous it could be for us to be in the same space. And I think these corporations continue to capitalize on that fear when we’re looking at dissent and demonstration and disobedience. And so you have university tools, then you have police state tools, right? Police are coming in with body cameras, which, let’s be honest, can be turned on and turned off to create and frame a very particular narrative, right? This is the thing; surveillance technologies and these tools cut in all directions.
We have our students themselves who are filming their actions, which means their faces are there, right? If they are there in peaceful dissent, if they are there in civil disobedience, their faces are very much there, which means, if something goes wrong, no matter who is responsible, right? If it’s police instigation or opposing students’ instigation, we already have all of their information. So we are living in an environment where I think it is, as throughout history, important to be present, important to stand up, and also that presence and that standing up is being manipulated and maneuvered in terrible ways by these surveillance technologies.
Chris Hedges: And, Allison, it has global ramifications, because universities and schools are traditionally places for the exchange of ideas and dissent; in a functioning democracy, they’re among the physical epicenters where those kinds of discussions should be allowed to take place.
Allison Butler: Absolutely, and I think, again, when we look at history, when we’re looking at the arc of history, we somehow have this picture painted that this was dissent where people were behaving as if they had a civil disagreement, and we just don’t seem to be having that anymore. So our very uncivil disagreements, our very physical disagreements are being painted and presented, through these technologies, in a way that we probably couldn’t have done in history, right?
There was a lot of talk this summer, approaching the presidential conventions, with both the RNC and the DNC, the DNC in particular being in Chicago, saying, let’s look back at history. And I think that that’s important. I would never say that we shouldn’t do that, but so much has shifted in the way our technology participates in these conventions or these dissents that our understanding of behavior is totally and utterly different.
Chris Hedges: Nolan, this information as you write in the book, doesn’t just end up in the hands of corporations, it ends up in the hands of DHS, the F.B.I. Talk a little bit about how state security also uses this information.
Nolan Higdon: Yeah, the so-called national security sector, or national security industry, is heavily involved with the collection of this data. And it’s worth reminding folks that these tools — internet, touch screen, GPS, and so many more of the functions of smart devices in the digital age — come from the military industrial complex. These were created through defense funding, working in collaboration with universities and schools in the middle of the 20th century. And in fact, we talk about in the book how, when students found out what these tools were being created to do, they protested against them.
But to this very day, these tools continue to collect data that is shared with DHS and the intelligence community, again, under the auspices of spotting potential terrorists, spotting potential threats. This is problematic for numerous reasons, but one of them, just from a purely educational standpoint, is that it really negatively impacts learning. We’re telling students, effectively, that when they’re on campus, they’re objects to be monitored, protected against and managed. It’s very difficult to develop a trusting relationship where people feel comfortable taking risks and making mistakes, which are central to education, when it’s an environment where they’re always being policed.
“It’s very difficult to develop a trusting relationship where people feel comfortable taking risks and making mistakes, which are central to education, when it’s an environment where they’re always being policed.”
Chris Hedges: Well, not only policed and monitored, but as we’re watching with the student protests, these monitoring tools, effectively, it’s more than surveillance. It’s about shutting down dissent, because they know who’s involved, instantly. I mean, they knew that back in Occupy. I know because I spent a lot of time in Zuccotti [Park in downtown Manhattan], and after [New York City Mayor Mike] Bloomberg shut the park down, there were a series of police raids on lofts and they got all the right people, because they were monitoring them electronically. Allison, I want you to talk about, in particular, two cyber security tools. I want you to talk about Augury and Pegasus.
Allison Butler: Actually … those are kind of Nolan’s babies. So I’m going to turn that back over to him, if that’s okay.
Nolan Higdon: Yeah, Pegasus is basically a piece of spy software that comes out of Israel. Pegasus was basically put into other software, so once it got onto other computers, you could monitor people around the globe who had the Pegasus software on there, and it was basically creating a global surveillance platform. And Israel is hardly alone in this. The United States has been…
Chris Hedges: And I just want to interrupt Nolan, Pegasus is an Israeli creation. It comes out of Israel. It was used to track Jamal Khashoggi, I think.
Nolan Higdon: Right, yeah, and the U.S., like I said, is taking part in similar production and monitoring, including working with Israel on Pegasus. But yeah, this, I think, to Allison’s point about how history has changed, means we have to talk about how our expectations, our rights and our laws have to change as well.
This idea of illegal searches and seizures, or the idea that my privacy is something I own, those are changing in the digital era, and the law—and this is one of the things we advocate for in the text—the law needs to catch up with that, because a lot of the protections we had over privacy have had loopholes exposed by governments and corporations, such as in the Pegasus example. We talk about things like, your First Amendment protects freedom of speech from government, but government can work with tech companies to shut down certain speech or ideas; or you’re protected in your communications, like privately in your home, but if it’s email, that means you’ve given up those communications to a private company that can then give them to government.
So there are a lot of these types of loopholes being exposed in the digital era, and that’s one of the things we advocate for in the text, because we even had a students’ rights movement that got students the right to privacy in schools. That’s what created FERPA here in the United States. But then, around 2012, the Obama administration changed something in FERPA.
Previously, FERPA meant the school couldn’t share a student’s information with anybody. If they were underage, you could share it with their guardian. But the changes to FERPA in 2012 said, “No, you can also share their information with educational partners.” These are companies that have contracts with the schools. And so effectively, all of that data that the students’ rights movement worked to make sure was private was allowed to be distributed to these companies. And as we’ve seen, that’s what allows it to get into other areas as well.
“Previously FERPA meant the school couldn’t share a student’s information with anybody … But the changes to FERPA in 2012 said, ‘No, you can also share their information with educational partners.’”
Chris Hedges: And talk about Augury. This is developed by a cybersecurity firm, Team… what is it, Cymru, which makes massive amounts of data available for government and private customers. Various branches of the military have collectively paid $3.5 million to access Augury’s data.
Nolan Higdon: Yeah, tools such as Augury, I like to think of them as sort of giant data broker repositories. So they go out and get access to massive amounts of data. They analyze this data in real time and, the way the industry describes it, they basically sell products that analyze this data for companies or governments.
But Augury is an example of something that serves the interests of governments who maybe want to target people or understand activist behavior or understand activist communication online. Augury promises to have this massive amount of data that it can analyze and provide some answers to questions that governments might have that are seeking to surveil, understand, predict or nudge behavior.
Chris Hedges: Allison you use a term in the book, “algorithmic racism.” Can you explain that?
Allison Butler: So if we think about algorithms, algorithms are sort of, I think, our oxygen these days, right? Everything we do digitally is drawn from an algorithm. And algorithms feel, I think, to many, particularly when we’re talking with students in K-12 and to some extent in higher education, like this kind of mysterious thing that somehow lives in the computer… What we have to remember is that algorithms are programs, they’re coding, they’re language, they’re questions that are built by humans, so they are built with fallible humans’ racism, sexism, homophobia, ableism, etc., right?
So, to our point about algorithmic racism: there is racism baked into these digital technologies, which means, from the get-go, they are going to see people of color, and by extension women of color, people who have differing abilities, anybody who identifies as LGBTQ, basically anybody who is or identifies outside of what the creator of the algorithm sees as the norm, through that biased frame, which means we’re not necessarily looking at physical, tangible lived experiences of racism. We’re looking at racism as a form of teaching us how to use digital technologies because, as I said, it’s sort of baked in, so the problems come to us right away.
Therefore we start to learn how to manage things within that racist frame, and that becomes a norm, and it becomes sort of a centralized way of seeing it, which makes it significantly more dangerous for bodies of color, as well as for those who are interacting with bodies of color to have a preconceived notion built into their technologies of who these bodies are and how they’re expected to behave.
“We’re looking at racism as a form of teaching us how to use digital technologies.”
Chris Hedges: Well, an example of that you pull from the book is facial recognition in test-proctoring software such as Proctorio; it’s developed for white students. Black and brown students are less detectable and forced to provide more personal information than white students to confirm their identity.
In another example of the racist bias coded into algorithms, research reveals that programs that promise to accurately predict student retention and course success falsely assume that students of color will not succeed. This is because massive amounts of data are needed to train algorithms and AI, but they are trained using inductive logic. So if they are programmed to see multiple items but are only shown one result, the algorithm will communicate a variety of different things as only one thing. For example, if the algorithm is programmed to recognize apples but is only shown red apples, the code will see everything in that grouping as a red apple. While this is incorrect in the real world existence of apples, it is correct via what the algorithm was taught.
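To make the apples example concrete, here is a minimal toy sketch in Python, not from the book: a classifier whose only training examples of “apple” are red apples fails to recognize a green apple at all. The color values and the nearest-centroid approach are invented purely for illustration.

```python
# Toy sketch of the "red apples" point: a model trained on a skewed sample
# of a category misjudges members of that category it was never shown.
# Feature = (red, green, blue) color values; all numbers are made up.

def centroid(points):
    """Average of a list of (r, g, b) feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Training data: every "apple" the model ever sees is red.
apples_in_training = [(200, 30, 30), (210, 40, 35), (190, 25, 45)]
not_apples_in_training = [(40, 180, 60), (60, 200, 80), (30, 160, 50)]  # e.g. limes

apple_centroid = centroid(apples_in_training)
other_centroid = centroid(not_apples_in_training)

def classify(fruit_rgb):
    """Label a fruit by whichever training centroid it sits closer to."""
    if distance(fruit_rgb, apple_centroid) < distance(fruit_rgb, other_centroid):
        return "apple"
    return "not an apple"

print(classify((205, 35, 40)))   # "apple" -- looks like the training data
print(classify((100, 190, 70)))  # "not an apple" -- a green apple goes unrecognized
```

The same failure mode, trained disproportionately on white faces, is what leaves Black and brown students “less detectable” to proctoring software.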
Allison Butler: The algorithm responds to human input, right? I mean, I think in the 1980s, when we were first becoming familiar with computers, there was a little catchphrase, “garbage in, garbage out”: if you programmed in garbage, then you got garbage, and it wasn’t just regular people in our living rooms programming computers at that point, right? And I think we see this with our generative AI. Any one of us who stumbles or struggles with ChatGPT, for example, maybe what we have to look at is what we’re programming into it. The sophistication that is happening means this is not me kind of clunkily trying to figure out if ChatGPT can make me meal plans for the week so that I don’t have to think that hard when I go to the grocery store. This is highly sophisticated programming that is then framing and constructing how the rest of us view the world, like our example of facial recognition software, and we have a very 21st-century example of the unearned privilege of being white: white faces fit into the model much better.
Chris Hedges: Nolan, I want to talk about migrants. You write in the book that schools, especially universities in the U.S., the U.K. and Australia, are empowered and expected by lawmakers to identify migrant students with a contested or illegal immigration status. The U.S. Department of Homeland Security maintains volumes of data for the purpose of locating and tracking migrants. For example, LexisNexis, which is used widely in education, sells data to the Department of Homeland Security’s Immigration and Customs Enforcement, ICE. LexisNexis, a subsidiary of RELX corporation, offers data analytic products and online databases. It was found to have provided sensitive information to ICE, which ICE was presumably using to track and arrest individuals for deportation.
Nolan Higdon: Yeah, this came from the chapter that was motivated by the fact that every time we talk about this topic, we would always get this question, inevitably, from someone who says, “Well, so what? I’ve got nothing to hide. Who cares about my privacy?” And in the chapter you’re reading from there, Chris, we tried to lay out a list of different vulnerabilities, and one in particular is students who have contested, or so-called illegal, migrant status. They clearly have a reason to want privacy. It may not even be them; they may live with someone who has contested migrant status, whom they want to protect through their own privacy as well. But by participating in the learning process where these tools are present, they threaten the migrant status of themselves or the people they live with; it could be used against them for deportation, arrest or anything else.
And we see this over and over again. That’s why we think these tools are so pernicious, because back to where we started this conversation, they’re brought in for things like safety and improved learning and DEI and things I think most people would agree with, but in practice, they’re used against those measures, criminalizing folks, monitoring them, and then using that information, possibly to deport someone.
“a learning process where these tools are present … could be used against them for deportation, arrest or anything else.”
Chris Hedges: You also highlight that, because school-issued devices can and do alert campuses to students’ web searches, they have outed the sexual preferences of students who were searching about their sexuality. As a result, when LGBTQ+ students try to search for information about their sexuality or sexual curiosity, including health-related questions, they risk having themselves outed to school officials, law enforcement and anyone else who can access their information.
Nolan Higdon: Yeah, this goes back to what we were saying, right? We believe education should be an exercise in freedom. Folks should be able to explore who they are, explore information. The expectation is they’re going to make mistakes in the classroom as students, just as our teachers do, but they need to feel comfortable to be able to make those mistakes. When the idea is that you’re constantly being monitored, or this could come back to your parents, or this could be broadcast to the world, students are less likely to share. They’re less likely to search out that information. It curtails curiosity, which is essential to the education process, and not to mention these folks are wrestling with critical questions about their identity, so the mental trauma and difficulty of closing one of the few spaces where they’re able to explore, I think it just speaks to the problem with surveillance and the education process.
Chris Hedges: Allison, I want to talk about what this does within the schools and within the universities, you write that this constant surveillance is a way to ensure that faculty adhere to the ideological homogeneity sought by school leadership. It begins with the application process when candidates are required to share private details, such as their approaches to teaching and diversity statements, which are used to ensure ideological homogeneity on campus. It continues as students, often illegally, record what teachers do in the classroom. This can and has been used to pressure educators to leave their profession if they are perceived as holding an ideological position that runs counter to the status quo. We’ve watched this since Oct. 7, repeatedly.
Allison Butler: I think that one of the things that these surveillance technologies can do, either intentionally or just by happenstance, is foment environments of mistrust, right? As Nolan has said, as I have said, as we say all the time in our classrooms, schooling classrooms, that’s a place to make mistakes. It’s a place to stumble. It’s this place to be curious. It’s a place where ignorance should be understood in a great way. I walk into a classroom as a student not knowing something. I am ignorant of it. I have the opportunity to learn from it.
When we have an environment where we have all these divisions set up for us via technology, all of that runs the risk of disappearing, disappearing is too soft of a word, of being stomped out, right? I’m not saying that students should have or teachers should have their days filled with horrible, hateful talk, but I think we need to learn so many different perspectives in order to really be fully teaching and fully learning.
And our digital technologies have the capability of recording, that’s a thing that’s important, but they also have the capability of manipulating, which changes the stories that teachers and students are telling. It creates classrooms that, at the very best, are going to run the risk of being boring, okay, but what we’re really talking about is stifling learning. We’re talking about stifling exposure. We’re talking about stifling how to manage difference, right? I think, as we are looking at our global conflicts these days, particularly what’s happening in Israel/Palestine, we are being taught lessons that say difference is bad, versus difference being a way of starting to try and learn about each other as human beings.
So when difference, when discussion, when questioning, when misunderstanding, genuine lack of understanding, is stifled by these digital technologies, our classrooms are no longer places of curiosity or inquiry. They’re factories that just give us a degree, and that degree might just not mean as much. Again, I’m not advocating in any way, shape or form for a hate-filled classroom just to kind of prove a point that things are different, but more for the fact that we should have environments where we get to be uncomfortable, curious, inquisitive, excited, all of the things, as a tool of learning, and that we’re doing that together in community, right?
I think another big thing with these surveillance technologies, with, in particular, our social media technologies, is that they’re incredibly isolating. They’re, in fact, quite anti-social. And what school can do, what classrooms can do, what teaching and learning can do, is build collaboration and build community, which counters isolation. It counters that silo-ification and the surveillance technologies are working very hard to build isolation and to build silos.
Chris Hedges: Well, it shuts down any questioning of the dominant narrative, doesn’t it?
Allison Butler: Absolutely, and students don’t necessarily understand or know the structure of the dominant narrative. They don’t necessarily know to question it. We’ve got to start talking about all of this stuff more, and that means being together in classrooms, in our world, and removing, to the best that we can, these technologies.
Chris Hedges: Well, in any totalitarian system, the point is to deny the ability to ask the questions. Nolan, I want to ask, you said there’s a well-documented history of employers utilizing surveillance to exploit workers and undermine collective bargaining. Rather than view EdTech surveillance as a benefit to the educational process or a safety measure, educators need to recognize that it can be used to collect data to undermine their power as employees. Speak about that.
Nolan Higdon: Yeah, this was a really fascinating part of the book, and one that I know we’ve been trying to bring to faculty unions. But yeah, there’s a long history, going back centuries, of employers using physical spies or trying to spy on communication to figure out who’s a labor agitator and remove them and anybody who sympathizes with them. This is a whole new era. We have these tools in the classroom which can surveil folks while they’re working in the classroom and theoretically use things, either in context or out of context, to get rid of those employees.
Also, we talked about Israel, Gaza. A lot of employees, faculty right now, they don’t have protections. We’ve seen the adjunctification, or at-will employment, of higher education. So regardless of how folks feel about that conflict, they avoid it in the classroom because they’re afraid whatever they say can be used against them to lose their job, and that’s not just them losing out, that’s the students losing out on an opportunity to engage with a critical world issue. And then, more so, as faculty, these tools are also trying to learn off what we’re doing, so they’re collecting our data and profiting from it without paying us.
Generally the labor movement wants to get paid for labor. But furthermore, they’re also training these tools to try and do some of the functions of what faculty are doing. So you’re training your replacement at some level, and I’m thinking of things like smart grading and smart assignment writing, these new tools that are coming out.
Or there are some where you can have an image of your face, and you can type, and the image will lecture. That’s a way to replace you as the lecturer as well. So a lot of these things are coming down the pike, taking away privacy, replacing jobs, and faculty are actually participating in the process by using these tools and not getting strict barriers in their contracts to prevent this type of economic exploitation, not to mention this effort at surveillance to undermine the negotiation process.
“regardless of how folks feel about that conflict, they avoid it in the classroom because they’re afraid whatever they say can be used against them to lose their job, and that’s not just them losing out, that’s the students losing out an opportunity to engage about a critical world issue.”
Chris Hedges: Allison, I know you end the book with suggestions on how to curb this intrusion into our privacy, but if left unchecked, what kind of an educational and even social and cultural environment are we going to live in?
Allison Butler: I think if left unchecked, we run the risk of living in factory schools, like I said before, where we just sort of push our students through on the way to picking up a piece of paper. We will train future generations that to be monitored is normal, that there is no such thing as privacy. We will have kind of rote types of education, with very safe information and a safe way of presenting it, and we will, at least in terms of a digital language, know everything about everybody, with the possible caveat that those who fit into race, gender, economic and physical ability, quote, unquote, norms will have it easier, but we will start to see all of our bodies that don’t fit, all of our humans that don’t necessarily fit into that, again, very generous, quote, unquote, norm, be moved further and further to the fringes.
I think we could see that in a pretty short amount of time, our classrooms won’t be thought of as places of curiosity, of inquisitiveness. They will be thought of as places of passively accepting very banal, careful information.
At the same time, they’re going to probably look pretty cool, because our surveillance technologies, all of our technologies, are very sophisticated looking. They are cutting edge. They are often hand-me-downs, right? As Nolan mentioned before, so much of what we use comes to us from the military industrial complex. That we can drive in unfamiliar places is GPS, an outdated military technology. That we can take cool pictures of parties or weddings or real estate is drone technology, another outdated military technology. I think the content of our classrooms runs the risk of being utterly banal and boring. The look of them could end up being pretty cool, which might be real flashy and invite us to forget to think about the content.
Chris Hedges: Well, when the government watches you 24 hours a day, you can’t use the word liberty; that’s the relationship between a master and a slave. That was Nolan Higdon and Allison Butler on their book, Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools. That was really, really great work you both did. I want to thank Sofia [Menemenlis], Diego [Ramos], Thomas [Hedges] and Max [Jones], who produced the show. You can find me at ChrisHedges.Substack.com.
Chris Hedges is a Pulitzer Prize–winning journalist who was a foreign correspondent for 15 years for The New York Times, where he served as the Middle East bureau chief and Balkan bureau chief. He previously worked overseas for The Dallas Morning News, The Christian Science Monitor and NPR. He is the host of the show “The Chris Hedges Report.”
NOTE TO READERS: There is now no way left for me to continue to write a weekly column for ScheerPost and produce my weekly television show without your help. The walls are closing in, with startling rapidity, on independent journalism, with the elites, including the Democratic Party elites, clamoring for more and more censorship. Please, if you can, sign up at chrishedges.substack.com so I can continue to post my Monday column on ScheerPost and produce my weekly television show, “The Chris Hedges Report.”
This interview is from Scheerpost, for which Chris Hedges writes a regular column. Click here to sign up for email alerts.
Views expressed in this interview may or may not reflect those of Consortium News.
This article has certainly made me more reluctant to persist with using TurnItIn for plagiarism detection in the undergraduate course sections that I instruct moving forward. It provides me with a more tangible motive to look sideways at TurnItIn beyond my generalized, abstract neo-Luddite sentiments regarding the intrinsically panoptical and Janus-faced nature of all information technology when combined with a human lust for power. Given my omnipresent suspicion of assorted surveillance and marketing technologies, I was naïve not to consider the other unspoken uses that such plagiarism detection software could be put to before reading the transcript of this interview, so I appreciate Chris Hedges and especially Allison Butler for bringing this to my attention.
Thank you, Chris, for bringing crucial matters to our attention, every damn day.
Also wondering if you might be interviewing Grisham on his latest NF book.
In my experience, the technology to observe students’ activity on a school’s computer has been around since 2007 at least, using applications like Apple Remote Desktop and Teamviewer. I was shocked to witness this directly in the school’s Tech Support office, particularly since my students were adults, who had access to their email accounts but no idea their communications were not private. Back then the eavesdropping was intended to be passive (albeit potentially prurient), done in the interest of providing efficient support in the case of machine failure. The behavioural “nudging” these authors speak of is not new either. In 2017, we heard reports of data collection being used to nudge voting in the U.S. and U.K., by Cambridge Analytica and Aggregate I.Q. respectively. Since then, services such as Newsguard nudge allegedly safe information gathering, and in turn, behaviour. What’s new here is that the data from minors is shared with similar intent, and the same old garbage in, garbage out bias imposed on personalities that are not yet fully formed.
The need for context and thoroughness are only two of Chris’ good attributes.
Let’s not confuse a right to privacy with a nonexistent right to anonymity in public places. The former is a natural right to privacy in one’s home and other private places and using one’s own tools and belongings. The latter is an expectation of no reaction to living and acting in public places and using public (and other-owned) tools and belongings, which has never existed in human history and cannot be expected, nor is it desirable, now or in the future.
So in relation to the article, children have a right to privacy when in their own homes if they (1) do not invite corporations (hi Siri) into their home expressly to spy on them and (2) are not using publicly provided tools (school laptops) provided for a public purpose to do personal things. As for a public setting like the school room itself, or walking around campus or the city in general, actions taken in public are observable by anyone interested in such actions and can be used for data collection, marketing, and any other reason.
as for “I think we could see that in a pretty short amount of time, our classrooms won’t be thought of as places of curiosity, of inquisitiveness. They will be thought of as places of passively accepting very banal, careful information.” … my God, what classrooms have these folks been visiting? Places of inquisitiveness and curiosity? Really? “Bueller … Bueller … Bueller …”
Anything you say can and will be used against you.
Is there really a need for such a long article??? As if we had nothing else to read and worry about! The white race has yet to come to realize what a SHIT DISTURBER we are. But always blame another race…of course.