On a recent Friday night, Mark Nelson, a vice principal at a large public school in California, was at a colleague’s birthday party when his cell phone rang with news of a student in acute crisis. The student had gone onto his district-issued iPad and composed an email to the school counselor, but it wasn’t the counselor who had read the email and called Nelson; it was an employee of Gaggle, a company that creates and sells “products that help schools create safe learning environments” online. Under a contract with Nelson’s school district, Gaggle screens students’ electronic documents, communications, and calendars for things like bullying, violence, and suicidal ideation (as well as less severe transgressions, like swearing). When the student hit send, the message was flagged by Gaggle’s artificial intelligence, then reviewed by a human screener at the company, who rated it at the highest level of urgency. This triggered the phone call to Nelson, who then called law enforcement to intervene. For this student, and for the small handful of other students in crisis whom school personnel would not otherwise have known about, Nelson has found Gaggle to be a useful tool for getting help to kids who need it.
As schools struggle to catch up with the fast-moving online environment, technology can seem like both the cause of and solution to life’s problems. Increasingly, schools are turning to high-tech surveillance tools to supervise students online. As Nelson, who has worked in education for 20 years (and, full disclosure, to whom I reported as a teacher in a former position), told The Outline: “There has always been a small proportion of the student body that are going to be jerks or are struggling. With technology, they’re able to [do harm] much more quickly and intensely.”
In many ways, kids today have it much easier than previous generations (when I was in high school, I routinely “forgot” to copy down my Spanish homework so that when it was time to study, I had an excuse to drag the phone into my bedroom and ring up the guy I had a crush on). Teens can seamlessly mix work and fun in the form of school Google for Education accounts that they are required to use for their assignments. (More than 80 million students and educators worldwide use Google apps like Classroom, Docs, and Gmail, according to the company.) During class or after school, all they have to do is log in and hop onto Google Docs to chat with their friends at frequencies that adults don’t often pick up on.
But the ease of online communication brings with it an immense amount of vulnerability and interpersonal access without the limits of physical presence: bullying, sexually graphic images, drug sales, threats of mass violence and self-harm all circulate on school-sanctioned technical tools. A number of companies have stepped into this gap to offer parents, teachers, and administrators windows into this world — but this comes at a questionable cost to students’ sense of privacy and agency.
As high-tech and dramatic as Nelson’s scenario was, electronic surveillance of students can be understood as one point on a continuum of ways schools oversee, manage, and direct the behavior of children, rather than as a wholly new point of departure. At one end of this continuum are low-tech, not-especially-invasive practices — think your grumpy hall monitor, a teacher convincing her students that she has eyes in the back of her head, or sticker charts that track infractions and progress toward goals. These nursery-panopticon practices, which influence behavior by making children feel watched, have existed in schools forever and, outside of the hippie fringe, are not controversial, in part because they are first-party and in-person.
Apps like Apple Classroom, DyKnow, and ClassDojo extend these common disciplinary practices into online spaces. Apple Classroom and DyKnow, which bills itself as “classroom-management software for teachers,” allow teachers to remotely lock students’ computers or tablets into particular apps in order to cut off distractions and the temptation to cheat. These apps also let teachers call up real-time images of students’ screens and histories of the apps each student has used during class to check who has been following instructions and who has been off-task.
ClassDojo, which is advertised euphemistically as “the simple way to build an amazing classroom community,” resembles other social media platforms with its scrolling feed of class announcements, reminders, photos, and student work that teachers can post for parents to view at home. At its core, however, ClassDojo is an electronic behavior chart. Instead of flipping a card from green to yellow when a student misbehaves, the teacher taps or clicks on the child’s cute monster avatar and takes away a point — or adds one, if the student did something praiseworthy. The app is immensely popular, and the company reports that it is in active use in 95 percent of K-8 schools in the U.S. (The app is free, and individual teachers can initiate an account without needing the school or district to go through a vetting and adoption process.) Despite its popularity, critics contend that the focus on points, the opportunities to compare oneself to peers (some teachers display ClassDojo scores publicly), and the flow of behavioral data between school and home can be stressful for children and acclimate them to surveillance. In a recent article in the journal Learning, Media and Technology, Australian scholars Jamie Manolev, Anna Sullivan, and Roger Slee argue that “ClassDojo’s datafying system of school discipline intensifies and normalises the surveillance of students. Furthermore, it creates a culture of performativity and serves as a mechanism for behaviour control.”
ClassDojo, DyKnow, and Apple Classroom all surveil students’ behavior in the classroom environment. But schools also sometimes reach into spaces generally understood as private, such as backpacks and lockers. The Supreme Court has maintained that students retain their rights — including their right to be free from unreasonable searches — to some degree inside schools, but that those rights are counterbalanced by the school’s need to maintain a safe environment conducive to learning. Exactly where the line is drawn is continually being renegotiated, especially as technology — and cultural responses to mass shootings and increases in teen suicides — evolves. Products like Securly and Gaggle sit at the extreme end of this spectrum. They surveil typically private online spaces like email accounts, documents, private calendars, and search histories and, unlike locker or backpack searches, can reach into the documents and communications that a student creates while at home. These companies lean on emotionally charged language in explaining what they do.
Securly dubs itself “The Student Safety Company,” and promises customers “online protection to kids at school, at home, and in the classroom.” The scare-scrolling copy on Gaggle’s homepage reads: “A Proactive Approach to Suicide Prevention. . . Early Warning Against Violent Threats. . . Safely Eliminate Child Pornography.” Bill McCollough, Gaggle’s vice president of sales, told The Outline that, by the company’s own estimation, Gaggle prevented 542 student suicides in the last academic year (he did not elaborate on how the company arrived at that figure).
Using a combination of AI and human intelligence, Gaggle and Securly scan student writing for signs of bullying, depression, and violent impulses. Minor infractions, like cussing, might result in a form email to a student from the principal reminding them about school policies. Higher-level concerns are emailed to a school administrator, while emergencies result in a phone call. Gaggle also uses a pornography screener to identify images in student files and emails and, when appropriate, handles the district’s legal obligation to turn the images over to law enforcement, as in the case of child pornography, without district employees needing to view or handle them. Securly makes student search histories available to parents and allows parents to impose their own restrictions on their child’s device.
Advocacy groups like the American Civil Liberties Union and the Electronic Frontier Foundation counter that surveillance leads to the false identification of students as safety threats, which exposes them to harm, and to an erosion of civil and intellectual liberties.
I asked Nelson if he had experienced any pushback from students or parents concerned about student privacy when Gaggle was rolled out at his school. He said that the reception to Gaggle has been positive, if muted, in his district: “We make it really clear to everyone that when you are on the district device, you are using school property and you should have no expectation of privacy,” he said. When I brought up privacy concerns with Bill McCollough, he likened students’ use of school email to a professional environment in which an employer has the right to read employees’ emails. “We’re preparing them for the work world,” he said.
Privacy advocates would likely agree with these sentiments but see them in a different light. As the ACLU puts it in a blog post about electronic surveillance in schools, one of the unintended harms of such surveillance is that it “undermin[es] students’ expectation of privacy, which occurs when they know their movements, communications, and associations are being watched and scrutinized.” In other words, it readies them for the future.