Power

Facial recognition, welcome to U.S. high schools

No one is watching
A streaming company is offering free facial recognition technology to schools across America, part of a growing network of businesses pitching facial recognition as the solution to school shootings. Many of these technologies are untested, and almost none come with clear guidelines for how they should be used or any indication of how helpful they would actually be.

In an article Wednesday, WIRED examined the free facial recognition technology known as SAFR, which is currently being tested in a Seattle school. “We feel like we’re hitting something there can be a social consensus around: that using facial recognition technology to make schools safer is a good thing,” Rob Glaser, founder of the streaming company RealNetworks, which makes the SAFR facial recognition technology, told the magazine.

Although SAFR currently does not scan students’ faces into its system — only adults get that treatment — that could easily change. SAFR’s privacy guidelines are so malleable that if a school wanted to expand the scope of who it tracks to include children, it could. The collected data currently can’t be shared with outside sources, either, but a school district could change that in the future.

Even with its lack of oversight, SAFR is one of the less exploitative facial recognition technologies that U.S. schools have adopted. A separate system at an afterschool program in Indiana, for instance, scans the faces of everyone — including children — who enters the building. Face-Six, the company behind it, also sells the same technology to prisons and drone manufacturers.

Earlier this year, an upstate New York school district implemented facial tracking technology. Though the district insisted that no one would be permanently remembered by the system unless they were “known” threats — such as sex offenders, fired employees, or expelled students — any student’s data can be stored for up to 60 days. And if a security officer wanted to, they could tell the system to “follow” a student during that period. As one school official explained to The Buffalo News, “If we had a student who committed some type of offense against the code of conduct, we can follow that student throughout the day to see maybe who they interacted with.”

With so few guidelines, the opportunities for abuse are myriad. Legal Defense Fund fellow John Cusick told WIRED that schools could use facial recognition technologies to discipline students based on who they decide to befriend. “It could criminalize friendships,” Cusick said. A school in China — which feeds live classroom scans to local police for monitoring — provides a cautionary tale.

Especially for students of color, who are more likely to be deemed uncooperative in school and who are vastly more at risk for police harassment and violence, unchecked facial recognition technologies pose a major risk.

And all of this is despite the fact that facial recognition technology is unlikely to stop a school shooting, given that around 95 percent of violence in schools is committed by currently enrolled students — not sex offenders or fired employees. As University of the District of Columbia law professor Andrew Ferguson put it to the Washington Post last month, “These companies are taking advantage of the genuine fear and almost impotence of parents who want to protect their kids, and they’re selling them surveillance technology at a cost that will do very little to protect them.”