Senate investigation warns of harms of digital school surveillance tools, asks FCC to clarify student surveillance rules

Democratic Sens. Elizabeth Warren and Ed Markey are asking the Federal Communications Commission to clarify how schools must monitor students’ online activities, arguing in a new report that the widespread use of digital surveillance tools by educators could violate students’ civil rights.

They also want the US Department of Education to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain groups of students.

In October, the senators asked four educational technology companies that monitor the online activity of millions of students across the country – often 24 hours a day, seven days a week – to provide information on how they use artificial intelligence to collect and analyze students’ information.

Based on their responses, the senators said:

  • Company software can be misused to identify students who violate school disciplinary rules. They cited a recent survey in which 43% of teachers said their schools used surveillance systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, such as LGBTQ youth.
  • Schools, parents and communities are not properly informed about the use – and potential misuse – of data. Three of the four companies indicated that they do not directly alert students and guardians to their monitoring.

Warren and Markey concluded that “federal action is necessary to protect the civil rights, safety, and privacy of students.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and fairness concerns,” the lawmakers wrote. “Studies have highlighted the unintended but harmful consequences of student activity monitoring software that disproportionately affects vulnerable populations.”

An FCC spokesperson said the agency is reviewing the 14-page report; the Department of Education did not respond to a request for comment.

The lawmakers’ investigation into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional inquiry into student surveillance tools, the use of which increased dramatically during the pandemic, when learning moved online.

It follows The 74’s investigative reporting on Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. The 74 used public records to expose how Gaggle’s algorithm and its hourly paid workers sift through billions of student submissions each year for references to violence and self-harm, subjecting young people to constant digital surveillance with significant implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported a 20% increase in activity during the pandemic.

Gaggle and Bark did not respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward to “continuing our dialogue with Senators Warren and Markey on the important topics they raised.”


Gaggle monitors millions of children in the name of safety. Families targeted say it’s ‘not so smart’

“Parents expect schools to keep children safe while they’re in class, on field trips or traveling by bus,” GoGuardian spokesperson Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-provided devices.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-powered technology could be used to address the country’s “terrible history of bias in school discipline” by taking decisions out of the hands of individual teachers and administrators.

“Although any system, including AI-based solutions, inherently has some bias, if properly implemented, AI-based solutions can significantly reduce the bias that students face,” wrote Bason.

As for whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge that, in some cases, they contact the police to carry out welfare checks on students. Securly noted in its letter that in some cases, education officials “prefer that we contact public safety agencies directly instead of a district contact.”


New Research: Most Parents and Teachers Have Accepted Student Monitoring as a Safety Tool, But See the Potential for Serious Harm

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to screen and monitor students’ internet use to ensure they are not accessing material “harmful to minors,” such as pornography. Districts have cited the law as a rationale for adopting AI-based surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the scope of the law. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable search and seizure.

In a recent survey conducted by the nonprofit Center for Democracy and Technology, 81% of teachers said they use software to track student computer activity, including blocking obscene content or monitoring students’ screens in real time. A majority of parents said they were concerned about student data being shared with the police, and more than half of students said they declined to share their “real thoughts or ideas because I know what I do online is monitored.”

Elizabeth Laird, the group’s civic tech equity director, said her organization has called on student surveillance companies to be more transparent about their business practices, but that it’s “disappointing that it took a letter from Congress to obtain this information.” She said she hopes the FCC and the Department of Education will adopt the lawmakers’ recommendations.

“None of these companies investigated whether their products were biased against certain groups of students,” she said in an email while questioning their justification for delaying such an investigation. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.”

The 74’s investigation, which drew on data about Gaggle’s implementation in Minneapolis public schools, could not determine whether the tool’s algorithm disproportionately targeted Black students, who are subjected to school discipline more often than their white classmates. However, it did highlight instances in which keywords such as “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to increased scrutiny for discussing their sexual orientation.

Amelia Vance, a lawyer and student privacy expert, said she’s intrigued that companies have pushed back on the idea that their tools are being used to discipline students, because the federal monitoring requirement was intended to prevent children from consuming inappropriate content online, such as violent or sexually explicit material, and students caught doing so are likely to face consequences. She agreed that companies should examine their algorithms for potential biases and would benefit from additional transparency.


From detecting face masks to temperature checks, districts have purchased AI surveillance cameras to combat COVID. Why Critics Call Them “Smoke and Mirrors”

However, Vance said in an email that the FCC clarification “would do little at best and could provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of federal rules.

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student surveillance requirements, she said.

Asking the FCC to issue guidelines “could actually be counterproductive to the goal of limiting surveillance and ensuring more privacy for students, because it’s possible the FCC could require a higher level of surveillance.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies:


Norman D. Briggs