
Gaggle: A life-saving tool or a privacy nightmare?

While Google monitors our online actions for marketing purposes, Gaggle aims to replace our schools' social workers and mental health specialists. Would you like an AI-powered application to be your mentor and counselor? Is Gaggle a life-saving tool or just another threat to our privacy?

Dec 23, 2021

4 min read

kids safety

What is Gaggle?

Gaggle is AI-powered school-monitoring software that scans students' communication from school-issued Google and Microsoft accounts. Its primary purpose is to inform school authorities in a timely manner if there is a threat to students' physical or mental well-being and protect them from harmful content. Its ultimate goal is to prevent possible tragic events.

Gaggle immediately alerts school officials when keywords of concern appear in its systems (e.g., words indicating illicit behavior, strong language, potential cyberbullying, or mentions of self-harm). Access to such data potentially allows school authorities to intervene early and mitigate the consequences.

Many schools pay significant money for its services, and Gaggle monitors around 5 million students in the US. While it may "stop tragedies with its real-time content analysis," it introduces school monitoring on an unprecedented scale and raises privacy concerns. It handles particularly sensitive information, such as students' mental health, that could potentially be exposed to the public.

How Gaggle works

Gaggle scans all of the students' Microsoft and Google content, including Slides, Spreadsheets, and Hangouts, looking for keywords that could indicate possible problems. This job is done by artificial intelligence and human moderators, who immediately notify the authorities if they find something suspicious. Commonly flagged terms include "suicide," "heroin," "kill you," and "kill myself."
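To make the scanning step concrete, here is a minimal sketch of how keyword flagging might work. The watchlist, function name, and matching logic are illustrative assumptions, not Gaggle's actual implementation:

```python
import re

# Hypothetical watchlist of concerning terms (illustrative only,
# not Gaggle's real list).
FLAGGED_TERMS = ["suicide", "heroin", "kill you", "kill myself"]

def scan_document(text: str) -> list[str]:
    """Return every watchlist term found in a piece of student content."""
    lowered = text.lower()
    hits = []
    for term in FLAGGED_TERMS:
        # Word-boundary match so substrings of unrelated words aren't flagged.
        if re.search(rf"\b{re.escape(term)}\b", lowered):
            hits.append(term)
    return hits

# A hit would be escalated to human moderators for review.
print(scan_document("I feel like I want to kill myself"))  # ['kill myself']
```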

Gaggle also categorizes flagged phrases by severity. Its safety management team determines which category the content falls into and takes the appropriate action: it may simply delete the content, warn the student, or, in more severe cases (e.g., when there is a threat to a person’s physical or mental well-being), alert the school’s authorities.
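The triage step could be pictured as a simple mapping from severity tier to response. The tier names and thresholds below are assumptions for illustration; the article names these three actions, but Gaggle's exact rubric is not public:

```python
from enum import Enum

# Hypothetical severity tiers (illustrative assumption).
class Severity(Enum):
    LOW = 1     # e.g., strong language
    MEDIUM = 2  # e.g., possible cyberbullying
    HIGH = 3    # e.g., threat to physical or mental well-being

def respond(severity: Severity) -> str:
    """Map a severity tier to one of the actions described above."""
    if severity is Severity.HIGH:
        return "alert school officials immediately"
    if severity is Severity.MEDIUM:
        return "send a warning to the student"
    return "delete the flagged content"

print(respond(Severity.HIGH))  # alert school officials immediately
```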

Potential threats

Qualifications of employees

The lack of transparency in Gaggle’s mode of operation and the qualifications of its staff raise questions. While the company boasts that its staff members "have degrees in fields such as criminal justice, psychology, communications, health administration, and education," it’s unclear whether this information applies to all levels of employees. We also know little about the training and certifications Gaggle provides to its staff.

More concerning is the fact that Gaggle’s job ads for Level 1 employees don’t ask for experience in mental health or young-adult counseling, though applicants need to know social media slang. It’s also unclear whether these employees receive any benefits or insurance. These factors raise doubts about whether such sensitive data is handled properly.

Privacy violations

While Gaggle claims that it has saved hundreds of lives from suicide and flagged 722 very specific suicide-related references, it is difficult to verify whether Gaggle played a substantial role in such prevention. Critics argue that its intensive surveillance is not justified and is harmful to students. Parents and students have also expressed concern over third-party monitoring of such sensitive and private matters.

Data leaks

Like most online services, Gaggle is not immune to data leaks. The consequences could be tragic if such sensitive data fell into the wrong hands or were exposed to the public. Given that data breaches happen around the world every day, this possibility is far from theoretical.

Possible inaccuracies

While Gaggle’s hiring procedures raise questions, its use of artificial intelligence could also lead to inaccuracies that cause additional distress for the people involved. AI algorithms are not perfect and can misinterpret phrases when they fail to evaluate context: a scanner might flag a history essay that discusses suicide statistics just as readily as a genuine cry for help.
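The toy matcher below (a self-contained, hypothetical example, in the same spirit as the sketch above) shows how a context-free keyword scan produces exactly this kind of false positive:

```python
import re

# Toy, context-free matcher (illustrative assumption, not Gaggle's code).
FLAGGED_TERMS = ["suicide"]

def scan_document(text: str) -> list[str]:
    return [t for t in FLAGGED_TERMS
            if re.search(rf"\b{re.escape(t)}\b", text.lower())]

# An academic essay triggers the same flag as a genuine cry for help:
essay = "This paper examines suicide rates during the Great Depression."
print(scan_document(essay))  # ['suicide'] -- likely a false positive
```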

Future implications

For tools like Gaggle to work as intended, schools need to cooperate with trained mental health specialists and clinicians who can offer students professional help. Moreover, Gaggle should hire only highly trained staff and give them the conditions they need to do their job properly.

At the moment, Gaggle’s processes are not clearly defined, and it is still an open question whether the tool does more harm than good.




Paulius Ilevičius

Paulius Ilevičius is a technology and art enthusiast who is always eager to explore the most up-to-date issues in cybersecurity and internet freedom. He is always in search of new and unexplored angles to share with his readers.