

Can police abuse your data to harass you?

Imagine a police department that could predict crimes before they happen. Steven Spielberg’s sci-fi classic Minority Report explores the risks of such a system, but not everyone got the message. Law enforcement agencies are rushing to deploy crime prediction programs, facial recognition, and digital surveillance. One county’s abuses, however, illustrate how these technologies can strip us of our basic rights.

Sep 30, 2020


Raiding for crimes you have not committed

In 2011, the Pasco County sheriff introduced a new crime prediction program. It generates lists of people who might break the law based on arrest histories and other “unspecified intelligence.”

A computer generates a new list of potential offenders every 90 days, and police analysts then decide whom to visit to check whether they’re being lawful.

But instead of protecting society and preventing crime, Pasco County deputies started harassing people, visiting them in the middle of the night, interrogating them for no reason, and arresting them for frivolous infractions.

In 2018, 15-year-old Rio Wojtecki was arrested for stealing motorized bicycles and already had a juvenile probation officer checking on him regularly. Nevertheless, Rio ended up on the list of potential offenders, and deputies visited his house 21 times between September 2019 and January 2020.

A disturbing trend of abuse

Crime prediction programs have already been tried elsewhere in the US. The Los Angeles Police Department shut down a similar program after it was revealed that half of the people in its database had one or no violent crime arrests.

The Pasco County sheriff’s office claimed that the system only uses criminal data but refused to release its database or reveal how much intelligence it possesses. Considering the office’s aggressive tactics, many questions remain unanswered.

The concept of predictive policing ignores the presumption of innocence. Rather than being “innocent until proven guilty”, people are automatically perceived as potential offenders and may even be pushed towards crimes they would not have committed otherwise.

Scraping photos from social media accounts

In August 2020, the New York City Police Department arrested Black Lives Matter activist Derrick Ingram for allegedly shouting into a police officer’s ear with a bullhorn. Ingram was tracked using facial recognition software and his apartment was surrounded by police officers.

The NYPD claimed that they only compared video footage with lawfully possessed arrest photos for their investigations, but during Ingram’s arrest, one officer was holding a picture taken from his Instagram account.

Earlier this year, a report revealed that the NYPD regularly uses facial recognition software from Clearview AI, a controversial firm known for amassing a database of billions of photos scraped from social media and the web.

Clearview AI collaborates with hundreds of government agencies, companies, and individuals around the world. The FBI, Customs and Border Protection, Interpol, and dozens of police departments use its facial recognition software. Every picture and video you have posted on Facebook, Instagram, YouTube, or anywhere on the internet might have ended up in their database.

There are no strict guidelines for who can gain access to the database, raising concerns about potential abuse. Furthermore, Clearview AI suffered a data breach in 2020 in which its list of customers was accessed. Such an extensive database is likely to attract hackers again, putting our personal information at risk.

Nowhere to hide

Law enforcement agencies are also known to use so-called stingrays — cell tower simulators that trick phones into connecting to a fake tower to track your location, intercept communications, or even inject malware.

Any smartphone in the vicinity might connect to the fake tower and have its information extracted. While this technology can help catch wrongdoers, it can also feed extensive databases accessible to law enforcement or companies like Clearview AI.

Stingrays are routinely used to target suspects in criminal investigations, but many believe the technology was also deployed at Black Lives Matter demonstrations and other protests across the US. Many legal critics also consider stingrays a form of unconstitutional warrantless search or tracking.

How to protect your privacy

Use end-to-end encryption. Messaging services such as WhatsApp, Signal, or Telegram offer end-to-end encryption. Even if somebody intercepts your messages, they won’t be able to read them.
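To see why an intercepted message is useless without the key, here is a toy sketch of the underlying idea using a one-time pad in Python. This is a deliberately simplified illustration, not what WhatsApp or Signal actually implement (they use far more sophisticated protocols), and the function and variable names are our own:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the data with the corresponding key byte.
    # Applying the same key twice recovers the original bytes.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"Meet me at the protest at noon"

# A random key as long as the message, shared only between
# sender and recipient — never with the network in between.
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)   # what an eavesdropper sees
plaintext = xor_bytes(ciphertext, key) # what the recipient recovers

assert plaintext == message  # only the key holder can read it
```

Anyone capturing the ciphertext in transit sees only random-looking bytes; without the key, the original text is unrecoverable.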

Turn off Google location tracking. While this feature can provide you with traffic updates, targeted ads, and recommendations, it also draws a detailed picture of your whereabouts. If you happen to be near a crime scene, you can become a prime suspect in a crime you haven’t committed.

Be cautious about your posts on social media. Your social media accounts are goldmines for companies like Clearview AI. The more personal details you reveal online, the more likely they are to be misused and end up in some secret database without your consent.

Get yourself a VPN. A VPN masks your IP address and encrypts your traffic. With NordVPN, you can avoid surveillance, access your favorite services securely, and take your privacy into your own hands.


Karolis Bareckas

Karolis is a tech geek who writes about cybersecurity, online privacy, and the latest gadgets. When not rattling his keyboard, he’s always eager to try a new burrito recipe or explore a new camping spot.