Surveillance

From edition e1.0.2, updated November 2, 2022

You’re reading an excerpt of Making Things Think: How AI and Deep Learning Power the Products We Use, by Giuliano Giacaglia.

If you want to keep a secret, you must also hide it from yourself.
George Orwell, 1984*

On a Saturday evening, Ehmet woke up as on any other day and decided to go to the grocery store near his home. But on the way, he was stopped by a police patrol. Using an app with face recognition, the police identified him as one of the few thousand Uyghurs living in the region. Ehmet was sent to one of the “re-education camps” holding more than a million other Uyghur Muslims.*

Even though this seems like a dystopian future, in which people are identified by an ever-present surveillance state, it is already happening under the Chinese Communist Party; George Orwell’s novel 1984 has rarely felt closer to reality. This exact scenario is unlikely to repeat in other countries, but in this chapter, I go over some companies that are using the power of AI to surveil citizens elsewhere.

One of the companies turning this dystopian version of the future into reality is Clearview AI. Police departments across the United States have been using Clearview AI’s facial recognition tool to identify citizens. In fact, Immigration and Customs Enforcement (the main immigration enforcement agency in the US), the Department of Justice, and retailers including Best Buy and Macy’s are among the thousands of government entities and companies around the world that have used Clearview AI’s database of billions of photos to identify citizens.*

The company has users at the FBI, Customs and Border Protection (CBP), Interpol, and the New York Police Department.

Clearview’s system works by crawling the open web for photos of people, building a database from those images, and grouping different photos of the same person based on their facial features.* It pulls pictures from websites like Facebook, Twitter, LinkedIn, MySpace, and even Tumblr. From these it creates a frequently updated offline database that joins together all the photos pertaining to a single person.*

Someone at a police department who wants to find a specific person can upload their picture through the Clearview AI iPhone app, which returns the person’s full name as well as other pictures associated with them.
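At its core, this kind of identification reduces to nearest-neighbor search over face embeddings: every photo is converted to a vector, and a query photo is matched against the database by vector similarity. Below is a minimal sketch of that idea. The random vectors stand in for embeddings a real face recognition model would produce, and the `identify` function and its threshold are illustrative, not Clearview’s actual system:

```python
import numpy as np

# Hypothetical database: each row is a 128-dim face embedding (a stand-in
# for the output of a real face recognition model) tied to an identity.
rng = np.random.default_rng(seed=42)
db_embeddings = rng.normal(size=(1000, 128))
db_names = [f"person_{i}" for i in range(1000)]

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

db_unit = normalize(db_embeddings)

def identify(query_embedding, threshold=0.9):
    """Return (name, similarity) of the best match, or None below threshold."""
    scores = db_unit @ normalize(query_embedding)  # cosine similarities
    best = int(np.argmax(scores))
    if scores[best] >= threshold:
        return db_names[best], float(scores[best])
    return None

# A query built from a known embedding plus a little noise (simulating a
# new photo of the same face) should match that identity.
query = db_embeddings[17] + rng.normal(scale=0.05, size=128)
print(identify(query))
```

The threshold trades false matches against missed matches; set too low, a system like this will confidently "identify" the wrong person.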

This tool is not only being used by governmental agencies to identify citizens; it has also been used by private companies to surveil people. BuzzFeed News uncovered, through Clearview’s logs, that about 2,900 institutions have used the company’s service to search for citizens around the world.*


In the US and other countries, some law enforcement agencies are even unaware that their officers and employees are using Clearview’s services. It is worrisome that this tool is being used without any oversight.

Audio

ShotSpotter is another tool using machine learning to aid police departments around the world. It has networks of microphones deployed in 110 different communities in the US, including New York City.*

The microphones stream live audio to a central server, where an algorithm analyzes the sounds to flag possible shootings in nearby areas. Police departments using the tool are then notified of the possible shooting along with the location of the incident. Officers can either verify that the sound was in fact a gunshot or relabel it as something else.*
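Conceptually, the pipeline has three steps: classify an incoming sound, alert police with the sensor’s location if it looks like a gunshot, and let a human reviewer confirm or relabel the machine’s verdict. Here is a minimal sketch of that loop, with a toy rule standing in for ShotSpotter’s proprietary acoustic classifier (all names, thresholds, and fields are illustrative assumptions):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    sensor_id: str
    location: tuple                    # (latitude, longitude) of the microphone
    label: str                         # machine-assigned label, e.g. "gunshot"
    confidence: float
    human_label: Optional[str] = None  # filled in after police review

def classify(loudness_db: float, duration_s: float):
    # Toy stand-in for the real acoustic classifier: short, very loud
    # impulses are flagged as gunshots; everything else is "other".
    if loudness_db > 120 and duration_s < 0.5:
        return "gunshot", 0.8
    return "other", 0.6

def process(sensor_id, location, loudness_db, duration_s, alerts):
    # Classify the sound; if it looks like a gunshot, file an alert
    # (i.e., notify the police department with the sensor's location).
    label, conf = classify(loudness_db, duration_s)
    if label == "gunshot":
        alert = Alert(sensor_id, location, label, conf)
        alerts.append(alert)
        return alert
    return None

def review(alert: Alert, human_label: str):
    # Reviewers can confirm the machine's label or relabel the sound.
    alert.human_label = human_label

alerts = []
a = process("mic-04", (40.7128, -74.0060), loudness_db=130, duration_s=0.2, alerts=alerts)
review(a, "fireworks")   # a human overrides the machine's "gunshot" label
```

Note that the human review step rewrites the record: once a sound is relabeled, the system’s output reflects a judgment call, not raw evidence, which is exactly what makes the courtroom use discussed below contentious.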

The technology is not only being used by police departments to detect possible shootings, but also by prosecutors as evidence of crimes, even though ShotSpotter has not been fully tested for accuracy.

That is troubling: the tool’s accuracy has not been proven, and it could falsely label other sounds as gunshots. The Associated Press found that ShotSpotter evidence has been used in 200 court cases nationwide. Could this lead to innocent people ending up in jail?

In one such case, court records show that ShotSpotter initially labeled a sound as fireworks. A human then relabeled it as a gunshot, and it was used as evidence in the case. Either the human or the machine was wrong, and neither possibility is reassuring.

Recommendation Algorithms

We’ve all been there. You start watching a video on YouTube. Before you realize it, it’s 1 a.m., and you are watching videos about Greek philosophers and their influence on the modern world.

This is known as the “YouTube rabbit hole”—the process of watching YouTube videos nonstop. Most of these videos are suggested by YouTube’s recommendation algorithm, which determines what to show you based on your watch history and those of other users.

TikTok, Netflix, Twitter, Facebook, Instagram, Snapchat, and virtually every other service that presents content rely on an underlying algorithm that determines what material is shown to each user. This is what drives YouTube’s rabbit hole.
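A common way such systems combine your watch history with other users’ is collaborative filtering: find users whose histories resemble yours, and recommend what they watched that you haven’t. A minimal sketch with a toy user-video matrix follows; the data and scoring are illustrative, not any platform’s actual algorithm:

```python
import numpy as np

# Hypothetical watch-history matrix: rows are users, columns are videos;
# a 1 means the user watched that video.
watch = np.array([
    [1, 1, 0, 0, 1],   # user 0
    [1, 1, 1, 0, 0],   # user 1
    [0, 0, 1, 1, 0],   # user 2
])

def recommend(user: int, k: int = 2):
    """Score unwatched videos by how often similar users watched them."""
    # Cosine similarity between this user's history and every user's history.
    norms = np.linalg.norm(watch, axis=1)
    sims = watch @ watch[user] / (norms * norms[user])
    sims[user] = 0.0                       # ignore the user themselves
    scores = sims @ watch                  # weight each video by user similarity
    scores[watch[user] == 1] = -np.inf     # never re-recommend watched videos
    return list(np.argsort(scores)[::-1][:k])

print(recommend(0))
```

For user 0, the most similar history belongs to user 1, so video 2 (which user 1 watched but user 0 hasn’t) scores highest. Production systems layer deep models and engagement signals on top, but the core feedback loop is the same, which is why these algorithms so reliably keep people watching.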
