They made a public Rolodex of our faces. Here’s how I tried to get out.

The technology, known as Clearview AI, allows law enforcement agencies and private businesses to identify individuals in photographs and videos by comparing their faces against a vast database of images scraped from the internet.
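At a high level, systems like this convert each face photo into a numeric "embedding" and then search a database of stored embeddings for the closest match. The short Python sketch below illustrates only that matching step, with made-up embeddings and an assumed similarity threshold; it is a generic illustration of the technique, not Clearview's actual code.

import numpy as np

# Hypothetical sketch: match a query face embedding against a stored database.
# The embeddings here are random placeholders; a real system would compute them
# with a trained face-recognition model, which is not shown.

rng = np.random.default_rng(0)
database = {
    "person_a": rng.normal(size=128),  # name -> 128-dim face embedding (fake)
    "person_b": rng.normal(size=128),
}

def cosine_similarity(a, b):
    # 1.0 means the two embeddings point in exactly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query, threshold=0.6):
    # Return the best-matching identity, or None if nothing is close enough.
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

print(identify(database["person_a"]))  # prints "person_a"

In a real deployment, the embeddings would come from a trained face-recognition model and billions of them would be indexed for fast approximate search, but the basic idea of nearest-match lookup is the same.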

But it’s not just law enforcement using this technology. Clearview’s clients have included a wide range of private businesses, from retailers to landlords.

And the company’s business model relies on scraping faces from the internet without explicit consent. In other words, they are creating a database of people’s faces without their knowledge or permission.

For some people, this is a privacy nightmare. They worry that their faces will be used without their consent, potentially leading to discrimination, harassment, or even violence.

One person particularly concerned about Clearview is a New York City resident named Daniel, who had been using the company’s technology to track down people opening credit card accounts in his name.

“I’m scared of being used,” he said. “This technology is really dangerous. If they can track you down like that, they can use that information to target you. You have no idea what someone will do with your image. There’s really no accountability.”

Daniel decided he needed to get out of this database. He contacted Clearview and requested that his face be removed. But Clearview refused.

Clearview’s response, like much about the company, was vague.

“We are committed to ensuring that our technology is used ethically and responsibly,” said Clearview CEO Hoan Ton-That in a statement. “We are actively working with regulators and lawmakers to ensure that our technology is used appropriately.”

Daniel decided he needed a more drastic approach: he would take on Clearview himself.

He contacted lawyers and journalists, most of whom seemed surprised that Clearview even existed. Daniel began to realize just how widespread the technology was, and how many people were unaware of its potential impact. He was, he realized, just one of many.

He posted about his struggles on Twitter, which attracted thousands of responses. Daniel became the face of a growing movement against the technology, using his story as an example to highlight its potential harm.

“It was really crazy,” he said. “I thought I was just a random guy, trying to protect myself from someone else using my identity. Then I found out that this company had already made a whole database of millions of people’s faces, including my family, my friends, and even my coworkers.” He decided he was going to take down the database himself, one face at a time, even if it meant taking down the company.

Daniel began reaching out to anyone he knew, trying to build an army of his own. “Let’s go make this company understand that the faces they’re storing belong to people,” Daniel said. He quickly found he wasn’t alone: more and more people had come to see the risks of Clearview’s unchecked use of their data.

He then took the case directly to Clearview. With an online coalition at his side, Daniel found his requests gained a little more attention. For him, though, it felt more personal: a simple attempt to recover some basic control over his life. This wasn’t a crusade for internet privacy; it was a battle against the erosion of his basic autonomy, a battle for his identity.

Daniel eventually managed to contact one of Clearview’s high-level investors and learned the company was about to raise another round of funding.

Facing a difficult legal battle but sensing a vulnerability, Daniel and the coalition made their move. They publicly released the document they had obtained, revealing that Clearview was planning to partner with governments and police departments. Clearview eventually shut down its operations in some countries, at least temporarily, and the public attention ultimately forced the company to go quiet and abandon its plans.

“We showed that Clearview AI can’t function without public consent,” Daniel said. “They only have as much power as they’re granted. If the public decides this type of technology is a problem, it can stop it. The moment the public pushback started, their investors ran for the hills.”

This victory over one facial recognition company may not be the end of the story. It may well become a story for the history books as new, less visible surveillance technologies emerge, bringing similar risks.

In this time of technological change, however, people like Daniel remain determined. Even against giants, the fight for freedom has found its way to our front steps. His story stands as a reminder that, as individuals and as a society, we are at a critical juncture: we can remain complacent and silent, or we can engage, organize, and fight.
