As our lives have moved increasingly online, keeping kids safe in the digital world has become harder. Big tech and big data now scrape as much information from us as they can and sell it to advertisers. Privacy rights are important for everyone, but especially for young people: they deserve to grow up in a world where their past stays their past and big data isn't building a permanent marketing profile of them at an impressionable age, without their consent.
Because our digital world is changing so rapidly, we can no longer rely on laws written more than two decades ago to properly regulate how marketers and tech companies handle children's and teens' data. That is why Restore the Fourth is endorsing the Children and Teens' Online Privacy Protection Act (CTOPPA), an amendment that will bring COPPA, originally drafted in the 1990s, into the modern age.
Key improvements CTOPPA makes to COPPA include:
- Protect teens, as well as children, online.
- Ban targeted advertising aimed at minors.
- Prohibit the collection of data from 13-to-15-year-olds without the user's consent.
- Create an "eraser button" that allows minors to delete personal data stored by big tech.
📝 Write a letter to your lawmakers telling them to support it here.
☎️ Call your lawmakers and urge them to support it at: +1 202-318-3323
Private retailers' use of facial recognition technology (FRT) has been in the news lately, with a story out of Detroit about a young Black girl ejected from a private business due to an FRT misidentification. Lamya Robinson was kicked out of a roller skating rink after the facial recognition system the business was using "identified" her as having been part of a fight there before. The only problem: Robinson had never actually been to that skating rink. Her mother is quoted as saying, "To me, it's basically racial profiling." And she's right: FRT is the same old racial profiling with a 21st-century, high-tech veneer of objectivity.
The tech world's biases are baked into the technology itself, which is often trained on databases filled primarily with white faces. As a result, FRT misidentifies Black and brown faces more often than white ones. It's a fallacy to believe that surveillance like this guarantees safety; the real question is safety for whom. FRT misidentifications, whether in public or private settings, can lead to dangerous contact between marginalized communities and law enforcement. There have been multiple cases of Black men wrongfully imprisoned over false FRT identifications. That's not safety; that's mass criminalization, and it harms communities.
Facial recognition technology is inaccurate and unsafe for large portions of our population, including women, LGBTQ people, and people of color. Retailers who use FRT are knowingly choosing to create environments that are not just unwelcoming but unsafe for marginalized communities. Oftentimes, shoppers have no idea what they're walking into. Even worse, they may have no choice: consider people who live in food deserts or other communities with few options for where to shop. The proliferation of FRT in retail settings will only contribute to the mass criminalization of marginalized communities and to our already racist policing system. We need to draw a line in the sand, here and now: it is simply not OK for retailers to use facial recognition technology. Sign the petition and shame the naughty list of retailers: here.