
Letter to Congress on the Kids Online Safety Act (KOSA)

Today, we join a group of 90 other LGBTQ+, human rights, civil liberties, and tech policy organizations across the country in sending a coalition letter to Congressional offices opposing S. 3663, the Kids Online Safety Act (KOSA). KOSA’s supporters want it voted on as part of the final omnibus spending package for the 117th Congress. This is a bad idea.

Despite its bipartisan sponsors’ insistence that KOSA is “non-controversial,” it poses a serious threat to youth privacy, safety, and access to information. We’re already gaining news and media traction, but the fight must continue. Please read and share the coalition letter below.

Why we need to oppose KOSA

The Kids Online Safety Act (KOSA), led by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), would require platforms and other sites accessed by minors (those 16 and under) to mitigate the risk of emotional and physical harm to young users and to prevent the marketing of illegal or harmful content to minors. Its goals are commendable, but it risks doing severe damage to privacy rights and to young people from marginalized groups.

Sec. 10 of KOSA gives additional enforcement authority to state attorneys general, within a context of state politics that is hostile to gay and transgender people. Any state attorney general may bring a civil action against an internet service provider if they have reason to believe KOSA’s regulations have been violated, allowing the law to be weaponized in line with contemporary anti-LGBTQ+ politics.

Current right-wing moral panics centered on LGBTQ+ people place heavy emphasis on the role of the internet and social media in forming sexual and gender identity. Take this Heritage Foundation report, for example, which argues that KOSA is necessary in order to “solve” a correlation between increased online activity and transgender identity that the report treats as a pathological, causative relationship. Florida’s recent passage of HB 1557 (the “Don’t Say Gay” bill), the conservative co-optation of the word “groomer” to label LGBTQ+ people as sexual abusers, and the banning of books with LGBTQ+ themes from school libraries all illustrate how homophobic political attacks at the state level draw on broader appeals to children’s safety and wellbeing.

The “Duty of Care” KOSA outlines for online service providers requires platforms to mitigate and prevent “sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material.” While this is a laudable goal, KOSA’s broad content-filtering requirements would preclude LGBTQ+ and other youth from accessing helpful online resources, including those relating to sex and relationships. There is a long history in this context of governments over-moderating and over-filtering in ways that would break one of the internet’s chief benefits: finding community with people going through the same kinds of experiences in other parts of the world.

KOSA requires platforms to enable parental supervision of minors’ use of their services. Parental controls are important tools to ensure children can navigate the internet safely. However, KOSA’s definition of a “minor” effectively enables parental surveillance of older minors, such as 15- and 16-year-olds. Older minors have their own rights to privacy and access to information. Sadly, not every parent-child relationship is constructive or healthy, and KOSA risks subjecting teenagers experiencing domestic violence and parental abuse to additional forms of surveillance. This hurts young people’s access to help in the situations when they need it most.

KOSA stipulates that platforms “reasonably likely to be used” by anyone under the age of 17 (virtually all online services) place stringent limits on minors’ use of their services. This includes, but is not limited to, restricting the ability of users to find a minor’s account and disabling features, like notifications, that incentivize use of online services. These requirements are fundamentally at odds with the privacy rights of all users. Service providers will face strong incentives to implement age-verification techniques to distinguish adults from minor users, which could include requiring submission of more personal information and government-issued documents to verify a user’s age, or more widespread deployment of facial recognition. Personal information is highly susceptible to data breaches, and this forced de-anonymization may scare young people away from accessing sensitive information.

Such a controversial bill, with such significant impacts on privacy, ought not to be attached to a huge must-pass omnibus spending bill. Protecting teenagers online should not mean forcing them to de-anonymize, insisting that their parents or guardians be able to see whatever they’re looking at, or incentivizing platforms to pre-emptively sterilize their content for fear a teenager might look at it.