The latest version of the Kids Online Safety Act (KOSA) aims to remove online information that people need to see, and not just children: people of all ages. Letting governments, state or federal, decide what information everyone is allowed to see is a dangerous business. On top of that, this bill, supposedly designed to protect our privacy, would actually force tech companies to collect more user data than they already do.
EFF has long supported comprehensive privacy protections, but the details matter. KOSA consistently gets the details wrong, and that’s why we’re calling on members of Congress to oppose this bill.
Although KOSA has been overhauled, and slightly improved, since lawmakers introduced it in February, it is still a dangerous bill that presents censorship and surveillance as the solution to some legitimate, and some less legitimate, problems facing young internet users today.
KOSA is a sweeping update to the Children’s Online Privacy Protection Act, also known as COPPA. COPPA is why many websites and platforms ask you to confirm your age, and why many services require their users to be over 13: laws protecting data privacy are much stricter for children than for adults. Lawmakers have been hoping to expand COPPA for years, and there have been some good proposals to do so. KOSA, for its part, includes some good ideas: more people should be protected by privacy laws, and the bill expands COPPA protections to include minors under the age of 16. That would do a lot of good, in theory: the more people we can protect under COPPA, the better. But why stop at protecting the privacy of minors under 16? EFF has long supported comprehensive data privacy legislation for all users.
Another good KOSA provision would require sites to allow underage users to delete their account and personal data and to restrict the sharing of their geolocation data, as well as to provide notice when they are being tracked. Again, EFF believes that all users, regardless of age, should have these protections, and expanding them gradually is better than the status quo.
But KOSA’s main goal is not to protect the privacy of young people. The main purpose of the bill is to censor a wide swath of speech in response to concerns that young people are spending too much time on social media and encountering harmful content too often. KOSA requires sites to “prevent and mitigate mental health disorders,” including the promotion or exacerbation of “self-harm, suicide, eating disorders and substance use disorders.” Make no mistake: this is a requirement that platforms censor content.
This set of content restrictions wouldn’t just apply to Facebook or Instagram. Platforms covered by KOSA include “any online platform that connects to the Internet and is used, or is reasonably likely to be used, by a minor.” As we’ve said before, this would likely encompass everything from Apple’s iMessage and Signal to web browsers, messaging apps and VPN software, as well as platforms like Reddit, Facebook and TikTok — platforms with vastly different user bases and uses, and with vastly different content moderation capabilities and expectations.
Many online services would thus be forced to make a choice: over-filter to ensure that no one encounters content that could even ambiguously be interpreted as harmful, or raise the age limit for users to 17. Many platforms may well do both.
Let’s be clear about the dangerous consequences of KOSA’s censorship. Under its vague standard, both adults and children will lose access to medical and health information online. It will be nearly impossible for a website to make case-by-case decisions about which content promotes self-harm or other disorders and which provides necessary health information and advice to those who suffer from them. This will have a disparate impact on children who lack the family, social, financial or other means to obtain health information elsewhere. (Research shows that a large majority of young people have used the internet for health-related research.)
Another example: KOSA also requires these services to ensure that young people do not see content that exacerbates a substance use disorder. At first glance, this may seem quite simple: just remove content that talks about drugs or hide it from young people. But how do you find and label such content? Simply put: not all content that talks about drugs exacerbates their use.
There is no realistic way to search for and filter just this content without also removing a huge amount of beneficial content. To take just one example, social media posts describing how to use naloxone, a drug that can reverse an opioid overdose, could be seen either as promoting self-harm, because it reduces the potential danger of a fatal overdose, or as providing necessary health information. But KOSA’s vague standard means a website owner is in a better legal position if they remove that information, avoiding a possible later claim that the information is harmful. This will reduce the online availability of important and potentially life-saving information. KOSA pushes website owners toward government-sanctioned censorship.
The ugly
To ensure that users are the correct age, KOSA compels extensive data collection efforts that perversely lead to even greater potential invasions of privacy.
KOSA would authorize a federal study on creating an age verification system at the device or operating system level, “including the need for any hardware and software changes.” The end result would likely be an elaborate age verification system run by a third party that maintains a huge database of all internet user data.
Many of the risks of such a program are obvious. It would require every user, including children, to hand over private data to a third party just to use a website, if that user wants to get past the government’s “parental” controls.
Additionally, the bill lets Congress decide what is appropriate for children to see online. This verification system would make it much more difficult for actual parents to make individual choices for their own children. Because it is so hard to differentiate between minors discussing these topics in a way that encourages them, as opposed to a way that discourages them, the safest course of action for services under this bill is to block children and teenagers from discussing or viewing these subjects at all. If KOSA passes, instead of letting parents decide what young people see online, Congress will do it for them.
A recent study on attitudes toward age verification showed that most parents “are willing to make an exception or allow their child to completely bypass the age requirement, but then demand direct account monitoring or discussions on how to use the app safely.” Many also fudge the numbers a bit, to make sure websites don’t have their children’s specific birthdays. With the hard-wired national age verification system envisioned by KOSA, it will be much more difficult, if not impossible, for parents to decide for themselves which sites and content a young person may encounter. Instead, the algorithm will do it for them.
KOSA also fails to recognize the reality that some parents do not always have their children’s best interests in mind, or are unable to make appropriate decisions for them. These children would suffer under KOSA’s paternalistic regime, which requires services to set parental controls at the highest level for those under the age of thirteen.
KOSA is a poor substitute for true online privacy
KOSA’s attempt to improve privacy and safety will actually harm both. Instead of using invasive age verification to determine who gets the most privacy protections, and then using that same determination to restrict access to massive amounts of content, Congress should focus on creating strong privacy guarantees for everyone. Strong privacy protections that prohibit the collection of data without voluntary consent would address concerns about children’s privacy while making age verification unnecessary. Congress should take privacy seriously and pass legislation that creates a strong, comprehensive privacy foundation with robust enforcement tools.