Notice & Comment

The Costs of KOSA?, by Lawrence J. Spiwak

It is widely believed that social media is, at least in part, responsible for the deteriorating mental health of America’s adolescents and teens.  Politicians have taken notice, and in an election season they are keen to score points with voters back home.  Yet, succumbing to the pressure to do something by November risks ill-formed legislation.  Such is the case with the Kids Online Safety Act or “KOSA.”  

The ostensible purpose of KOSA, as the name suggests, is to regulate social media companies to protect America’s youth from the ills of the Internet.  At the time of this writing, KOSA has more than 60 co-sponsors in the Senate, which makes passage in that chamber likely (although nothing is guaranteed in politics).  But as is typical with many pieces of legislation today, rather than take a narrow, targeted, and informed approach, KOSA tries to solve a complex problem with a too-broad approach that ignores unintended consequences. 

Under the current text of the bill, KOSA would first impose upon Internet platforms a duty to “exercise reasonable care in the creation and implementation of any design feature” in order “to prevent and mitigate” an assortment of “harms to minors.”  These “harms to minors” include, but are not limited to, perceived social ills such as anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.  KOSA also focuses on patterns of use that indicate or encourage addiction-like behaviors by minors, along with physical violence, online bullying, and harassment of minors.  Finally, KOSA imposes a duty of care to prevent sexual exploitation and abuse of minors.

Second, to ensure that the platforms are complying with this duty of care, KOSA requires each covered platform to issue an annual public report describing “the reasonably foreseeable risks of harms to minors” and to assess “the prevention and mitigation measures taken to address such risks.”  However, this mandatory report is not something a covered platform can self-produce; it must be prepared by an independent, third-party auditor through a reasonable inspection of the covered platform.

These mandated third-party audits will not be innocuous.  The independent auditor must provide, among other items, (1) an assessment of the reasonably foreseeable risk of harms to minors posed by the covered platform, specifically identifying those physical, mental, developmental, or financial harms described above; (2) a description of whether and how the covered platform uses design features that encourage or increase the frequency, time spent, or activity of minors on the covered platform, such as infinite scrolling, auto-playing, rewards for time spent on the platform, notifications, and other design features that result in compulsive usage of the covered platform by the minor; (3) a description of whether, how, and for what purpose the platform collects or processes categories of personal data that may cause reasonably foreseeable risk of harms to minors; (4) an evaluation of the efficacy of the platform’s safeguards for minors and parental tools, and any issues in delivering such safeguards and the associated parental tools; (5) an evaluation of any other relevant matters of public concern over risk of harms to minors associated with the use of the covered platform; and (6) an assessment of differences in risk of harm to minors across different English and non-English languages and the efficacy of safeguards in those languages.

Finally, KOSA will be enforced by the Federal Trade Commission (“FTC”) using its authority under Section 5 of the Federal Trade Commission Act to guard against “unfair or deceptive acts or practices.”  Moreover, KOSA directs the FTC to issue within eighteen months of enactment guidance on how it would enforce KOSA, including how the Agency will determine whether a platform “has knowledge fairly implied on the basis of objective circumstances” that a specific user is a minor.

While protecting kids online may be a worthy social goal, any novel legislation is subject to the Law of Unintended Consequences.  Thus, before KOSA is enacted into law, policymakers need to ask themselves two very important threshold questions:  

First, do the data even support such a massive government intrusion into the market?  The answer to that question is not as clear as the media portrays.  Empirical evidence is decidedly mixed and subject to several valid criticisms. 

In the largest study to date, the Oxford Internet Institute recently examined data from 2.4 million persons in 168 countries between 2005 and 2022, and the study’s authors conclude that the evidence does not support the hypothesis that the Internet is “actively promoting or harming either well-being or mental health globally.”  Yet, an impressive study published in the American Economic Review in 2022 found a link between social media use and mental well-being, though it did not focus on adolescents and teens.  

While there are a large number of studies on the link between Internet use and youth mental health, much of the literature is based on cross-sectional analysis, raising questions about whether the measured effects are causal in nature.  A recent economic analysis by Phoenix Center Chief Economist Dr. George S. Ford examined the methods and some of the data used in the literature.  This analysis confirmed that cross-sectional analysis tends to produce poor measures of the relationship of interest and thus misplaced confidence in the findings of such studies.  The bias in the estimated relationships in these studies may be large or small, may be positive or negative, and varies by the measures of mental health outcomes.  Since most of the studies on such a relationship use cross-sectional data, Dr. Ford concludes that while “the youth mental health crisis is an important issue, … questions regarding whether there exists a sufficiently robust body of evidence today to justify regulating social media services cannot be ignored.”

Second, even if the data support some sort of intervention, we must then ask whether the benefits of KOSA will outweigh its compliance costs.  Several factors suggest “no.”

To begin, as highlighted above, KOSA’s compliance costs are far from innocuous.  Not only will KOSA require covered platforms to hire an army of childhood psychologists to ensure their design features do not promote the myriad mental illnesses detailed in the statute, but the consulting industry that will be retained to conduct KOSA’s mandatory extensive third-party audits will also be required to do the same.  If the DEI industry is any indication, then all the statute will achieve is the creation of a multi-billion-dollar KOSA compliance industry which consumers will ultimately have to pay for.  At minimum, before KOSA moves further down the legislative path, someone should conduct a credible cost/benefit analysis so that policymakers can make an informed—rather than emotional—vote.

Which brings us to enforcement by the Federal Trade Commission.

As also noted above, the FTC is directed to issue within eighteen months of enactment guidance on how it would enforce KOSA, including how the Agency will determine whether a platform “has knowledge fairly implied on the basis of objective circumstances” that a specific user is a minor.  Given this very broad and highly subjective standard, logic dictates that if a firm is going to be heavily penalized for inappropriately targeting minors, then that firm will want to mitigate those risks to the extent practicable.  The easiest way is to ask for some sort of age verification, which—by definition—will require the platform to collect more, not less, personal data about their customers.  This mechanism, by extension, will then raise significant privacy and First Amendment concerns.

More concerning, can the FTC even be trusted to act in a dispassionate and scholarly way when enforcing KOSA?  At least for the foreseeable future, probably not.

While FTC leadership changes with each presidential administration, the FTC under the current leadership of Chair Lina Khan is a hot mess.  To borrow just one quote from the recently released Interim Staff Report by the House Judiciary Committee, “Chair Khan’s radicalism, inexperience, and imprudence squandered the momentum and continues to hamper the ability of the FTC and career federal civil servants to do their jobs well on behalf of the American people. *** [The] FTC under Chair Lina Khan is in chaos.”  And with traditional norms at the FTC now shattered, who knows if the next Chair—regardless of who wins the upcoming election—will be able (much less willing) to restore confidence in the once-respected agency.

With questionable empirical support justifying a massive intrusion into the market, KOSA is clearly not ready for prime time.   If legislation is desired, then a more targeted and less burdensome approach should be developed.  

In the meantime, if social media is bad for kids, then parents remain the first and best line of defense.  As Senator John Kennedy of Louisiana recently stated on the Senate Floor, “[n]othing disturbs me more than the notion that a child’s upbringing should be determined by some bureaucrat rather than by the child’s parents. *** It makes me want to throw up.”  

Wise words indeed.

Lawrence J. Spiwak is the President of the Phoenix Center for Advanced Legal & Economic Public Policy Studies (www.phoenix-center.org), a non-profit 501(c)(3) research organization that studies broad public-policy issues related to governance, social and economic conditions, with a particular emphasis on the law and economics of the digital age.