Minnesota Wants to Ban Under-18s From User-Generated Content Services

As part of an omnibus bill, the Minnesota House of Representatives passed a troubling bill restricting how under-18 users engage with user-generated content (UGC) services. [At the bottom of this post, I’ve included the text as passed by the Minnesota House.] The bill fits the “protect-the-kids” narrative that politicians champion during election years, but it’s counterproductive to that purported goal. If regulated services could determine who is under 18 (spoiler: they cannot), the bill would likely result in Minnesota minors being excluded entirely from UGC websites–depriving them of the opportunity to build communities and access content they need to grow, learn, and flourish. By freezing under-18s out of large swathes of the Internet, the bill would create a generation of digitally naïve adults–the exact opposite of what they, and our society, need to thrive in the 21st century. Furthermore, the age authentication process would expose both under-18s and adults to extra privacy and security risks. So instead of “protecting the kids,” the bill would harm Minnesotans of all ages in countless ways.

I imagine this bill is moving forward only because Minnesota parents don’t realize what the legislature intends to do to their children–and their Internet. (Unlike some other protect-the-kids laws, this one doesn’t let parents override its restrictions for their children.) As they realize the bill’s implications, I’m hoping Minnesota parents will tell their Senators to scrap this effort. Otherwise, given the bill’s obvious unconstitutionality, a court challenge seems inevitable.

What The Bill Says

The bill regulates “social media platforms,” defined as an “electronic medium” that allows “users to create, share, and view user-generated content.” This includes every service with UGC functionality, whether or not UGC is part of the service’s core offering. The bill excludes “Internet search providers, Internet service providers, or email.” Also, it only applies to social media platforms with more than 1M “account holders” (defined as people who access a “social media account,” an undefined term) “operating in Minnesota.”

The bill prohibits social media platforms from using “social media algorithms” (defined as “software used by social media platforms to (1) prioritize content, and (2) direct the prioritized content to the account holder”) to “target user-generated content at an account holder under the age of 18 and who is located in Minnesota.” In addition to that prohibition–which carries whatever default remedies Minnesota law permits–the bill supplements those remedies with a new private right of action (PRA) for “account holders” (presumably, minors or their parents acting on their behalf) “if the social media platform knew or had reason to know that the individual account holder was under the age of 18 and located in Minnesota.” In addition to other damages, the PRA authorizes statutory damages of $1k per violation, capped at $100k per account holder per calendar year.

The bill’s restrictions do not apply to:

  • “allowing content to appear in a chronological manner” (does this literally mean only chronological order, or would reverse chronological order also be OK? They are not the same thing!)
  • “An algorithm, software, or device that acts as a parental control, or an internal control used by the social media platform that is intended to control the ability of a minor to access content, or is used to filter content for age-appropriate or banned material”
  • “User-generated content that is created by a federal, state, or local government or by a public or private school, college, or university, including software and applications used by a public or private school, college, or university that are created and used for educational purposes”

Analysis of the Bill

The Bill Is Terrible Policy

It appears the bill seeks to stop social media services from using algorithms to recommend other UGC to minors (an interesting choice, because I think more parents are concerned about ads targeting kids). What would UGC services look like if they actually implemented this?

[Note: It’s unclear if the service could honor content subscription requests by minors. For example, could Twitter show a minor only tweets from the accounts that the minor affirmatively follows, or would Twitter have to show the minor all content from all sources? Remember, the law restricts any software used to “(1) prioritize content, and (2) direct the prioritized content to the account holder,” and that seemingly includes the software used to honor content subscriptions. It’s also unclear if a service could thread content by topic, or if topical organization also would be an “algorithm.” I’ll assume both subscriptions and topical threading would be OK, but the bill doesn’t actually say that.]

The bill would eliminate minors’ access to any algorithmically generated navigation aids, such as “top content” and “hot items” lists. It would also eliminate home pages that are algorithmically organized, whether personalized to individual users or not; those pages would be replaced by a list of items presented chronologically (presumably in reverse chronological order, if that’s allowed, because strict chronological order prioritizes the oldest content, which is rarely what users want). Assuming minors can subscribe to other users’ content, they will have to subscribe sparingly to avoid being overwhelmed by an incoming flow of content unhelpfully organized solely by date. Minors would also have to constantly monitor the feed to avoid missing something important (remember, popular or important content wouldn’t be highlighted, so it could be easily missed). These restrictions would turn UGC services into unusable speech venues that don’t really solve any problem that under-18s want solved.
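To make the ordering distinctions concrete, here’s a minimal sketch (in Python, using a made-up Post record and an invented engagement score, neither of which comes from the bill) contrasting literal chronological order, reverse chronological order, and the kind of engagement-ranked ordering the bill appears to prohibit for minors:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical post record -- not from the bill, just for illustration.
@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    engagement: int  # e.g., likes + replies + shares

posts = [
    Post("a", "oldest post", datetime(2022, 5, 1), engagement=500),
    Post("b", "middle post", datetime(2022, 5, 2), engagement=5),
    Post("c", "newest post", datetime(2022, 5, 3), engagement=50),
]

# "Chronological manner," read literally: oldest content first.
chronological = sorted(posts, key=lambda p: p.created_at)

# Reverse chronological: newest first -- what users usually expect,
# but arguably not what the carve-out's literal text describes.
reverse_chronological = sorted(posts, key=lambda p: p.created_at, reverse=True)

# Engagement-ranked: the prioritization the bill appears to prohibit
# for under-18 account holders.
ranked = sorted(posts, key=lambda p: p.engagement, reverse=True)
```

All three lists contain the same posts; only the ordering differs, and that ordering choice is exactly the editorial judgment the bill tries to regulate.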

As bad as this sounds, this scenario isn’t likely to materialize because of a more fundamental problem with the bill. UGC services do not have a good way of segregating adults from minors–a well-known and longstanding problem that remains difficult or impossible to solve. Age authentication solutions are error-prone (the PRA excuses honest mistakes via its “knew or had reason to know” standard, but the underlying prohibition does not); they are privacy-invasive, which puts minors (and adults) at greater risk; the verification process must be deployed across all users to segregate minors from adults, which introduces a major barrier to adults’ use of UGC services; and verification potentially eliminates “anonymous” activities, which are essential for things like engagement with sensitive topics (e.g., LGBTQ kids trying to form communities) and whistleblowing. So mandatory age verification makes the Internet worse for everyone, adults and minors alike.

After UGC services impose age verification, the law expects UGC services to redirect minors into the dumbed-down version of their services…but why would services do that? Minors are notoriously tricky to monetize; it’s very costly to develop and maintain different codebases; and minors won’t want to use the dumbed-down version the law requires. As a result, the bill’s desired outcome makes no economic sense for most UGC services. Instead, services that do age verification will block Minnesota under-18s from using their services at all. In other words, if services can actually verify ages as the legislature contemplates, this bill won’t really protect Minnesota minors; it will just shrink the Internet for them.

[Note: I’ve often told the story of how Epinions implemented COPPA in 2001. We searched for the under-13s (there were only a few dozen because the service really wasn’t built for minors) and kicked them off the site. Boy, were they pissed. We told them to take it up with Congress.]

So the only reason to support this bill is that you (a) think all UGC services should verify users’ ages, whatever the downstream consequences for adults and for social media, and (b) want all under-18s blocked from UGC services, regardless of the role that online conversations and communities may play in their growth and development. If you don’t agree with both propositions, then you should vigorously oppose this bill and condemn the legislators who voted yes on it.

The bill also triggers the omnipresent challenge of state-level Internet regulations: how does the service know who is a Minnesota resident? Location-verification technology is more mature than age-verification technology, but it’s still imperfect, and it still requires deployment across the service’s entire userbase–meaning that a Minnesota law would change how services engage with non-Minnesota residents. Worse, due to sloppy drafting, the law appears to apply to any platform with 1M account holders worldwide that “operates” in Minnesota, even if it has only one under-18 user in Minnesota. (More on the metrics problem in a moment.) For many platforms, the smartest approach would be to block all Minnesota users, regardless of age, rather than build a custom Minnesota-specific version of their service and risk enormous statutory damages if anything goes wrong. As a result, this bill will likely shrink the Internet for all Minnesotans, regardless of age.
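For illustration only, here is a hedged sketch of what that userbase-wide location gate could look like. Everything in it is hypothetical: lookup_region stands in for whatever IP-geolocation service a platform might use, and the decision logic is just one plausible reading of the bill’s incentives, not anything the bill itself prescribes.

```python
from typing import Optional

def lookup_region(ip_address: str) -> Optional[str]:
    """Hypothetical stand-in for an IP-geolocation lookup.

    Real lookups are probabilistic and sometimes unhelpful (VPNs,
    mobile carriers, corporate proxies), so None is a common outcome.
    """
    return None  # placeholder

def feed_mode(ip_address: str, verified_age: Optional[int]) -> str:
    # The location check has to run for EVERY visitor, not just
    # Minnesotans -- which is how a state law ends up touching everyone.
    region = lookup_region(ip_address)

    if region is None:
        # Unknown location plus $1k-per-violation statutory damages is a
        # strong incentive to simply block uncertain traffic.
        return "blocked"
    if region != "MN":
        return "normal_feed"

    # The "knew or had reason to know" standard pushes services to treat
    # unverified Minnesota users as minors -- or to exit the state entirely.
    if verified_age is None or verified_age < 18:
        return "chronological_only"  # or "blocked"
    return "normal_feed"
```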

The Bill’s Size-Based Metric is Botched

Legislatures routinely struggle to statutorily define Internet service size, so I wrote an article walking them through the issues. I guess the Minnesota drafters missed the article, because their approach is botched:

  • 1M account holders worldwide sweeps thousands of UGC services into the law. 1M Minnesotans (which is NOT what the law says) would be a more reasonable threshold (about 1/6 of Minnesota’s population), if we overlook the problem of determining who is a Minnesotan.
  • The term “account holder” is ambiguously defined as a person who “accesses a social media account.” This implies that unregistered users are “account holders” because they “access” content posted by other users via their accounts. That interpretation doesn’t make much sense, but it’s literally what the bill says. If so, 1M visitors is a *much* lower threshold than 1M account registrations. (Normally, I expect 5% or less of visitors to register accounts, so 1M visitors might translate into only 50k registered accounts.) And if visitors are the metric, the bill doesn’t specify a time period for the measurement.
  • If the metric is registered accounts, the bill doesn’t specify whether it measures only active accounts or also counts dormant, restricted, or terminated accounts. Also, if a UGC service allows posting without an account, does it have zero accounts, or is every contributor an “account holder”?
  • The definition of “social media platform” covers any electronic medium that allows “users to create, share, and view user-generated content,” which describes every service that lets users talk to each other. That picks up services that don’t resemble “social media” at all, including online newspapers’ comment functions, old-school message boards, anonymous chat services, nonprofit UGC services like Wikipedia, retailers that offer product reviews, etc.
  • As usual, there’s no phase-in period, so a service would have to comply before it actually hits the threshold to avoid the statutory damages that would instantly attach upon crossing it.

The Bill is Unconstitutional

The bill is poorly conceived and poorly drafted. It’s also likely unconstitutional. The bill makes numerous classifications/distinctions that are likely to trigger strict scrutiny:

  • regulating “user generated content” more restrictively than other types of content, including professionally produced content and advertising. In particular, ad restrictions might only be subject to intermediate scrutiny, but restrictions on editorial content–which the bill affirmatively targets–trigger the highest levels of constitutional protection.
  • regulating “chronological manner” less restrictively than any other content organization method. In effect, this dictates to social media platforms how they can editorially organize content, an obvious no-no under the First Amendment.
  • regulating some types of social media more harshly than “Internet search providers,” “Internet service providers,” and “email.” There are several problems here. Does the exclusion give a categorical free pass to all functions operated by search providers and ISPs, or only when they are performing the specific functions of search/Internet access? That interpretation question is heightened by the reference to “email,” as opposed to “email service providers”; what exactly is being excluded? And why do these services get a free pass when other services don’t? I’m sure the state could come up with arguments, but many of those arguments would likely apply equally to all UGC services.
  • preferential treatment for parental controls and government/school-operated content/sites. These preferences probably could survive intermediate scrutiny, but could they survive strict scrutiny?
  • regulating larger social media platforms more restrictively than smaller social media platforms. See this article.
  • forcing national and global services to custom-build solutions for Minnesotans, which may affect the services’ interactions with non-Minnesotans. This is a Dormant Commerce Clause problem. It could also be a First Amendment problem (e.g., its implications for anonymous speech).

A reminder: like this bill, the 1996 Communications Decency Act required online publishers to use age verification to restrict minors’ access to constitutionally protected speech. The CDA went down in flames in Reno v. ACLU (1997), as did similar “baby-CDA” laws enacted at the state level. Congress tried again with COPA, which the courts also held unconstitutional. This bill seemingly embraces what made the CDA and COPA unconstitutional, especially the requirement of age verification when the technology doesn’t allow it to be done accurately or safely.

A Final Thought

There are many ways the Minnesota legislature could help children on social media without kicking them off. I made some suggestions here. Looking to enhance the great parts of the Internet, rather than tear it down, seems like a far more productive legislative endeavor than ill-conceived bills like this.

___

The Bill Text

From the Minnesota Journal of the House, May 4, 2022

Sec. 5. [325F.6945] UNLAWFUL SOCIAL MEDIA ACTIVITIES.

Subdivision 1. Definitions. (a) For the purposes of this section, the following terms have the meanings given.

(b) “Account holder” means a person who accesses a social media account through a social media platform.

(c) “Social media algorithm” means the software used by social media platforms to (1) prioritize content, and (2) direct the prioritized content to the account holder.

(d) “Social media platform” means an electronic medium, including a browser-based or application-based interactive computer service, telephone network, or data network, that allows users to create, share, and view user-generated content. Social media platform does not include Internet search providers, Internet service providers, or email.

(e) “User-generated content” means any content created or shared by an account holder, including without limitation written posts, photographs, graphics, video recordings, or audio recordings.

Subd. 2. Prohibitions; social media algorithm. (a) A social media platform with more than 1,000,000 account holders operating in Minnesota is prohibited from using a social media algorithm to target user-generated content at an account holder under the age of 18 and who is located in Minnesota, except as provided in subdivision 3. Nothing in this section prohibits a social media platform from allowing content to appear in a chronological manner for an account holder under the age of 18.

(b) The social media platform is liable to an individual account holder who received user-generated content through a social media algorithm while the individual account holder was under the age of 18 and was using the individual account holder’s own account, if the social media platform knew or had reason to know that the individual account holder was under the age of 18 and located in Minnesota. A social media platform subject to this paragraph is liable to the account holder for (1) any general or special damages, (2) a statutory penalty of $1,000 for each violation of this section, provided that no individual account holder may recover more than $100,000 in statutory penalties under this subdivision in any calendar year, and (3) any other penalties available under law.

Subd. 3. Exceptions. (a) An algorithm, software, or device that acts as a parental control, or an internal control used by the social media platform that is intended to control the ability of a minor to access content, or is used to filter content for age-appropriate or banned material, is exempt from this section.

(b) User-generated content that is created by a federal, state, or local government or by a public or private school, college, or university, including software and applications used by a public or private school, college, or university that are created and used for educational purposes, is exempt from this section.