FTC Opens a Backdoor Route to Age Verification on Social Media

I hadn't heard of the app NGL until recently. But that's not surprising: the anonymous questions app seems to be popular largely among teens.

Bark, the maker of parental content-monitoring software, calls NGL "a recipe for drama" and cyberbullying. But it seems like a fairly standard social media offering, allowing users to post questions or prompts and receive anonymous responses.

Now, the Federal Trade Commission (FTC) has ordered NGL to ban users under age 18.

That Slippery Slope Again

The FTC and the Los Angeles District Attorney's Office say NGL "unfairly" marketed the app to minors. "NGL marketed its app to kids and teens despite knowing that it was exposing them to cyberbullying and harassment," FTC Chair Lina M. Khan said.

To settle the lawsuit, the agency is not only making NGL pay $5 million but also requiring the app to ban those under age 18 from using it.

This seems to me like a worrying development.

An administrative agency ordering a social media app to ban minors is effectively a backdoor way to accomplish what Congress has been failing to mandate legislatively and what courts have been rejecting when state lawmakers do it.

Granted, the FTC does not seem to be requiring NGL to check IDs. It's merely "required to implement a neutral age gate that prevents new and current users from accessing the app if they indicate that they are under 18," per the FTC's press release.

But this is still the FTC setting minimum age requirements for some social media use, circumventing both parental and legislative authority.

Besides, it doesn't seem like a long shot from here to a) punishing the company further if kids lie about their ages, thereby necessitating the use of ID checks or other age verification schemes by NGL, and/or b) requiring more invasive age verification schemes in future orders to social media companies.

The FTC's Case Against NGL Is Littered With Anti-Tech Tropes

I can't speak to the accuracy of all of the FTC's claims about NGL, which include allegations that it "falsely claimed that its AI content moderation program filtered out cyberbullying and other harmful messages" and that it "tricked users into signing up for their paid subscription by falsely promising that doing so would reveal the identity of the senders of messages."

In a statement posted to the NGL blog, the company said it spent two years "cooperating with the FTC's investigation" and that "many of the allegations around the youth of our user base are factually incorrect."

Knowing the way the FTC tends to distort descriptions of tech company action, I'm somewhat skeptical of the FTC's claims about NGL to begin with. And there are a lot of red flags in the publicity around this case.

Much of the authorities' publicity focuses not on unfair or deceptive practices by NGL but on the underlying function of the app. For instance: "The anonymity provided by the app can facilitate rampant cyberbullying among teens, causing untold harm to our young people," Los Angeles District Attorney George Gascón said in a statement.

"NGL and its operators aggressively marketed its service to children and teens even though they were aware of the dangers of cyberbullying on anonymous messaging apps," the FTC said.

Of course, plenty of apps allow for anonymity. That this has the potential to lead to bullying can't be grounds for government action.

But a common trope of government officials attacking tech companies is suggesting that the company is in the wrong because it should have known people could use the app in unwanted or unkind ways. It's an easy way to declare basically any online platform guilty, since virtually all forms of open online communication are morally neutral and multifaceted, capable of facilitating very positive interactions, very negative interactions, and everything in between.

The FTC also trots out other well-worn anti-tech tactics, such as faulting the company for failing to do content moderation perfectly (NGL said it "would filter out cyberbullying and other harmful messages" but "failed to prevent rampant cyberbullying and threats") and pointing to isolated and unverifiable instances of trouble to bolster its case ("one consumer reported that their friend had attempted suicide because of the NGL app").

Perhaps NGL wasn't a model of tech company transparency and integrity. But the FTC's actions here have all the hallmarks of anti-tech animosity and moral panic about young people's use of technology being used to justify disturbing overreach.

A 'Novel' Case

FTC Commissioner Andrew N. Ferguson admits that the FTC's actions here are unusual.

They are based in part on a "novel theory," Ferguson said in a statement joined by Commissioner Melissa Holyoak. This theory says NGL violated Section 5 of the Federal Trade Commission Act "by marketing an anonymous messaging app to children and teenagers despite knowing that anonymous messaging apps are harmful to these groups."

Note that this isn't framed as particular NGL actions being violations. The violation is merely that NGL marketed to minors at all while being the kind of app it was.

Ferguson said he voted to approve this complaint because he agreed "that it was unfair to market this anonymous messaging app to teenagers in the way that the defendants marketed it." More:

If the allegations in the complaint are true, NGL sent fake, anonymous, and distressing messages to minors specifically designed to make them doubt their social worth, as part of a fraudulent scheme to convince those minors to pay for the ability to see who sent the messages. This alleged conduct, tailormade to manipulate the vulnerable teenage psyche, was reprehensible and unfair.

However, Ferguson wanted "to make clear…that it does not follow that Section 5 categorically prohibits marketing any anonymous messaging app to teenagers."

That's an important distinction—but not one that all of his colleagues felt was worth making.

More Sex & Tech News

• The U.S. 5th Circuit Court of Appeals will reconsider a case involving the Llano County, Texas, library removing 17 books—including It's Perfectly Normal: Changing Bodies, Growing Up, Sex and Sexual Health—over subject-matter concerns. "While the library patrons say removing the books constitutes an illegal government squelching of viewpoints, county officials have argued that they have broad authority to decide which books belong on library shelves and that those decisions are a form of constitutionally protected government speech," notes PBS. A three-judge panel of the 5th Circuit held in June that nine of those books could be removed but eight had to stay. The county requested a rehearing before the full court, which has now been granted.

• Arkansas' secretary of state has rejected an abortion rights initiative submitted last week by Arkansans for Limited Government. The group said it submitted 101,525 signatures, well above the 90,704 threshold required to get it on the state's ballot. But according to Secretary of State John Thurston, the group failed to give required booklets to some paid canvassers and failed to submit paperwork about their identities, resulting in 14,143 of the collected signatures being deemed invalid.

