This week the Federal Trade Commission unveiled hefty settlements with Epic Games Inc.—the creator of the video game Fortnite—to resolve separate actions alleging violations of Section 5 of the FTC Act and the Children’s Online Privacy Protection Act (COPPA), respectively.

Epic Games will pay $245 million in consumer redress to settle the alleged Section 5 violations in an FTC administrative proceeding and will pay $275 million in monetary penalties to settle the COPPA action in federal court. The cases highlight two hot spots for the FTC—dark patterns and children’s privacy.

In its administrative complaint, the FTC alleges that Epic Games used dark patterns that made the gameplay interface confusing and tricked players into making in-game purchases they often did not intend to make. Specifically, the complaint alleges that:

Continue Reading Ready, Aim, Fire: FTC Scores Record-Breaking $520 Million Settlement with Fortnite Creator Epic Games

At a Federal Trade Commission (FTC) event last week, Chair Lina Khan said children are more susceptible than adults to deceptive or harmful practices, especially those that blur the line between advertising and entertainment.

The event, “Protecting Kids from Stealth Advertising in Digital Media,” included legal and child development experts, researchers, members of industry, and consumer advocacy groups. Together they discussed children’s development and their ability to detect and understand advertising, the potential harms to children from blurred, deceptive, or manipulative advertising practices and ways to mitigate those harms, and the significance of effective disclosures.

In her opening remarks, Khan said children often are unable to understand the difference between advertisements and organic content. Without realizing it, they may end up engaging in commercial transactions or providing companies with their personal information without comprehending the privacy risks. Khan also noted that the FTC is considering whether to update its Children’s Online Privacy Protection Act (COPPA) Rule, which has not been updated since 2013, and requested comments on its advance notice of proposed rulemaking related to commercial surveillance.

Continue Reading FTC to Digital Media Advertisers: It’s Time to Protect Kids

We have written previously about the FTC’s vigorous enforcement efforts relating to negative option marketing and its crackdown on alleged schemes seeking to exploit the difficulties presented by COVID-19 (see blog posts here and here). Recently, the FTC continued these efforts with a complaint and settlement concerning negative option marketing aimed at parents seeking online educational resources for their children.

On September 1, 2020, the FTC brought a complaint against online children’s education company Age of Learning, Inc., d/b/a ABCmouse, alleging that it operated a deceptive negative option program between 2015 and 2018. The FTC alleged that ABCmouse’s actions violated both the FTC Act and the Restore Online Shoppers’ Confidence Act (ROSCA) by (1) failing to adequately disclose that its 12-month memberships would automatically renew indefinitely; (2) failing to disclose that extensions of 30-day free trial memberships at reduced rates would automatically renew indefinitely; (3) advertising “easy cancellation” while creating myriad procedural hurdles to prevent cancellation; and (4) embedding pitfalls in the cancellation process to mislead customers into extending their memberships rather than cancelling them. Furthermore, in some instances, even when a customer successfully navigated the cancellation process, ABCmouse would still charge for the cancelled services.

Continue Reading FTC Schools Marketers on the ABCs of Negative Option Marketing

Earlier this month, the FTC approved a settlement with a developer of popular apps for purported violations of the Children’s Online Privacy Protection Act (COPPA). The Commissioners voted 4-1 to authorize the Department of Justice to file the complaint and the stipulated final order resolving the matter. Under the stipulated final order, the company was ordered to pay a $4 million civil penalty (although all but $150,000 was suspended based on inability to pay). The lone dissent came from Commissioner Noah Phillips, who issued a statement criticizing the “recent push to heighten financial penalties . . . without clear direction other than to maximize the amount in every case.”

Commissioner Phillips made the case, as he has before, that harm should be the starting point when fashioning a penalty. Drawing on economic theory, he argued that “basing penalties on harm forces defendants to internalize the costs their behavior imposes on others, orienting conduct in a socially beneficial fashion.” Chairman Simons also issued a statement, contending that starting with harm is “inapposite” when Congress explicitly prohibits practices and directs the agency to impose penalties.

Continue Reading Two Conservative, But Very Different, Approaches to Calculating Civil Penalties: Harm vs. Deterrence

The Federal Trade Commission held a workshop yesterday in Washington, D.C., to discuss possible updates to the COPPA Rule, which implements the Children’s Online Privacy Protection Act (“COPPA”). COPPA was originally enacted in 1998 and regulates the way entities collect data and personal information online from children under the age of 13. The Rule hasn’t been updated since 2013, and the intervening years have produced seismic technological advances and changes in business practices, including changes to platforms and apps that host third-party content and marketing targeted at kids, the growth of smart technology and the “Internet of Things,” the rise of educational technology, and more.

For the most part, FTC staff moderators didn’t tip their hand as to what we can expect to see in a proposed Rule revision. (The exception was one staff member whose rapid-fire questions offered numerous counterpoints to industry positions, so much so that the audience could be forgiven for thinking it was momentarily watching oral argument at the Supreme Court.) Brief remarks from Commissioners Wilson and Phillips staked out their positions more clearly, but their individual views were so different that they too offered little assistance in predicting what a revised Rule may look like. Commissioner Wilson opened the workshop by sharing her own experience as a parent trying to navigate and supervise the games, apps, and toys played by her children, and emphasized the need for regulation to keep up with the pace of technology to continue protecting children online. Commissioner Phillips also referred to his children at one point, but his remarks warned against regulation for regulation’s sake, flagged the chilling effect on content creation and diversity when businesses are saddled with greater compliance costs, and advocated a risk-based approach.

Continue Reading FTC’s COPPA Rule Workshop: A Summary of Priorities from Advocates and Industry, and the FTC’s Poker Face

Many in the industry are familiar with the following scenario. A young gamer, after grinding tirelessly for untold hours perfecting her skill and honing her strategy, finally qualifies for an esports tournament. For that gamer, the true hard work begins after qualification. She now has to convince her parents to let her participate, which may include travel (though compensated) to a far-off location. In many cases, the first time the parents become aware that their child even entered a tournament (much less won an all-expenses-paid trip to one) is this conversation, after the child has already been offered compensation to travel to and compete in the tournament.

If you are a game publisher, tournament organizer, or otherwise involved in the logistical chain of events described above, there may be a big problem. The collection and use of data provided by children are regulated in the United States by the Children’s Online Privacy Protection Act (“COPPA”). COPPA is designed to protect the privacy of children by establishing certain requirements for websites that market to children. Most notably, COPPA requires website operators to obtain “verifiable parental consent” before collecting personal information from children. The FTC operates under the assumption that if children are the target demographic for a website, its operator must assume that anyone accessing the website is a child and obtain proper consent. This assumption applies even if the website did not start with children as the target audience.

Continue Reading Update Required for Youth Esports

The National Advertising Division Annual Conference kicked off with Andrew Smith, the Director of the FTC’s Bureau of Consumer Protection, as the keynote speaker. Near the close of his remarks, Director Smith announced that the FTC will hold a workshop on the Children’s Online Privacy Protection Act (“COPPA”). As a refresher, COPPA is designed to protect the privacy of children by establishing certain requirements for websites that market to children. The FTC operates under the assumption that if children are the target demographic for a website, its operator must assume that anyone accessing the website is a child and obtain proper consent. This assumption applies even if the website did not start with children as the target audience.

To illustrate this point, Director Smith discussed TikTok, a social media app that allows users to create and share short-form videos. TikTok purchased Musical.ly, an app that allowed users to post videos of themselves lip-syncing to songs. Musical.ly originally marketed to adults. However, as the app grew in popularity, it became clear that children were using it and that Musical.ly knew they were. On February 27, 2019, the FTC brought a complaint against Musical.ly alleging that it collected information about children without obtaining the required parental consent. In fact, child predators began using the app to obtain children’s locations, though, luckily, no child was hurt. As a result, TikTok agreed to pay $5.7 million to settle the FTC allegations.

Continue Reading A Morning Cup of COPPA From the NAD Annual Conference

With more and more children becoming technologically savvy, parents are having to rely more heavily on laws such as the Children’s Online Privacy Protection Act (“COPPA”) to shield their children’s information. The FTC recently issued a warning letter to a Ukraine-based company, Wildec LLC (“Wildec”), alleging that by allowing children under the age of thirteen to access its dating apps, the company potentially violated COPPA and the FTC Act.

A little background on COPPA: the FTC’s COPPA Rule prohibits companies from collecting, using, or sharing personal information from a child, defined as an individual under the age of thirteen, without a parent’s verifiable consent. In addition, companies must post a notice on their websites stating what information is collected as well as their disclosure practices for such information.

Wildec’s dating apps collected an array of information from users, such as email addresses, photographs, dates of birth, and real-time location data. Although the apps’ privacy policy prohibited users under the age of thirteen, FTC staff found that users who indicated they were under thirteen were not prevented from accessing and using the apps, and staff were able to locate users who indicated they were as young as twelve. In addition, the FTC noted in its warning letter that “facilitating other users’—including adults’—ability to identify and communicate with children—even those 13 or over—poses a significant risk to children’s health and safety.” Following the allegations, the apps were swiftly removed from Google Play and Apple’s App Store.

Continue Reading FTC Warns Ukraine Company: You Can’t Let Kids Use Your Dating Apps

The Commissioners of the FTC agreed, during an oversight hearing on November 27, 2018, to investigate the use of “loot boxes” in video games. Senator Hassan (D-NH), following up on questions she asked the newly appointed Commissioners during their confirmation hearings, specifically requested that the FTC investigate loot boxes, citing addiction concerns (especially as they relate to children) and the resemblance of loot boxes in video games to gambling.

A loot box is a digital container of virtual goods that a user can purchase in-game using real-world currency. The user does not know what is in the loot box before purchasing and cannot choose its contents. The box may contain digital goods usable in the game, such as character skins, tools, and weapons, and its contents could range from an extremely rare, sought-after item to a collection of items the user already owns, or anything in between.

Continue Reading The FTC is Searching for the Value in Loot

A few years ago, tech companies were confronted with a common complaint from parents: their children were inadvertently spending lots of money on in-app purchases while using children’s apps. Although this led to the implementation of expanded parental control settings, children’s app developers stayed the course. Last month, however, three senators asked the FTC to