
This article was published on June 5, 2020

Who’s allowed to track my kids online?

What the United States’ children’s privacy law does and doesn’t do



For two decades, children’s digital privacy in the United States has been regulated by a national law: the Children’s Online Privacy Protection Act (COPPA). The law limits how companies can collect data on children under 13 years old.

If website operators don’t properly adhere to the rules outlined in the act, they could face massive fines. In September 2019, the Federal Trade Commission announced a record-breaking penalty against YouTube, in which the Google-owned service agreed to pay $170 million to settle alleged COPPA violations.

But the law has also faced criticism, with some lawmakers and advocates arguing that it doesn’t go far enough. Here’s what it does and does not do.

What COPPA does

COPPA, which went into effect in 2000, requires websites and services that want to collect personal information about children under 13—including real names, screen names, or contact information—to post privacy policies and get parental consent before obtaining the data.

In 2013, the FTC expanded the rules to require sites or services to get parental permission before collecting geolocation information, photos, video, and any “persistent identifiers,” like cookies. To meet the requirement, some services may have a parent sign a consent form or call a phone number, but many simply ask for age at sign-up and stop users who say they are under 13 from joining.

The law includes fines for companies that fail to comply. In one notable early case, from 2003, Mrs. Fields Cookies and Hershey Foods paid civil penalties of $100,000 and $85,000, respectively, to settle allegations that portions of their websites improperly collected data on children.

A few years later, the fines were already escalating: The social networking service Xanga paid $1 million in 2006 to settle FTC charges that it had created more than 1.7 million Xanga accounts for users who provided information indicating they were under 13 years old.

Last year, in another record for the time, Musical.ly—which has since become TikTok—agreed to pay $5.7 million to settle alleged violations of COPPA. That record was soon eclipsed by the $170 million YouTube penalty. (YouTube has since changed its rules concerning children’s content, disabling some features and limiting data collection for videos aimed at kids.)

What COPPA doesn’t do

The law applies to websites that are aimed at children under 13, but that standard is somewhat vague: If a service uses animated characters in ads, for example, or the service’s subject matter is something clearly appealing to kids, like toys, it might fall under COPPA.

General-interest sites are also responsible for complying with COPPA if they have “actual knowledge” that they’re collecting data on kids under 13. The standard, critics argue, is far from onerous: A website may ask for age at sign-up but isn’t required to verify it. Popular services like YouTube and Facebook ask for age at sign-up, but there’s little stopping users from lying. Instead, services say they terminate users’ accounts if moderators determine the user is under age.

One 2006 comment to the FTC, cited in a report on COPPA, noted “there is no conceivable way, short of locking a child in a closet and not letting him out until adulthood, to absolutely prevent a child from viewing age inappropriate websites.”

The FTC itself has also been criticized for the perception that it’s overly lax in enforcing the law—advocacy groups have argued that even massive penalties, like the 2019 YouTube fine, don’t go far enough to deter companies from violating COPPA. The Electronic Privacy Information Center, for example, has said “the FTC has not adequately enforced COPPA in recent years, failing to act on complaints in a timely way.” The agency has also been broadly criticized as being under-resourced.

While press releases might trumpet the size of fines, those penalties are still only a traffic ticket for some major companies. YouTube’s $170 million fine, for example, amounted to just over 1 percent of the service’s 2019 revenue of $15 billion, which in turn was only about 10 percent of overall Google revenue.

“We think there should be stronger enforcement of COPPA,” said Ariel Fox Johnson, senior counsel for policy and privacy at Common Sense Media. She said the “actual knowledge” standard could be lowered to “constructive knowledge.” Under that standard, website operators wouldn’t have to be directly informed that kids are on their service—the FTC would have to prove only that, had companies done their due diligence, they would have known they were collecting data on children.

Where will COPPA go next?

As countries like the United Kingdom pass their own youth privacy laws, officials have taken a closer look at COPPA. Last year, the FTC announced that it was seeking comment on potential changes to its COPPA enforcement practices. “In light of rapid technological changes that impact the online children’s marketplace, we must ensure COPPA remains effective,” FTC chairman Joe Simons said in a July 2019 statement. The review has raised concerns that the agency will cave to pressure from the tech industry to weaken the law.

Congress has also considered changes. A bill from Rep. Kathy Castor (D-FL) would create new protections for teenagers between the ages of 13 and 17 and expand COPPA to explicitly include protections for data like biometric, health, and educational information. The bill would also give the FTC power to pursue higher financial penalties.

A different plan, introduced last year by Sen. Edward J. Markey (D-MA) and Sen. Josh Hawley (R-MO), would create what they’ve pitched as a “COPPA 2.0”—an update to the law that would expand the ages covered by the law, requiring services to get user consent before tracking teens between 13 and 15 years old. Under the bill, the FTC would also create a division dedicated to examining youth privacy issues.

“Right now you turn 13, and you’re treated like a 35-year-old online,” Johnson said. The bill would also set a blanket ban on targeted advertising toward children under 13.

“If we can agree on anything,” Markey said in a statement announcing his and Hawley’s legislation at the time, “it should be that children deserve strong and effective protections online.”

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license. Do you have a question for Ask The Markup? Email us at ask@themarkup.org
