Who’s Spying On Your Kids? Everyone.

An illustration of a child lying in front of a computer with a red light overlaying their body (Photo credit: Amina Filkins/Pexels)

The primary law protecting children from advertising is the Children's Online Privacy Protection Act of 1998, or COPPA. It's enforced by the Federal Trade Commission (FTC) and aims to let parents control what personal data is gathered from children under 13. It imposes several requirements on operators of websites or online services directed at children in that age group, and on operators of other websites, programs, or apps that collect and share personal information at home or at school. Marketers cannot lawfully target youth under age 13 with advertising without parental permission. Personal information can be a child's name or screen name, email address, telephone number, geolocation, photo, voice recording, or other unique identifier.

The video-sharing social networking service TikTok, just like McDonald's, is a flagrant collector of kids' data. Two advocacy groups have complained to the FTC about TikTok's endless feed of jokes, dances, recipes, and "challenges." The Boston-based Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy call the popular app's business model "one of the most predatory." In 2019, TikTok's parent company, ByteDance, paid a fine of $5.7 million, and the FTC prohibited it from collecting and using data from kids under the age of 13. The company also agreed to destroy the data it had already collected.

Attorneys at Georgetown Law’s Institute for Public Representation found that TikTok was violating the settlement, scooping up data without parental permission. CCFC director Josh Golin told the New York Times that this marketing practice puts millions of underage children at risk of sexual predation. So the groups filed another complaint, supported by 20 leading advocacy organizations, describing TikTok’s ongoing violations of children’s privacy. It had not deleted the personal data of kids under 13—a clear violation of the settlement. The company claims it does not need parental permission because “regular” TikTok accounts are for users 13 and up. Kids under 13, TikTok says, can only sign up for “younger users” accounts, with limited functionality; they cannot upload videos or message other users, two core elements of the app.

The investigation found that the new privacy policy, and the policy of allowing children under 13 to access only the new "younger user" accounts, lacked teeth. Kids can skirt the limited functionality of the app's "young" version and can lie about their age, and even when children do use the "younger user" accounts, TikTok still violates the law with its algorithmically curated video feed. The company gathers data about what kids watch and for how long in order to find ways to keep them interacting longer. TikTok is so wealthy that a $5.7 million fine is simply built into its cost of doing business. The FTC complaint calls for holding TikTok executives accountable and for levying the maximum penalty of $41,484 per violation, and until TikTok can adopt an effective age-verification policy and become COPPA-compliant, child advocacy groups are urging the FTC to prevent TikTok from registering any new users in the United States.


In 2020, President Donald Trump signed an executive order calling TikTok a security threat because of ByteDance's ties to China. A federal judge ruled against a government ban on TikTok's operation in the United States, but as of early 2021 litigation was still pending and ByteDance was pursuing a deal with U.S. firms Oracle and Walmart that could skirt the government's attempted ban.

In 2019, Google's subsidiary YouTube agreed to pay $170 million for violating COPPA, the largest penalty the FTC has obtained since the law was enacted in 1998. YouTube had failed to divulge that parts of its platform were aimed at children under age 13, then gathered those children's personal data without parental consent, using cookies. New York Attorney General Letitia James said: "These companies put children at risk, and abused their power, which is why we are imposing major reforms to their practices and making them pay one of the largest settlements for a privacy matter in U.S. history." Under the settlement, Google and YouTube must create and implement a system by which channel owners can identify child-directed content on YouTube.

With the European Union's General Data Protection Regulation (GDPR), which went into force in 2018, more regulators are waking up to the need to protect juveniles' data. The GDPR calls for special protections for children under age 16 (compared to COPPA's age 13), such as adopting measures to verify a child's age and to manage consent. Although legal and regulatory protections haven't kept pace with the technological methods used to target and expose children to corporate persuasion, that may be changing. Despite these additional protections, it will always be the goal of multinational corporations to prime children to be consumers before they learn what it means to be informed and engaged citizens.

Adam Jasinski, a technology director for a school district outside of St. Louis, Missouri, used to conduct keyword searches of the official school email accounts of the district's 2,600 students, looking for words like "suicide" or "marijuana."

In 2018, he learned that the tech company Bark was offering schools free automated monitoring software in the wake of that year's Parkland, Florida, school shooting, in which 17 people were killed. The software scans school emails, shared documents, and chat messages, and alerts school officials anytime key phrases are found. Proponents claim such monitoring takes the burden off students to report on their classmates and lets administrators react in near real time. Officials are looking for cyberbullying, self-harm, and shooting threats. The software is cost-effective and on the alert 24 hours a day. All told, school surveillance is a $3 billion-a-year industry. Facial recognition software is also being used in schools with the aim of averting mass shootings. In 2020, the New York Civil Liberties Union sued the New York State Education Department over its approval of a $1.4 million facial recognition system.

I Have Nothing to Hide by Heidi Boghosian (Photo credit: Beacon Press)

The Lockport school district was one of the first public school districts in the nation to use the technology on students and staff. Under its system, alerts pop up when suspended staff members, Level 2 or 3 sex offenders (those at moderate or high risk of reoffending), persons barred by court order from school property, or others deemed to pose a threat are seen on campus.

Advocacy groups are trying to tip the balance and slow down corporate data harvesting, but the task is daunting in scope. Groups have asked Congress to enact greater privacy protections for children. In late 2019, the D.C.-based Electronic Privacy Information Center (EPIC) urged the FTC to reject a "school official exception," which would allow schools to share information with parent volunteers, tech companies, or other vendors for educational purposes directed by the school.

It called on the FTC to define the term "commercial purpose" and ensure that children's personal data collected by schools wasn't being transferred to EdTech companies that provide hardware and software to enhance teacher-led learning in classrooms. EPIC also urged the FTC to require notification within 48 hours of a data breach by a company subject to COPPA. A bipartisan group of senators called on the FTC to launch an investigation into children's data practices in the EdTech and digital advertising sectors. They wrote: "The FTC should use its investigatory powers to better understand commercial entities that engage in online advertising to children—especially how those commercial entities are shifting their marketing strategies in response to the coronavirus pandemic and increased screen time among children."

How society treats its most susceptible members is revealing. Rather than sparking curiosity in children about social issues, corporations would prefer children center their worlds around having the latest toys, clothes, and junk food. Monitoring minors’ play habits and collecting their personal information does more than shape their future consumer habits as adults—it enables the normalization of a surveillance society. Corporate spying exploits our earliest innate cravings for excitement and interaction in order to extract private information, and conditions juveniles to accept surveillance. The business model is lucrative, and because sanctions lack teeth, many tech companies line-item judicial settlements as the cost of doing business. Aggressive and deceptive KidTech marketing techniques speak volumes about societal priorities. When it comes to safeguarding the rights of the young, the United States, like a foul-prone soccer player, is a prime candidate for a red card.

 

by Heidi Boghosian

Heidi Boghosian is an attorney and cohost of Law & Disorder Radio. She is executive director of the A.J. Muste Institute, a charitable foundation supporting activist organizations. She was previously executive director of the National Lawyers Guild. Boghosian has written numerous articles and reports on policing and activism, and is the author of Spying on Democracy: Government Surveillance, Corporate Power, and Public Resistance. She received her JD from Temple Law School where she was editor in chief of the Temple Political & Civil Rights Law Review.