Social media and video streaming services (SMVSSs), including Amazon, Alphabet-owned YouTube, Meta’s Facebook, and TikTok, are engaging in a “vast surveillance of users” to profit from their personal information, according to a new Federal Trade Commission (FTC) report. The report also raises concerns about the lack of meaningful privacy safeguards for children and teens.
“These surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” said FTC Chair Lina Khan in a statement.
The findings stem from a special investigation launched in December 2020 that targeted nine major companies operating in the U.S. to understand how their platforms affect American consumers. The report, based on those companies’ responses, reveals troubling data collection practices and the widespread use of artificial intelligence (AI) for decision-making.
The FTC report highlights that many SMVSSs collect and retain vast amounts of data about both users and non-users, often without their full awareness. This includes personal information, browsing behaviors, and even third-party data such as household income and location. Several companies were found to have inadequate data minimization and retention policies, with some retaining personal information indefinitely or only de-identifying data rather than fully deleting it.
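The report itself contains no code, but a rough sketch can illustrate why de-identifying data is not the same as deleting it. In the hypothetical TypeScript below, the record fields, the deIdentify function, and the in-memory store are all invented for illustration: hashing the direct identifiers leaves the behavioral data, and the record itself, fully intact, while deletion actually removes it.

```typescript
import { createHash } from "node:crypto";

// Hypothetical user record; the fields are invented for this sketch.
interface UserRecord {
  userId: string;
  email: string;
  householdIncome: string;
  browsingHistory: string[];
}

// A common form of "de-identification": direct identifiers are replaced
// with hashes, but the behavioral data and the record itself are retained.
function deIdentify(record: UserRecord): UserRecord {
  const hash = (value: string) =>
    createHash("sha256").update(value).digest("hex");
  return {
    ...record,
    userId: hash(record.userId),
    email: hash(record.email),
    // householdIncome and browsingHistory are kept as-is and can often be
    // re-linked to a person by joining against other datasets.
  };
}

// Actual deletion: the record is removed from the store entirely.
function deleteRecord(store: Map<string, UserRecord>, userId: string): void {
  store.delete(userId);
}

// Usage sketch
const store = new Map<string, UserRecord>();
store.set("u-123", {
  userId: "u-123",
  email: "person@example.com",
  householdIncome: "75k-100k",
  browsingHistory: ["/shoes", "/loans", "/clinics"],
});

store.set("u-123", deIdentify(store.get("u-123")!)); // data lives on
deleteRecord(store, "u-123");                        // data is gone
```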
According to the report, the privacy risks posed by these companies’ data practices are severe. Consumers often have little control over how their information is used or stored, and many entities fail to provide clear policies on data deletion. Some platforms collect sensitive information through tracking technologies like pixels, which transmit user activity to third parties without user consent.
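To make the pixel mechanism concrete, here is a minimal, hypothetical sketch of how a third-party tracking pixel typically reports activity from a web page; the endpoint, parameter names, and firePixel helper are invented for illustration and do not come from the report.

```typescript
// Hypothetical illustration of a third-party tracking pixel.
// The endpoint and parameter names are invented; real pixels differ by
// vendor, but the mechanism is the same: a tiny image request that
// carries page activity to a third-party server.

const TRACKER_ENDPOINT = "https://tracker.example-adtech.com/pixel.gif"; // hypothetical

function getOrCreateDeviceId(): string {
  // Persistent identifier that allows activity to be linked across visits.
  let id = localStorage.getItem("_track_uid");
  if (!id) {
    id = crypto.randomUUID();
    localStorage.setItem("_track_uid", id);
  }
  return id;
}

function firePixel(eventName: string): void {
  const params = new URLSearchParams({
    event: eventName,            // e.g. "page_view", "add_to_cart"
    url: window.location.href,   // the page the user is on
    ref: document.referrer,      // where they came from
    uid: getOrCreateDeviceId(),  // identifier used for tracking
    ts: Date.now().toString(),
  });

  // Loading a 1x1 GIF is enough to deliver the data: the query string
  // rides along with the image request to the third-party server.
  const img = new Image(1, 1);
  img.src = `${TRACKER_ENDPOINT}?${params.toString()}`;
}

firePixel("page_view");
```

Because the data travels as an ordinary image request, the transfer is invisible to most users and happens whether or not they have any relationship with the third party receiving it.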
Automated Decision-Making
The FTC also raised alarms about the widespread use of algorithms, data analytics, and AI across these platforms. These automated systems power everything from content recommendations to ad targeting, often without giving users any control over how their data is utilized. This is particularly concerning for non-users who may not have consented to any data collection.
Users are not only unable to control how their data is used in AI-driven systems; they also have little visibility into the decisions those systems make. The report stresses that this opacity can have harmful consequences, particularly when algorithms prioritize content that could be detrimental to users’ mental health, especially for children and teens.
The Impact on Children and Teens
The report is particularly critical of how these services and platforms handle the privacy of children and teens. Although most companies claim to comply with the Children’s Online Privacy Protection Act (COPPA), many fail to extend protections to teens. In fact, most platforms treat teens similarly to adult users, collecting and using their personal data with minimal restrictions. And despite companies asserting that children are not permitted to use their platforms, the reality is that many do, raising serious privacy risks.
The risks to the safety of teens and children online are “especially troubling,” Khan added.
The FTC calls for stronger measures to protect young users, emphasizing that platforms must recognize teens’ unique vulnerabilities and provide greater privacy safeguards.
The Floor, Not the Ceiling
The report said the COPPA Rule “should be the floor, not the ceiling,” and that these platforms and services “should view the COPPA Rule as representing the minimum requirements and provide additional safety measures for Children as appropriate.”
It added that ignoring child users or failing to address them does not exempt companies from liability under COPPA. Platforms should implement strict policies for handling children who are discovered on their services, even where children are not officially allowed.
The report also criticizes SMVSSs for not doing enough to safeguard teen users, suggesting that teens should have privacy-protective settings by default, limited data collection, and clearer policies on data retention. It urges Congress to pass federal legislation that specifically addresses the privacy and protection of teenagers, as COPPA only covers children under 13.
Moreover, the FTC recommends that these platforms give parents and legal guardians easy ways to manage their child’s personal information, including submitting deletion requests. In the absence of new federal laws, multiple agencies, including the FTC and the Department of Justice, are applying their existing authority to regulate data practices and AI use across sectors.
The report serves as a wake-up call for both the industry and regulators, highlighting the urgent need for stronger privacy protections in the digital age.
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.