(NBC News) - Executives from Snapchat, TikTok and YouTube distanced themselves from Facebook during a congressional hearing Tuesday about online safety for teens.
Members of a subcommittee of the Senate Commerce Committee grilled the tech company representatives during a hearing titled “Protecting Kids Online: Snapchat, TikTok and YouTube.”
Facebook was not present at the hearing, having already been questioned on Instagram’s harmful effect on children by the committee last month. But its presence loomed large after weeks of news coverage of documents leaked to The Wall Street Journal and other media outlets, including NBC News, by Facebook whistleblower Frances Haugen.
Many of the questions aimed at Jennifer Stout, Snap’s head of public policy; Michael Beckerman, TikTok’s head of public policy; and Leslie Miller, YouTube’s head of government affairs and public policy, centered on the harmful effects of Instagram on teen mental health and body image detailed in internal research leaked by Haugen, as well as the role of algorithms in pushing teens toward harmful content.
“I understand from your testimony that your defense is: ‘We’re not Facebook. We’re different and we’re different from each other.’ Being different from Facebook is not a defense,” Sen. Richard Blumenthal, D-Conn., chair of the Subcommittee on Consumer Protection, Product Safety and Data Security, said in his opening statement. Blumenthal convened the hearing with Sen. Marsha Blackburn, R-Tenn.
“For too long we have allowed platforms to promote and glorify dangerous content to teens and young users,” added Blackburn. “How long are we going to let this continue? What will it take for platforms to crack down on viral challenges, illicit drugs, eating disorder content and child sexual abuse material?”
Stout said that Snapchat was created as an “antidote to social media” and highlighted how the app differed from other social media platforms such as Facebook and Instagram. She said Snapchat focused on privacy, with messages and images deleted by default, and on connecting people who know each other in real life rather than showing them a feed of videos and images from strangers.
“You weren’t being judged on your perfect posts,” she added.
Miller said that the versions of YouTube aimed at kids and teens, YouTube Kids and YouTube’s supervised experiences, did not support features such as comments or live chat and did not allow personalized advertising. She said the platform has policies against harmful content, such as videos promoting eating disorders, which it removes through a combination of automated tools and human review.
“We do not prioritize profits over safety,” she said. “We invest heavily in making sure that our platforms are safe for our users. And we do not wait to act. We put systems and practices and protocols in place.”
The Federal Trade Commission fined Google, YouTube’s parent company, $170 million in 2019 for violating children’s privacy law; the company was accused of collecting data on children under 13 on the main YouTube platform without parental consent.
Beckerman said that TikTok was an entertainment platform that did not focus on direct messaging between users. “It’s about uplifting, entertaining content,” he said. “People love it.”
“We’ve made difficult policy and difficult product choices that put the well-being of teens first,” he said, pointing to TikTok’s decisions not to allow direct messages for users under 16 and to build family controls.
During questioning about TikTok’s data collection practices, including its collection of keystroke data, information about users’ typing behavior, Sen. Cynthia Lummis, R-Wyo., asked Beckerman which other technology platforms collected more data than TikTok.
“Facebook and Instagram,” he said.
Sen. Edward Markey, D-Mass., pushed the company representatives to disclose whether they would support the Children and Teens’ Online Privacy Protection Act, a bill designed to improve online privacy protections for young people, including a ban on targeted ads aimed at 13- to 15-year-olds, building on existing law that bars targeting children under age 13.
He grew frustrated when the company representatives failed to offer definitive answers, including Snap’s Stout saying that the company agreed that teens deserved better privacy protections and “we’d love to talk to you a bit more.”
“This is what drives us crazy,” Markey said. “‘We wanna talk, we wanna talk.’ This bill has been out there for years. Do you support it or not?”
David Monahan, campaign manager for the child safety group Fairplay (formerly the Campaign for a Commercial-Free Childhood), responded to the hearing with mixed emotions.
“We’re encouraged to see Congress shed further light on the harms children face online, including tragic cases of dangerous viral challenges, bullying, and manipulative influencer marketing,” he wrote. “It’s disappointing that so often the witnesses asked the Senators to ‘Just trust us.’”
Facebook’s global head of safety Antigone Davis faced similar questioning at the end of September. The hearing was triggered by concern over Facebook’s plans to launch a version of Instagram for children under age 13 as well as reporting from The Wall Street Journal that explored how Instagram could be harmful to teens.
Days before that hearing, Adam Mosseri, the head of Instagram, said the company had paused development of a version of the photo-sharing app for children.
“I still firmly believe that it’s a good thing to build a version of Instagram that’s designed to be safe for tweens. But we want to take the time to talk to parents and researchers and safety experts and get to more consensus about how to move forward,” he told the “TODAY” show’s Craig Melvin. “If anybody leaves using Instagram feeling worse about themselves, that’s an important issue we need to take seriously and that we need to figure out how to address.”