Executives from TikTok, Snapchat, and YouTube were questioned by senators about what they’re doing to ensure young users’ safety on each platform.
Social platforms offer entertainment to many, and for kids who are still growing up and finding their direction, they can serve as a guide. But these platforms are also being misused in ways that harm children, promoting bullying, school vandalism, eating disorders and manipulative marketing.
Kids are exposed to a wide range of harmful content, from material promoting eating disorders and addictive drugs to sexually explicit material.
Lawmakers sought the executives’ support for stronger legislation protecting children on social media, but they received few firm commitments.
The panel wants to learn how algorithms and product design can magnify harm to children, foster addiction and intrude on their privacy.
“The problem is clear: Big Tech preys on children and teens to make more money,” Sen. Edward Markey, D-Mass., said at a hearing by the Senate Commerce subcommittee on consumer protection.
The subcommittee recently took testimony from a former Facebook data scientist, who laid out internal company research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens. Drawing on that testimony, the panel is widening its focus to examine other tech platforms, with millions or billions of users, that also compete for young people’s attention and loyalty.
TikTok has tools in place, such as screen-time management, to help young people and parents moderate how long children spend on the app and what they see, said Michael Beckerman, TikTok’s vice president and head of public policy for the Americas.
The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance.
Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc., made the case that Snapchat’s platform differs from the others in relying on humans, not artificial intelligence, to moderate content.
Leslie Miller, vice president for government affairs and public policy at YouTube’s owner, Google, said YouTube has worked to provide children and families with protections and parental controls, such as time limits and restricting viewing to age-appropriate content.
These platforms are part of most kids’ daily lives, and parents can’t always control what their children are exposed to outside the household. Social platforms are influencing young people’s minds, from the music they listen to to the clothes they wear.
This topic will continue to be debated and regulated for the benefit and well-being of young people.