A Facebook executive said Sunday that the company will introduce new measures on its apps to steer teens away from harmful content, as US lawmakers investigate how Facebook and its subsidiary Instagram affect the mental health of young people.
Nick Clegg, Facebook's vice president of global affairs, also expressed openness to the idea of giving regulators access to the Facebook algorithms that are used to amplify content. But Clegg said he could not answer the question of whether those algorithms amplified the voices of people who attacked the US Capitol on January 6.
Algorithms "should be held to account, if necessary by regulation, so that people can match what actually happens against what our systems are supposed to do," Clegg told CNN's "State of the Union."
"We're going to introduce something which I think will make a considerable difference: where our systems see that a teen is looking at the same content over and over again, and it's content which may not be conducive to their well-being, we will nudge them to look at other content," Clegg told CNN.
In addition, "we're introducing something called 'take a break,' where we will be prompting teens to simply take a break from using Instagram," Clegg said.
US senators last week pressed Facebook on its plans to better protect young users on its apps, drawing on leaked internal research showing that the social media giant was aware of how its Instagram app could harm the mental health of youth.
Democratic Senator Amy Klobuchar, who chairs the antitrust subcommittee of the Senate Judiciary Committee, has argued for stronger regulation of technology companies like Facebook.
"I'm tired of hearing 'trust us,'" Klobuchar told CNN on Sunday after Clegg's interview, adding that it is time to protect the moms and dads who are battling to keep their kids from getting addicted to the platform and being exposed to all kinds of bad content.
© Thomson Reuters 2021