Meta adds new age-based restrictions to Facebook, Messenger to curb inappropriate content

Parents will “like” this.

Meta has added "Teen Accounts" to Facebook and Messenger to limit who can contact minors and control the content to which they are exposed.

On Tuesday, the technology giant announced that users under 18 will automatically be enrolled in these accounts in an effort "to give parents more peace of mind across Meta apps" and curb exposure to inappropriate content.


Meta told TechCrunch that teens will only receive messages from people they follow or have messaged before. Only their friends will be able to see and reply to their stories, and tags, mentions and comments will be limited to people in their network.

Teens will also be sent notifications to leave the apps after one hour of screen time, and their apps will be placed in "Quiet Mode" at night.

Users under 16 will need a parent's permission to make the settings less strict.

These protections will be rolled out in the US, UK, Australia and Canada before expanding elsewhere.

Similar safety features were added to Instagram last year as watchdogs and lawmakers continued to crack down on social media companies' lack of protections for children, amid concern over rising mental health issues.


Along with the features newly added to Facebook and Messenger, Instagram allows parents to see which accounts their child has recently messaged, set daily time limits and block teens from using the app during specific periods.

In the latest update released on Tuesday, Meta also added protections blocking users under 16 from going "Live" and from turning off a feature that blurs images suspected of containing nudity, all without a parent's permission.

Meta claims that 97% of teens aged 13 to 15 have kept these built-in restrictions on their accounts since they were first added last year, and that 94% of parents say the restrictions are "helpful."


However, since these changes began rolling out, many online safety groups and parents have insisted that the improvements fall short.

Last summer, US Surgeon General Vivek Murthy called for the implementation of tobacco-style "warning labels" on social media apps to raise awareness of their potential mental health risks, including depression and anxiety.


Last fall, a coalition of state attorneys general sued Meta, claiming the company knowingly built addictive features into its apps to hook children and boost profits at the expense of their mental health.

"Meta can push out as many teen-focused 'features' as it wants; it won't change the fact that its core business model is designed to profit by keeping children and teens hooked on its products, and American parents are wising up."

Another watchdog, the Tech Transparency Project, argued that Meta "has claimed for years" to be implementing versions of the features detailed in the initial rollout.

For example, Meta first announced plans to make teen accounts private by default and limit their interactions with strangers back in 2021, according to earlier blog posts.

Image Source: nypost.com
