
SENATE HEARING: 1 In 4 Girls Aged 13-15 Experiences Unwanted Sexual Advances On Instagram

In a recent Senate hearing focusing on the impact of social media on children, Mark Zuckerberg, CEO of Meta, issued a dramatic apology to families who claimed their children were harmed by social media use. The hearing, held on Capitol Hill, explored issues such as child sexual exploitation online and featured video testimonies from children discussing their experiences with online bullying, abuse, and more.

During the hearing, Senator Josh Hawley (R-MO) stated that more than 1 in 3 Instagram users aged 13-15 had encountered nudity on the platform, despite Instagram's no-nudity policy. Additionally, almost a quarter of Instagram users in that age group had been the victim of unwanted sexual advances, and 1 in 6 had encountered content encouraging self-harm.

Committee chair Sen. Dick Durbin (D-IL) criticized the platforms for failing to adequately protect children, and Sen. Lindsey Graham (R-SC) went so far as to accuse Zuckerberg of having "blood on his hands" because of the harm his product has caused.


When Sen. Hawley asked whether he would like to apologize to victims harmed by his product, Zuckerberg addressed the families directly, saying, "I'm sorry for everything you have all been through... No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer".

Facebook's Alleged History Of Inadequate Safeguards

Frances Haugen, a former Facebook product manager, has shed light on troubling practices within the social media giant. Her whistleblowing has drawn attention to the potential harm faced by minors on Facebook and Instagram, both of which are owned by Meta.

Haugen's revelations indicate that Facebook's algorithms favor profitability over safety, exposing minors to potentially harmful content. Her claim that Facebook "chose profits over safety" has sparked significant concern among policymakers and consumer advocates.

Haugen also testified that Facebook's internal teams were often unable to secure the resources needed for safety measures. For instance, she claimed the company was slow to intervene in Ethiopia, where ethnic violence was being amplified on its platforms.

*AI contributed to the writing of this article