Monday, October 25, 2021

Whistleblower: Facebook harms children, fosters division, undermines democracy

October 5 (WNN) — Facebook whistleblower Frances Haugen testified in the Senate on Tuesday that the social media company has long been aware of misinformation and hate speech on its platform, and of its negative effects on young users.

During the hearing, Haugen told the Senate Commerce Committee panel that she believes Facebook’s Instagram platform negatively affects children.

“I am here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said in her opening statement.

“The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people,” she said. “Congressional action is needed. They won’t solve this crisis without your help.”

Haugen said the social media site’s algorithms can move kids quickly from safe content like healthy recipes to content about eating disorders. She called on lawmakers to demand more transparency into the company’s algorithms and internal metrics before deciding how the company should be regulated.

Haugen also targeted Section 230 of the Communications Decency Act, which protects platforms from legal liability for content posted by their users. She suggested exempting platform decisions about algorithms from those protections, which would expose the company to lawsuits over how content is ranked on users’ feeds.

“Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or a re-share,” Haugen said. “Those clicks and comments and re-shares aren’t necessarily for your benefit; it’s because they know other people will produce more content if they get the likes and comments and re-shares.”

Monika Bickert, Facebook’s vice president of content policy, said it is “not true” that the platform’s algorithms are designed to push inflammatory content.

“We do the opposite, in fact. If you look at our Transparency Center, you can actually see that we demote, meaning we reduce the visibility of, clickbait and engagement bait. And why would we do that? One big reason is the long-term health of our services; we want people to have a good experience,” Bickert told CNN.

During her testimony, Haugen said that the company’s system for catching offensive material such as hate speech catches “a very small minority of abusive material.” Because of the company’s “deep focus on scale,” she said, it is unlikely to ever capture more than 10% to 20% of objectionable content.

Haugen also said that the Facebook platform is “definitely” being used by “authoritarian or terrorist-based leaders” around the world.

“My team directly worked on tracking Chinese participation on the platform, surveilling, say, Uighur populations, in places around the world. You could actually find the Chinese based on them doing these kinds of things,” she said. “We also saw the active participation of the Iranian government doing espionage on other state actors.”

Despite the national security threat, Haugen said she does not believe Facebook is adequately prepared to monitor and combat this behavior.

“Facebook’s consistent understaffing of the counterespionage information operations and counter-terrorism teams is a national security issue, and I’m speaking to other parts of Congress about that … I have strong national security concerns about how Facebook operates today.”

Haugen identified herself as the whistleblower on CBS’ “60 Minutes” on Sunday, saying that Facebook prioritized profits over public safety and was aware of research showing the negative impact of some of its policies on younger users.


A former data scientist for Facebook, Haugen is pushing Congress for new rules to address the concerns she has raised.

“When we realized tobacco companies were hiding the harms they caused, the government took action,” she said in her opening statement.

“When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids. I implore you to do the same here.”

Haugen said she has filed at least eight complaints with the Securities and Exchange Commission, claiming that Facebook is hiding important research from investors and the public.

Facebook pushed back against Haugen’s allegations, saying it does not encourage “bad content” and works relentlessly to root out harmful information.

“We have invested heavily in people and technology to keep our platform secure, and have made fighting misinformation and providing authoritative information a priority,” Facebook’s director of policy communications, Lena Pietsch, told CBS News on Sunday.

“If any research had identified precise solutions to these complex challenges, the tech industry, governments and societies would have solved them much earlier,” Pietsch said. “We have a strong track record of using our research – as well as outside research and close collaboration with experts and organizations – to inform changes to our apps.”

Sen. Richard Blumenthal, D-Conn., on Tuesday urged Facebook founder and CEO Mark Zuckerberg to testify before the committee in response to the allegations.

“Mark Zuckerberg ought to be looking at himself in the mirror today, and yet, rather than taking responsibility and showing leadership, Mr. Zuckerberg is going sailing,” Blumenthal said. “No apologies, no admissions, no action, nothing to see here. Mark Zuckerberg, you need to come before this committee. You need to explain to Frances Haugen, to us, to the world and to the parents of America what you were doing and why you did it.”

Zuckerberg shared a statement in a Facebook post Tuesday night, which was also sent to company employees, disputing the allegations made in Haugen’s testimony.

“Now that today’s testimony is over, I wanted to reflect on the public debate we’re in. I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know,” he wrote. “We care deeply about issues like safety, well-being and mental health. It’s difficult to see coverage that misrepresents our work and our motives.”

Zuckerberg said the allegation that Facebook prioritizes profit over safety and well-being is “not at all true.”

He wrote, “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.”

He also pointed to services like Messenger Kids that are aimed at protecting children.

“Of everything published, I’m particularly focused on the questions raised about our work with children,” Zuckerberg wrote. “I’ve spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it’s very important to me that everything we build is safe and good for kids.”

