More than 180,000 pieces of content were removed from Facebook and Instagram pages and accounts in Australia last year for violating Meta's community standards on health-related misinformation. The figure was up from 110,000 in 2020.
Meta said Australians also benefited from content removed in other countries, with a total of 11 million people affected globally.
The data was disclosed in a Meta transparency report focused on Australia, released by the Digital Industry Group Inc. (DIGI) as part of its monitoring of the Australian Code of Practice on Disinformation and Misinformation.
Meta reported more than 3.5 million visits from Australian users to its dedicated COVID-19 information centre in the fourth quarter of 2021. From the start of the pandemic to June 2021, it removed more than 3,000 accounts, pages and groups for violating its rules against spreading COVID-19 and vaccine misinformation.
It also noted that some commentators have expressed concern that social media spreads misinformation, promoting echo chambers and polarisation.
The report also noted that academic research on the role of algorithms in political and social polarization has yielded conflicting results, with several studies indicating that social media is not the primary cause of polarization.
"Nevertheless, Meta aims to provide users with more transparency and control over how algorithms rank and distribute content. To this end, we have included an additional commitment on transparency in this report," Meta said.
Google, Microsoft, TikTok, Twitter, Facebook and Redbubble (an online marketplace for print-on-demand products) joined DIGI's voluntary code of practice to fight the spread of misinformation in Australia in February 2021.
Since its launch, the code has gained two more signatories: Adobe and Apple.
Signatories to the Australian Code of Practice on Disinformation and Misinformation commit to measures to combat online falsehoods, including publishing and implementing policies on their approach and giving users a way to report content that violates those policies.
Part of the Code’s pledge is the publication of transparency reports on each company’s activities on their individual platforms.
DIGI said: “If we can increase our understanding of these complex challenges over time, industry, government, civil society and academia can all continuously improve their policies and approaches.”
DIGI Managing Director Sunita Bose reportedly said the code promotes greater transparency and public accountability in the tech industry's efforts to combat harmful misinformation, and that DIGI looks forward to cooperating with the incoming administration and others to maximise its effectiveness.
"The 2021 transparency reports provide new data on misinformation in Australia and detail multiple interventions on each platform: removing and flagging false claims and accounts, elevating reputable content, and partnering with researchers," she said.
According to a Google report, the tech giant removed more than 90,000 YouTube videos in Australia that violated its Community Guidelines, and more than 5,000 videos uploaded from Australia that contained dangerous or misleading COVID-19 information.
Additionally, the number of Australian medical-misinformation videos removed from TikTok increased dramatically over 2021, rising from just 24 in January to over 4,000 in September.
TikTok reportedly said the increase in medical-misinformation removals coincided directly with COVID-19-related factors such as the arrival of the Delta variant, government-initiated infection-control measures such as lockdowns and travel restrictions, and the parallel rollout of the vaccination programme.