By Staff Writer.
Meta Australia’s public policy director Mia Garlick has flagged increased content monitoring across its social media platforms as Australia’s federal election looms. The social media giant says it takes a hardline approach to misrepresentation and misinformation that could result in electoral interference.
Meta is the owner and operator of the Facebook and Instagram social media platforms. There are approximately 2.9 billion Facebook users worldwide, including 16 million in Australia. Instagram users globally number around 1.4 billion, with 9.5 million users locally.
Both platforms have attracted considerable criticism for hosting misinformation, trolling, and defamatory content, and for the difficulties that arise when trying to have that content removed.
With a federal election imminent, concerns are growing that online misinformation campaigns will increase. It’s something Mia Garlick says Meta is trying to counter.
“This is certainly something that we have been trying to evolve, both our policies, but also our tool offering, giving people in public life greater protections, and obviously if there are any serious concerns, we work closely with law enforcement to address that as well,” the Australian Broadcasting Corporation reports Ms Garlick as saying.
“We have seen an increase in abusive commentary in relation to them that’s not really related to the issue of the day.”
Meta says it already prohibits the misrepresentation of election dates, locations, times, and methods of voting or voter registration. Also on Meta’s radar is inaccurate information about who can vote, how to vote, qualifications for voting and whether a vote will be counted.
These aren’t Australia-specific policies. Instead, they are global standards which Meta says reflect its commitment to voting and the democratic process.
As Meta’s public face in Australia, Mia Garlick is frequently called upon to defend the company over its failure to take down incorrect and offensive content. Prime Minister Scott Morrison has called social media a refuge for anonymous cowards who “can bully, harass and ruin lives without consequence”.
The Australian Government wants to legislate to hold such people to account, forcing platforms like Facebook to hand over identifying details. But Meta has queried the effectiveness of the proposed legislation.
In response to incidents overseas, Meta recently added brigading and mass reporting to its list of serious online misbehaviours. Both behaviours involve a coordinated push of incorrect, defamatory, or offensive information, and both come to the fore during electoral campaigns.
According to the ABC, Meta is drawing on its experience handling misinformation during electoral campaigns overseas to better manage content during the upcoming Australian campaign.
In addition to Meta’s own fact-checkers and content monitors, the company is funding third-party fact-checkers, including the Australian Associated Press and RMIT’s Fact Lab. But Meta won’t automatically take down all potentially misleading content during the election campaign.
Meta argues that if the information is already published and widely disseminated, it is already under the microscope and, if incorrect, likely to be called out, saying:
“Inserting ourselves as part of that process, trying to arbitrate over the truthfulness of different political sides, is not an area where we think it’s appropriate for a company like us to have that particular role.”