Posted: 2022-06-05 19:00:00

“We have had an independent expert develop best practice reporting guidelines which have driven significant improvements in the second set of transparency reports, including the release of more Australian data,” DIGI’s managing director Sunita Bose said. “The code’s signatories all offer very different products, so being able to draw meaningful comparisons between different digital platforms will always be a challenging task.

“The guidelines focus on encouraging signatories to drive improvements within their services over time, and the public release of annual transparency reports provides accountability in that effort.”

Facebook’s disclosure was noted in a section of its report on “co-ordinated inauthentic behaviour”, which is “typically designed to mislead people about who is behind an operation to manipulate public debate for a strategic goal.”

Sunita Bose represents the big social media and technology firms. She argues the code has increased transparency. Credit: Edwina Pickles

Reports from other companies, which are a product of a voluntary misinformation code adopted by the industry, show the scale of false coronavirus claims circulating online.

Video social media network TikTok, for example, disclosed a rapid ramp-up in its take-downs of Australian medical misinformation during the coronavirus pandemic, peaking at almost 4500 videos removed in the month of September 2021.

Between January and June last year, Twitter removed 1028 posts from Australia with COVID-19 misinformation and suspended 35 local accounts. YouTube removed about 5000 videos that violated its rules on dangerous or misleading COVID-19 content.


But these figures generally do not show how many people saw the content before it was pulled, how quickly it was identified, how much more was reported but not removed, or what exactly was taken down, details that would allow debate about each company’s moderation systems.

Jake Wallis, who heads the Australian Strategic Policy Institute’s disinformation program, said the voluntary transparency reporting process was a good initial step but challenges remained. Dr Wallis and colleagues at ASPI’s cyber policy centre, which is sponsored by firms including Google and Meta, revealed the financially motivated misinformation campaign that hit the 2019 federal election.

“The metrics are ambiguous and difficult for both government and industry to measure performance against,” said Wallis. “How much content removed constitutes the right amount? How does industry define Australian-specific performance metrics on platforms across which flow transnational networks?”

Chris Cooper, executive director of Reset Australia, an advocacy group critical of the technology giants, is scathing of the transparency reports.

“Many of the figures and facts outlined in the report are designed to sound impressive, but they lack transparency and meaningful context,” said Cooper, whose organisation receives support from eBay founder Pierre Omidyar’s foundation and progressive consultancy Purpose.

“Ultimately, we are left beholden to offshore, digital behemoths to decide how they’ll manage the threat. Which is a level of self-regulation that we accept from no other industry.”

By contrast, the libertarian Institute of Public Affairs has inveighed against proposals to let the ACMA regulate misinformation, saying it risked turning the authority into “Canberra’s thought police”.

In a letter to MPs sent last month, the IPA argued that the government should not be in the business of deciding whether things like political claims constituted “misinformation” and that the public should be left to make up their own minds.


“The idea that debate needs to be suppressed to protect against harms to ‘democratic processes’ is a draconian and arrogant assumption that belongs in George Orwell’s Nineteen Eighty-Four, not in a genuine liberal democratic society,” IPA legal rights program director Morgan Begg wrote.

In a survey released by DIGI, the research firm Resolve Strategic asked Australians about their views of misinformation and found little consensus across people with different political affinities about what fit the term. In one example, survey respondents split when asked about a report in The Guardian about the catastrophic effects of climate change.

DIGI’s code was strengthened last October with an independent board to oversee the guidelines and handle material breaches of the code. An independent expert, Hal Crawford, was hired to fact-check the transparency reports.

In a statement sent by DIGI, Crawford said the 2022 reports were an iterative improvement on last year’s. In addition to the Australian data, the reports set out measures such as fact-checking, information centres and warning notes that the tech giants have implemented. They also contain extensive global statistics.

DIGI now plans to take the code one step further and is seeking feedback from academics and the public on ways to improve it. Meanwhile, the government is considering the best way to tackle the issue.

Newly appointed Communications Minister Michelle Rowland accused the previous Coalition government of waiting years to start scoping out powers for the Australian Communications and Media Authority (ACMA) to address misinformation. “The regulator doesn’t have the power to investigate or compel information from digital platforms about how they manage misinformation and disinformation in Australia, so there is an important role for regulation in this area,” Rowland told The Sydney Morning Herald and The Age.

Former communications minister Paul Fletcher unveiled plans in March to introduce laws that would give the ACMA more power to discipline tech companies that failed to meet the standards of their voluntary code. Rowland has not explicitly said whether she will give the ACMA these information-gathering powers, which would allow it to legally request that tech platforms hand over information about how complaints are handled, how issues are acted on and engagement with harmful content.

In a statement, a spokesman for the authority noted it had asked the government for reserve powers to enforce compliance with industry codes almost a year ago.
