As the next federal election looms, Australia is not immune to the unprecedented power of Big Tech.
Facebook has already demonstrated how readily it will wield its power over Australia's access to information. Without a second thought, it "turned off the news" during a global pandemic and in peak bushfire season.
We also know Facebook failed to remove the bogus "death tax" claims during the last federal election, even though its own independent fact-checking process found the material to be false.
And we continue to watch COVID-19 misinformation on the platform grow, despite Facebook's assurances it is taking action to remove content that violates its rules.
As we seek to ensure the integrity of our election, we should keep in mind a major insight from the powerful testimony of the senior Facebook executive turned whistleblower, Frances Haugen, to the US Congress last week.
She finally put to rest a significant myth about the tech sector - that Big Tech companies such as Facebook can regulate themselves.
Facebook knows content that elicits an extreme reaction from you is more likely to get a click, a comment or a reshare, and is therefore more profitable for the company.
The platform uses what is known as "engagement-based ranking", where algorithms guide users towards less nuanced, more extreme content, to keep users on the platform for the purposes of ongoing value extraction.
This is the fundamental tension between what is best for the company and what is best for the public. It is unsurprising that Facebook persistently places "profits before people", as described by Haugen.
Given what is at stake, we can no longer accept laughable self-regulatory mechanisms such as the Australian Code of Practice on Disinformation and Misinformation.
The code, developed and launched by industry not-for-profit DIGI earlier this year, is voluntary and opt-in, with no enforcement and no penalties. Facebook has opted into every commitment, and yet its 50-page transparency report reveals little about features that have substantive impacts on our digital public square and our democracy.
For example, how do Facebook's algorithms rank content? Why are Facebook's AI-based content moderation systems so ineffective? Is it true, as Haugen stated, that only 10 to 20 per cent of hate speech is removed? What proportion of violating accounts, pages and groups are taken down? Who in Australian politics is on Facebook's "whitelist", a list of high-profile users who are seemingly exempt from moderation?
DIGI has announced that its code has now been bolstered by an oversight board and a public complaints facility. Yet these measures do not remedy the fundamental information asymmetry that Haugen has now revealed. The public cannot make meaningful complaints without a clearer view of how Facebook actually operates.
We need Facebook to release data, including about its algorithms, to an independent oversight body that works with academics and other agencies to conduct research and enforce compliance, with penalties where required.
For the coming federal election, Reset Australia has proposed that digital platforms should, at the very least, make available a "live list" of election-related mis/disinformation trending on social media, through a queryable database. The administration of the live list could sit under the Electoral Integrity Assurance Taskforce, which is made up of officers from a range of bodies including the AEC.
Approaches centred around increased content moderation are not systemic solutions, nor are they commensurate with the scale of the problem at hand.
Increased transparency through proper regulation is in the interests of those of all political persuasions, and is a critical first step in understanding the problem in greater depth and finding evidence-based solutions.
As Haugen told Congress: "When we realised Big Tobacco was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action. When we learned opioids were taking lives, the government took action. I implore you to do the same here."
It is not too late for the Australian government to take action - for the sake of our privacy, our democracy, and our public safety. Strong government action, rather than self-regulation, is the way forward.
- Dhakshayini Sooriyakumaran is director of technology policy at Reset Australia and a PhD candidate at the ANU's School of Regulation and Global Governance.