Treated like children

As published in The Saturday Paper, March 11, 2021

The Online Safety Bill was introduced by the Morrison government last month with much fanfare about its mission to improve and promote the online safety of Australians. Much of the analysis of the bill has focused on its attempt to minimise harm to children online and to restrict the non-consensual sharing of intimate images across digital platforms.

The bill aims to do this by providing several new powers to the eSafety Commissioner, an office that was established in 2015 to protect and promote online safety for children. So far the bill sounds uncomplicated, noble and necessary. Indeed, protecting children and ensuring they can engage online safely is imperative.

The pandemic has shown just how dependent young people are on the infrastructure of the internet for accessing basic services, including education. Yet the Online Safety Bill goes far beyond the bounds of what’s necessary to ensure the online safety of children. This should come as no surprise, as the mandate of the office of the eSafety Commissioner has been growing substantially since its establishment. But what might be appropriate for protecting children can have very different effects when it comes to the rights of adults online.

The rhetoric about safety too easily masks the dangers of wide-ranging powers without accountability.

Broadly, there are several key components of the bill. There is a cyber-bullying scheme, which would allow for the removal of material that is harmful to children; an adult cyber-abuse scheme, to remove material that seriously harms adults; and an image-based abuse scheme, to remove intimate images that have been shared without consent.

These areas focus on creating pathways of redress for children and adults suffering online bullying, abuse and the non-consensual sharing of intimate images. Each is a valid concern, as such online issues can translate into significant real-life harms. While there is some small room for improvement, these powers are mostly responsive to complaints made by those harmed, which makes them appropriate and justifiable.

But other parts of the bill are of great concern.

The bill introduces something called the “basic online safety expectations”, which will allow the eSafety Commissioner to hold services accountable to wholesale industry standards and requirements. The bill also includes an online content scheme, for the removal of “harmful” material through takedown powers. Finally, the bill establishes an abhorrent violent material blocking scheme, to allow the commissioner to block websites hosting material deemed too violent.

These parts of the bill contain largely proactive powers, meaning it will be at the discretion of the eSafety Commissioner to search the internet for content that would be in breach. By establishing a set of expectations for platforms and services, the bill also encourages them to police themselves. There are very few avenues for appeal, either against a decision made by the commissioner or by a platform. The commissioner potentially becomes the internet cop-at-large. In essence, the proposal gives a shocking amount of discretionary power to an administrative official to determine what content adult Australians may or may not view and interact with.

For scope, the bill relies heavily on the National Classification Code to determine which content may be issued with a removal notice.

But the code has long been criticised for being outdated and overly broad. In general, class 1 covers content that would be deemed “Refused Classification” (RC). This includes content that deals with sex or “revolting or abhorrent phenomena” in a way that offends against the standards of “morality, decency and propriety generally accepted by reasonable adults”. Class 2 material includes content that is likely to be classified as X18+ or R18+. This includes non-violent sexual activity, or anything that is “unsuitable for a minor to see”. Under the Online Safety Bill, anything that falls under either of these classes of material can be subject to removal notices by the eSafety Commissioner.

Most obviously, this would cover sexual content that many Australians engage with regularly, and consensually. It would also likely cover content that might be used for political accountability or even satire. It is easy to imagine how video content of violent misconduct by police might well be subject to the commissioner’s powers, even if this is not the stated intention of the proposal. There is historical evidence from around the world showing that videos of human rights abuses and documentation of state abuses of power are the first to go under powers such as those proposed by the bill.

Perhaps most troubling, the measures create a chilling effect for digital platforms, which will likely elect to pre-emptively take down content rather than wait for the commissioner to demand its removal. There have been some media reports covering statements by the communications minister, Paul Fletcher, and the eSafety Commissioner stating that the provisions of this bill would not be used in the ways human rights groups fear – that is, to repress freedom of expression, deplatform legal sex work or stifle political discourse. However, some of these statements directly contradict the intentions set out in the bill’s explanatory memorandum.

Laws that are made can be used, and critics of this bill have every right to assume as much. Public policy should not be left to the cult of personality, no matter how amicable the current eSafety Commissioner may be. The best protection against unintended consequences is for this legislation to be surgically precise in outlining the powers that the office holds and how it may exercise them. And this must be underpinned by proper oversight and accountability.

Anything less will mean that the regime contained in the Online Safety Bill has enormous potential to limit our public discussions, our capacity to hold the powerful to account and our right to use the web with autonomy. For those concerned by this proposal, the consultation process around this bill gives little cause for comfort.

The draft bill went from infancy to being tabled in parliament in a matter of weeks. The consultation by the Department of Communications was open for only a month and closed with a whopping 376 submissions from a wide range of interested parties. Yet the bill went on to be tabled in parliament 10 days later with no significant amendments. The failure to contend with significant public concern suggests a deliberate effort to evade debate and public scrutiny.

There can be no reasonable expectation that the consultation done by the government was thorough or adequately addressed the concerns raised by stakeholders.

The parliamentary process has been equally rushed. The Senate committee on communications solicited submissions on the draft bill for three business days, followed by a public hearing two days later and a final report by the committee within a week.

For perspective, such a process normally takes several weeks, if not months, depending on the parliamentary schedule and the urgency of the legislation.

This approach to process illustrates that the government views its mandate to regulate content on the internet as absolute.

Rather than leading a nuanced and necessary debate about the responsibilities of online autonomy, this proposal shows how little the government thinks of the general public and its responsibility towards it. The rhetoric and the rushing betray an understanding that there is an absence of social licence for such measures. The fact that the government is persevering in the face of serious concerns from the community shows that it thinks we can be treated like children.