By Rob Nicholls
We are part-way through the work of the Joint Select Committee on Social Media and Australian Society. The committee’s interim report was due on August 15, but has been delayed by the previous chair’s promotion to cabinet.
So how well is the federal government regulating social media companies? This report card focuses on news and dangerous or inappropriate content.
A mixed report card
There are two critical issues here. The first is whether the social media companies are assisting in their own regulation. The second is the extent to which they are meeting their (implied) social obligations.
An example is Meta (owner of Facebook) and the eSafety commissioner. The commissioner has asked social media businesses to find out just how many Australian children are on their platforms and what measures they have in place to enforce their own age limits. For most platforms, the age limit is 13.
Meta takes the view that parents should manage their children’s Meta accounts. In other words, the regulated business has decided that other people (parents) should enforce the self-regulatory framework Meta itself designed.
On age verification, the government has signalled it does not believe Meta can enforce its own rules, and it proposes to set a new minimum age itself. The details are still unclear.
At the same time, Meta is still giving evidence that it may block news content, as it has done in Canada, if it is forced to negotiate deals with news media businesses.
For three years, the News Media Bargaining Code has worked by leveraging the risk of “designation”. The minister (usually the treasurer, but currently the assistant treasurer) may designate a digital platform business if it has a bargaining power advantage over news media businesses but is not making a significant contribution to the sustainability of the Australian news industry. Having survived withdrawing news services in Canada, Meta now takes the view that this risk is substantially mitigated.
X: could do better
Although Meta pushes back against age-verification regulation, it is generally responsive to take-down notices. This is partly because it has a team in Australia to deal with those.
X Corp (formerly Twitter) does not. The primary reason X was shut down in Brazil is that it had no legal representative in the country on whom notices could be served.
X has little presence in Australia. Regulatory enforcement requires someone to be regulated, and this absence is the primary blot on X’s report card. It is very difficult to assess the effectiveness of regulation when the regulated business is not there to be regulated.
At the heart of the problem with regulating X Corp, in whichever country the regulations are applied, is the owner’s unwillingness to be regulated. Conflating the removal of inappropriate content with US-centric free-speech arguments will always be problematic outside the US.
Good regulation relies on at least the tolerance of being regulated.
News: alternatives available
So, if the News Media Bargaining Code is not going to be a significant mechanism for funding public-interest journalism, there needs to be another solution. One approach is to impose a digital services tax.
However, this becomes risky if it looks like a tax selectively applied to specific international businesses. Australia has made commitments at the OECD on how it will deal with profits diverted to low-tax countries.
The University of Sydney has proposed an alternative approach to the joint select committee: an industry levy on the class of businesses that provide digital content services. This would fund public-interest journalism without compromising Australia’s international obligations in either tax or trade.
Advertising issues
Meta has strong self-regulatory policies on advertising crypto products and services. However, the Australian Competition and Consumer Commission (ACCC) has alleged that more than half of the crypto ads on Facebook are scams. Given that scams are a significant problem in Australia, it is not surprising that all of the relevant regulators are concerned about this issue.
Perhaps this is one of the most important aspects of the regulatory report card. There are four relevant regulators in Australia: the ACCC, the Australian Communications and Media Authority (ACMA), the Office of the Australian Information Commissioner (OAIC) and the eSafety Commissioner. Together, they form an important, but unfunded, group called DP-REG (the Digital Platform Regulators Forum).
This group focuses on achieving regulatory coherence and clarity. It also assesses and responds to the benefits, risks and harms of technology. That is, it forms the basis for developing stronger, multilateral regulatory responses to social media issues.
The group has the potential to look at how money flows as well as content. However, co-ordination is much easier with appropriate funding.
A coherent approach from these regulators offers the best possible potential for an improved regulatory report card.
Rob Nicholls, Senior Research Associate in Media and Communications, University of Sydney
This article is republished from The Conversation under a Creative Commons license. Read the original article.