President's reflection

How should we regulate digital technology?

19 November 2020

Professor Hugh Bradlow FTSE

President of the Australian Academy of Technology and Engineering

Over the years ATSE has brought the technological and engineering wisdom of its Fellowship to bear on various aspects of regulation involving technology: energy, water, food, genomics, minerals and many others.

As a Fellow, I have from time to time noted an issue on which we are silent but where I believe we can, and ought to, contribute valuable advice to the public or political discourse.

One such area is the regulation of the digital technology that underpins our 21st-century society. I believe the time has come for us to dive into these murky waters, as governments around the world look to tighter legislative and regulatory frameworks for “Big Tech”, including social media, search, online shopping and user-generated content.

Our own government is in the process of introducing a bill that aims to regulate Big Tech more tightly. My personal view is that the bill as currently drafted could prove contrary to the interests of ordinary Australians. The proposal to require Google and Facebook (amongst others) to pay newspaper publishers for the stories they link to may unfortunately reduce the breadth of reliable news and information available via social platforms.

Concerningly, research has shown that conspiracy theorists tend to get more of their news from other users of social media than from mainstream media (e.g. see this Pew research) – underscoring the importance of a breadth of perspectives and sources on these platforms.

Legislation and regulation typically lag behind technical and scientific capability, and it is important that governments heed the advice of technical as well as legal and social experts when attempting to create new parameters for the use of technology. I encourage Fellows of the Academy – as experts and leaders in the field – to examine closely both the intended and the unintended consequences of proposed changes to technology regulation, and to provide input and advice that guides regulation towards the best possible framework for the technological and social situation as it stands now, and as it is likely to stand in the near future.

This is an area where governments all over the world struggle, as lawmakers usually lack the technological skills to explore those unintended consequences. In my view – given our mission to help Australians understand and use technology for the benefit of society – it is our duty to help governments properly understand technology and its real and potential impacts.

Let me illustrate with a couple of examples. Social media companies have thus far been able to identify as “common carriers”. This is a notion that originates in the telecommunications industry, under which the carrier is not deemed liable for the information it transmits across its networks. If you and I conduct seditious speech on our mobile phones, the telco is not liable for our crimes.

As a result of this common carrier assumption, social media companies can legally allow lies, hate speech, foreign interference, conspiracies and the like to propagate across their platforms. In theory the person who posts the information is liable, but in practice the capacity to prosecute such perpetrators is virtually non-existent (how, for example, do you prosecute a Russian troll farm?).

One answer could be to treat social platforms such as Facebook, YouTube and Twitter as media companies, which would expose them to the same legislative and regulatory restrictions that cover news media companies. Applying these restrictions would be complicated, in part because, in some respects, social media companies do look like telecommunications companies via their messaging apps. ATSE could assist in identifying not only appropriate potential controls, but the means to apply them. My personal suggestion would be to use Dunbar’s number as a breakpoint, requiring platforms to be treated as media companies for any open audience of more than 150 people.
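To make that suggestion concrete, here is a minimal sketch in Python of how such a breakpoint might be expressed as a simple decision rule. It is purely illustrative – the function name, the “open audience size” input and the exact threshold are my own assumptions, not anything drawn from proposed legislation.

    # Illustrative only: classify content by the regulatory treatment it might
    # attract, using Dunbar's number (~150) as the breakpoint suggested above.
    DUNBAR_NUMBER = 150  # approximate size of a stable personal social network

    def regulatory_treatment(open_audience_size: int) -> str:
        """Suggested treatment for content reaching a given open audience."""
        if open_audience_size > DUNBAR_NUMBER:
            # Broadcast-scale distribution: treat the platform as a media
            # company, with the obligations news publishers already carry.
            return "media-company obligations"
        # Small, personal-scale audience: closer to private communication,
        # analogous to a telecommunications carrier service.
        return "carrier-style treatment"

    if __name__ == "__main__":
        for audience in (30, 150, 5_000, 2_000_000):
            print(audience, "->", regulatory_treatment(audience))

The point is not the code itself, but that a rule of this kind is simple enough to be written down, audited and applied consistently.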

The second example is the ownership and use of people’s personal information. The data we as consumers and users hand over is what enables search and social media platforms to anticipate our actions, curate content to our interests, and target tailor-made advertising. As consumers, we generally like this, as it makes content and search results focused and relevant. However, the question needs to be asked: who controls this data – the company or the user? We need regulation to address this both in relation to competition and to privacy.

In addition, our personal data raises another complex issue. It is possible to use this data to infer psychographic profiles of users, which can then feed algorithms that define the content pushed to individuals, thereby leading to confirmation bubbles. These open the door to potential manipulation and the explosion of conspiracy theories. As challenging as it seems, I believe well-drafted legislation could make these algorithms accountable. As engineers we know that you control systems via measurement; one approach could be to establish metrics that measure the behaviour of these algorithms, together with a requirement to report against them, thereby empowering citizens to understand whether they are being manipulated.
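As one illustration of what such a metric could look like, the Python sketch below measures the diversity of sources in a user’s recommended feed as Shannon entropy – a low score indicates the kind of highly concentrated feed that produces confirmation bubbles. The feed format, the names used and the choice of entropy are my own illustrative assumptions, not anything mandated by legislation or used by the platforms.

    # Illustrative transparency metric: how diverse are the sources a
    # recommendation algorithm shows a user? Measured as Shannon entropy (bits).
    import math
    from collections import Counter

    def source_diversity(feed_sources):
        """Shannon entropy of the sources appearing in one user's feed.

        0.0 means every recommended item came from a single source; higher
        values mean the feed draws on a broader mix of sources.
        """
        if not feed_sources:
            return 0.0
        counts = Counter(feed_sources)
        total = len(feed_sources)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    if __name__ == "__main__":
        narrow_feed = ["outlet_a"] * 48 + ["outlet_b"] * 2
        broad_feed = ["outlet_a", "outlet_b", "outlet_c", "outlet_d", "outlet_e"] * 10
        print(f"Narrow feed: {source_diversity(narrow_feed):.2f} bits")
        print(f"Broad feed:  {source_diversity(broad_feed):.2f} bits")

Reported across cohorts of users over time, even a simple measure like this could help regulators and citizens see whether a platform’s algorithm is systematically narrowing what people are shown.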

Digital technology regulation is a complex and challenging area of law that is still emerging and evolving – these are vital discussions we need to have with our government and the public if Australia is to move forward on a fair and progressive basis. I believe ATSE should get involved because we have the capacity to act as informed and unbiased participants, something that would be of immense value to the government as it navigates these difficult waters.