
Last Friday was a good day for Google and Facebook. It was the day the Australian Competition & Consumer Commission (ACCC) released its report into digital platforms and their effect on traditional media.

While the ACCC was widely praised for taking a “tough position”, the report is largely good news for Google and Facebook, because it fundamentally reaffirms and cements their existing business models.

No doubt, the ACCC needs to be commended for its diligent and comprehensive analysis of Google’s and Facebook’s dominance of the online advertising market and their influence on the media industry. Its 613-page report is certainly worth reading.

Along with analysis, it provides important recommendations to promote competition, support the news media impacted by digital platforms, deal with the spread of fake news, and strengthen privacy protections.

But the real concern here is not just with the need to curb the impact of these giants on the media and in the marketplace but rather with the very business models that have led to their enormous success. That is, the ways in which the digital platforms organise their income generation, through targeted advertising made possible by the sale of their users’ privacy. Unfortunately, the recommendations amount to treating the symptoms by accepting the existence of the underlying condition.

In the public eye, Facebook is a social network and Google a search company, not advertising businesses. This is also how the companies like to portray themselves. And it is worth remembering this is how both started out: by addressing important internet challenges for their users.

But in a quest to make money, both companies increasingly came to monetise data on their users’ preferences, collected initially as a by-product of their services, to offer targeted advertising at a level of detail unheard of in traditional media.

Over time this data collection came to dominate platform design, enticing users to spend more time and give up more data. As a result, the distribution of user content, and increasingly journalistic content, became a means of fuelling data collection and optimising advertising income. What drives the design of the services, and what is merely a by-product, had reversed.

Unfortunately, the ACCC’s recommendations accept those business models, rather than challenging them. They accept that both tech giants are advertising businesses, that both appropriate journalistic content for their own gain, that they trade in user privacy, and that their algorithms tend to spread misinformation. By proposing measures to monitor and curb each of those practices, the ACCC tacitly accepts their existence, lending legitimacy to the business models that bring about those issues.

This is good news for the platforms, because they can now deal with regulation on their own terms. Indeed, it is the very reason Mark Zuckerberg has repeatedly asked for Facebook to be regulated, to avoid more fundamental interference with its business model.

And yet, there are more far-reaching proposals under serious consideration overseas, such as calls by US presidential hopeful Elizabeth Warren and others to break up big tech companies (which might be neither practical nor likely). There have also been demands for the repeal of Section 230 of the Communications Decency Act of 1996, which absolves a digital platform of responsibility for content posted by its users. Such a move would fundamentally challenge the algorithmic nature of content distribution on these platforms. Importantly, such proposals ask serious questions about the business models themselves.

Despite the valuable analysis by the ACCC, there is still not much public awareness of how digital platforms operate. This is a significant problem, because it is the very nature of their operations that brings about the problems we are trying to solve.

Without foregrounding these fundamental issues, nothing will change. The public will be appeased by what is proposed; the platforms can point to the regulator when something goes wrong and argue they are being watched. Yet their advertising dominance will keep growing, and with it the adverse effects on information sharing and privacy, despite better, taxpayer-funded (and likely increasingly expensive) processes for monitoring them.

Users will not abandon the platforms. Raising awareness about the symptoms, such as privacy breaches, did not make much difference in the past. Very few users change privacy settings, and only a tiny minority quit Facebook in the wake of the Cambridge Analytica scandal.

Two things need to happen. First, public discourse must take account of the true nature of digital platforms. More awareness will provide lawmakers with a mandate to make fundamental changes. To be fair, such changes are beyond any Australian regulator.

Second, meaningful changes need to focus on the inner workings of these platforms. Once governments realise that various digital platforms are now akin to other infrastructures essential for our society, doors might open to allow more fundamental changes.

We need to rebalance the value that derives from these platforms for all stakeholders — users, advertisers, media companies and the wider society for which these platforms provide a vital infrastructure.

You can listen to Sandra and Kai talk about #BreakUpBigTech on The Future, This Week.


This article was originally published in The Australian. Read the original article.

Note: rights to this article belong to the original publisher, The Australian. Please contact them for permission to republish.
