Australia’s privacy regime is weak compared to most other major economies. Yet on Friday, an Australian court ordered Google to pay AUD 60 million (£35 million) in fines for its “deceptive” data collection practices.
The case reiterates that even where data protection law fails to prevent privacy violations, regulators can often still find a way to sanction businesses. It also serves as a reminder of the importance of transparency and of a comprehensive approach to governance, risk and compliance.
Google’s fine originated with a court filing from the Australian Competition and Consumer Commission (ACCC) in October 2019 that accused Google of violating the Australian Consumer Law (ACL).

The ACL sets rules around misleading, deceptive or unconscionable conduct, unfair contract terms and unfair practices. 

The ACCC argued that, between January 2017 and December 2018, Google provided misleading information during its account setup process about the collection of location data from Android devices.

In April 2021, the Australian Federal Court ruled in favour of the ACCC regarding these allegations about the setup process.

“The Court ruled that when consumers created a new Google Account… Google misrepresented that the ‘Location History’ setting was the only Google Account setting that affected whether Google collected, kept or used personally identifiable data about their location,” said an April 2021 ACCC press release.

“In fact, another Google Account setting titled ‘Web & App Activity’ also enabled Google to collect, store and use personally identifiable location data when it was turned on, and that setting was turned on by default.” 

Friday’s ruling confirmed that the AUD 60 million (£35 million) penalty sought by the regulator was “fair and reasonable”.

What About Privacy Law? 

As noted, despite the privacy violations that occurred in this case, the fine was issued by a competition and consumer rights regulator.

Google’s byzantine account setup process has been subject to enforcement action before, most notably in France, where the country’s data protection authority, the CNIL, issued a EUR 50 million (£42 million) fine in January 2019.

The CNIL found that Google’s account creation process violated the GDPR due to a “lack of transparency, inadequate information and lack of valid consent”.

So, given how similar the issues at hand were, why did France fine Google under data protection law while Australia turned to consumer law?

Australia does have a privacy law: the Privacy Act 1988, which is enforced by the Office of the Australian Information Commissioner (OAIC). The law is relatively limited in scope, exempting most small businesses and centring on the application of the “Australian Privacy Principles”.

Although the Privacy Act is a somewhat outdated and light-touch statute, Google’s conduct arguably violated the law, which requires that individuals be sufficiently informed about the purposes for which their personal information will be used before providing consent.

However, fines under the Privacy Act max out at AUD 2.1 million (£1.2 million). An ongoing consultation proposes increasing the maximum penalty, but even under these proposals the fine ceiling would rise only to AUD 10 million (£5.8 million).

The FTC vs Facebook

Australia’s £35 million fine is not the first time that consumer rights law has been used to sanction a company accused of violating people’s privacy.

A high-profile example is the US Federal Trade Commission (FTC)’s action against Facebook following the Cambridge Analytica scandal, which resulted in a USD 5 billion (£4.1 billion) fine in July 2019.

“The $5 billion penalty against Facebook is the largest ever imposed on any company for violating consumers’ privacy,” the FTC said in a press release.

But it wasn’t a privacy law that empowered the FTC to recover such a large amount from Facebook.

Instead, the FTC fined Facebook under the FTC Act (15 U.S. Code § 45), which, much like Australia’s ACL, prohibits “unfair or deceptive” trade practices but does not mention privacy.

The central allegation against Facebook was that the company had made false promises to users about how their personal information would be shared—not that it had allowed their privacy to be violated per se.

The FTC also found that Facebook had violated a 2012 FTC order requiring the company to stick to its terms of service regarding the use of and access to user data.

Arguably, there is still no federal privacy law that would have prohibited the privacy violations enabled by Facebook during the Cambridge Analytica incident.

America’s patchwork of state laws (none of which were in force at the time), data breach notification requirements and sector-specific rules would not have been an adequate mechanism to sanction Facebook for its failure to protect its users’ data.

But despite this lack of privacy protection, Facebook’s FTC settlement remains the largest sum ever paid to a regulator in a case concerning privacy. It far surpasses even the largest GDPR fine on record, the EUR 746 million (£618 million) penalty imposed by Luxembourg’s regulator against Amazon last year.

Where Privacy Law Fails, Consumer Law Steps In

The Australian case against Google shares some commonalities with the FTC’s settlement with Facebook:

  • Both actions proceeded under consumer protection law—not privacy or data protection law

  • Despite this, both Google and Facebook were accused of violating their users’ privacy by the respective regulators

  • Both cases took place in jurisdictions with relatively weak privacy and data protection regimes that would arguably have failed to provide a suitable basis for enforcement against these privacy violations

  • Both regulators took aim at transparency failings and deceptive practices—rather than enforcing rules on privacy per se

In the EU, the GDPR and ePrivacy Directive provide an adequate legal framework for regulators to pursue violations of user privacy.

But even when operating in countries without such strong data protection regimes, enforcement risks remain if businesses act deceptively or fail to treat user data fairly.