Why is reform of data privacy law slow and contested?
Reform of data privacy law is difficult and contested.
There are various not-so-good reasons why it has taken the Australian Government over two decades into the 21st century to start serious discussion about making the Australian Privacy Act fit for purpose for this century. The most important not-so-good reason is that not enough politicians, businesses and government agencies take data privacy seriously.
The zeitgeist, and perceptions as to urgency, are changing.
Applications of artificial intelligence and new surveillance technologies are creating global alarm.
Businesses now being forced to retrofit better data privacy practices into their existing business models and data architectures are learning the expensive lesson that retrofits hurt.
Some policymakers now see political advantage in being seen to be tough on ‘global digital giants’ and ‘data brokers’.
However, there remains one good reason why reform of data privacy law is slow. Creating a revised, balanced and workable data privacy statute is harder than it first appears. This is particularly the case because many concerns relating to use of consumer data spill over into faster changing areas of competition regulation and consumer protection law, and the enforcement remit of the comparatively well-resourced ACCC.
Some privacy advocates argue that the principal problem is lack of enforcement by the privacy regulator, and not the coverage or content of data privacy law. Many regulated entities have not invested in good data privacy practices, creating a significant gap between what they say and what they do, or what they (reliably and verifiably) don’t do. Other entities adopt a ‘tick-the-box’ strategy, treating compliance as an exercise in form over substance, and provide ‘transparency’ through buried and opaque privacy policy ‘disclosures’ of their privacy affecting acts and practices.
Consumer organisations and privacy advocates express concern that the privacy regulator is not funded sufficiently to conduct investigations and bring legal proceedings that could hold to account those regulated entities who don’t do what they say (“we take your privacy very seriously”) in their privacy disclosures. Many data protection regulators are under-resourced, so their enforcement action is selective. Regulators have also been required to divert limited resources to addressing year-on-year increases in the number and complexity of data breaches, and to investigating and addressing an ever-increasing variety of concerns about the data handling practices of businesses.
However, there is growing consensus that data privacy statutes require a substantial overhaul. A key issue is addressing ‘the illusion of choice’. Regulation today focusses upon creating transparency (through privacy disclosures) for citizens, to enable citizens to exercise choice. As we all know, choice is often impractical and usually made difficult to exercise. Notice and consent fatigue is real: is there anyone who doesn’t suffer from it? But if we shift the focus of regulation away from consumer choice and towards requiring more responsible practices and greater accountability of regulated entities, how does the statute ensure that regulated entities have sufficient certainty (clarity and predictability) as to ‘what good practice looks like’?
We don’t want to carry forward the expensive retrofit problem by requiring entities to make (even well-advised) guesses as to what “fair and reasonable” means, only to discover that a regulator or a court takes a fundamentally different view. Judges regularly disagree as to whether a particular contract term is “unfair”, even though that determination is much easier than discharging a positive obligation to be fair and reasonable. Should regulated entities be expected to work out what is “fair and reasonable”, at least in the absence of well-developed guidance and guardrails?
Another reason why reform of data privacy law is often highly contested is that the issue in debate is frequently not clearly within the frame of privacy law.
Sometimes the real issue is who should control data about consumers. Many individuals want a right to say ‘no’ to a broad range of practices that they see as impinging upon their ability to go about their movements and activities unobserved. They seek more than the ability to opt out from targeted advertising. Some individuals consider that entities monetising consumer data, particularly through use of that data by third parties, should share financial benefits with them, and not just by provision of ‘free’ services. Some citizens worry that entities may act responsibly in what they themselves do, but ‘turn a blind eye’ to what other entities within multiparty data ecosystems do.
Contest as to the scope of data privacy law now involves discussions of organisational social responsibility, ethics and social licence, erosion of digital trust, enabling digital inclusion, enabling societally beneficial uses of data, addressing the emerging panopticon of surveillance and ‘profiling’, biased and discriminatory algorithms and AI, online security and safety, and protection of children and other vulnerable people.
Law reform is never easy. Scoping and devising appropriate reforms of data privacy law is particularly difficult. We can’t sensibly expect otherwise.
Peter Leonard
December 2021
Copyright © Peter Leonard (Data Synergies) 2021.
Peter Leonard is a business consultant and lawyer advising data-driven businesses and government agencies. Peter is principal of Data Synergies and a Professor of Practice at UNSW Business School (Information Systems and Technology Management, and Management and Governance). Peter is immediate past chair of the AI Ethics Technical Committee of the Australian Computer Society and of the Privacy and Data Committee of the Law Society of New South Wales. Peter was a founding partner of Gilbert + Tobin and is now a consultant to that firm. He serves on the NSW Government’s AI Review Committee and Information and Privacy Advisory Committee, and on a number of corporate and advisory boards.