Member Policy and Privacy Briefing Jan 2021 with Peter Leonard

On January 28, 2021 | Policy and Regulation

The misunderstood and fraught world of regulating adtech

Australian Privacy Act Review – Phase 1 Submissions

The Australian Attorney General’s Department has quietly conducted the first phase of consultation on its low-key Review of the Privacy Act 1988. We won’t know for some time whether submissions received by the Review Team will have any real impact on the policy makers’ thinking. We do now know that over 140 submissions were received and have been published; a significant further number await publication, or were made confidentially. This is a rich haul for such a low-key consultation, particularly as the public consultation phase ran for a very short period, at the busiest time of the year. The Review Team now has several thousand pages of comments ranging across the 68 questions on which comments were invited. The views of the ACCC may still drive the policy makers’ agenda, but we now have a rich diversity of other views. Whatever the Review Team wished or hoped for, they now have plenty of food for thought.

It is impossible in this short column to do justice to over 140 different views ranging over 68 topics. In any event, I’m not paid to do that arduous task: that is a job for the would-be policymaker.

What can be said is that many submissions assert that there is rampant overuse of consumer data in the digital advertising services sector.

Many submitters believe this concern to be endemic within the data ecosystems that enable targeted digital advertising services.

Many submissions join the ACCC’s advocacy of extension of the legal definition of personal information about individuals to include so-called ‘technical information’ about devices and browsers.

Many submissions appear to believe that if there is a remote possibility that an individual might be identifiable in any way through association with use of a particular device or browser, then the potentially identifying information should be regulated personal information about an individual, regardless of controls and safeguards that data processors may deploy to mitigate risks of reidentification of relevant individuals.

Many submissions appear to assume that welfare of consumers and consumer benefit will be maximised by broadening the range of information within a revised definition of ‘personal information’. But this assumption may well be wrong. Broadening the range of information captured by a revised definition of ‘personal information’ would fundamentally change market dynamics in the digital advertising sector. This is likely to promote further concentration of data power in the hands of those market participants that most benefit from their ability to control particular data ecosystems and limit access to that data by other entities, regardless of how well the respective entities mitigate risks of reidentification of relevant individuals.

Perhaps most importantly for the digital advertising sector, reading these submissions confirms that there are widely held misconceptions about the nature and range of consumer data used in digital advertising services, and as to the ‘identifiability’ of humans through tracking of devices and browsers in the data ecosystems that enable targeted digital advertising services. Those misconceptions abound in many otherwise thoughtful submissions. As a result, there is very little discussion about the role for controls and safeguards in mitigating and managing risks of identification of relevant individuals through tracking of devices and browsers in the data ecosystems that enable targeted digital advertising services.

In the (current) Australian regulatory analysis, device identifiers and browser related online tracking code can be personal information about identifiable individuals in the hands of a particular entity, regardless of whether that entity uses those ‘identifiers’ for adtech. It is the capability of an entity to associate data with the identity of relevant individuals, and not an entity’s intent or actual business practice, that is relevant.

Provision to a recipient of the relevant online tracking code or device identifier is not disclosure of personal information about an (identifiable) individual, or collection by the recipient of personal information about an (identifiable) individual, if the recipient does not have the capability to associate an online tracking code or device identifier with an identifiable individual. Capability is to be assessed taking into account relevant controls and safeguards and levels of reidentification risk. For example, Facebook Custom Audiences is built upon a data architecture using anonymisation servers, whereby targeted ads are served to audience segments allegedly without either Facebook, or the creator and user of the advertising audience segment, knowing the identity of the individual Facebook users to whom the targeted advertising is addressed. Controls and safeguards really should matter, and already do matter.

Unless controls and safeguards, and sensible and objective evaluation of their likely efficacy, now become part of this policy debate, policymakers will likely conclude that controls and safeguards are not particularly relevant, or reliable. In particular, policymakers may conclude that new regulation should include new prohibitions, and need not be designed to promote deployment of controls and safeguards that verifiably limit which participant in a digital advertising data ecosystem shares which categories of information with which other participants in that ecosystem. If the debate continues to move in that direction, expect to see more intrusive regulation, far fewer players in the digital advertising sector, and further concentration of data power.

 

Review Process

Engaging with law reform can be much slower and less rewarding than watching paint dry.

When you brush or roll paint onto a wall and then watch it dry, you have the sense of reward for honest toil. You also leave your enduring mark – even when you chose the wrong colour.

Law reform bodies that call for submissions from the public in response to proposals sometimes really do wish to hear from submitters. Of course, they will always say that they do. However, often a public consultation phase is seen by policy makers as a hurdle that they must jump over before they go on to do what they always intended to do. Going to the trouble of putting in a submission can feel like starting a conversation with your MP at the sausage sizzle. Your views are politely heard, then disappear without trace.

Compared to our democratic nation peers, Australian regulators and policymakers do not score well in the public consultation stakes. In some peer nations, it is standard practice for a government entity seeking submissions to take active steps to educate potentially interested submitters as to why they may be interested in making a submission, to allow a reasonable time for submissions to be made, and for policymakers to publish a useful summary of the various themes that emerged in submissions received, before the policymakers move on to their next phase of development of policy.

Publication of ‘what we heard’ summaries is particularly valuable when (as is now common) submissions by various parties, taken together, run to thousands of pages. Policymakers derive the benefit of a free education from the hard work of submitters. They should share that free gift by closing the knowledge gap across the public, even if that means giving some oxygen to views that the policymakers do not endorse. Yes, many submissions are self-serving corporate bumph, a waste of electrons. Yes, some ‘privacy advocates’ seem to consider that data privacy is a human right somehow more important than human rights to public health and safety. But many submissions in most public consultations include insights that are the valuable fruit of unpaid toil by citizens. In any event, reflection back demonstrates good social skills in action – just like we as parents reflect back to the kids whatever it was the kids had to say, before going ahead to do whatever we as parents decree is right to do.

Publication of ‘what we heard’ summaries is rare in Australia. Such summaries are common in peer nations, including Canada and the United Kingdom. The UK Information Commissioner’s Office does them particularly well. Adtech providers do not like much of what the UK ICO has to say, but at least they know that they were heard. Some Australian government departments are particularly averse to ‘reflection back’ following receipt of submissions: the Department of Home Affairs consistently deserves special mention.

Australian policymakers almost always skip the summary of ‘what we heard’ altogether. Instead, they gleefully jump to stating whatever they have concluded, usually in a publication that selectively cites from a few submissions that support whatever it is that the policymaker then proposes. As a result, only those submitters with the resources to keep up with the range of views expressed – in other words, those with the human power (aka money) to read and distil thousands of pages – can know which of the views then expressed by the policymaker may reasonably be contested.

In practice, skipping the summary of ‘what we heard’ unfairly skews the policy development debate towards those entities that have the most resources to ‘keep up and out there’ with their views. This skew is not conducive to good public policy outcomes – unless the outcome you want is whatever the government wanted to do in the first place, or the outcome promoted by those entities with the most resources.

Over the last decade, two important and technically complex public policy debates have been skewed in this way: amelioration of climate change, and how to ensure fair and socially responsible uses of data about consumers and other citizens. And within the galaxy of users and uses of consumer data, we have the complex, rapidly evolving, unstable and misunderstood interplanetary system of digital advertising services and adtech.

The more technically complex the debate, and the greater the information asymmetries (concentration of relevant information in a few hands), the greater the need for policymakers to be humble about what they know, and what they really need to learn. Policymakers in Canberra and statutory regulators may mean the best, and try hard to achieve the best, but they often don’t know enough to quickly navigate to the right outcome. Good policy takes time, acquired knowledge and newly built skills.

 

All public submissions to Phase 1 of the Privacy Review can be viewed here

 
