Expanding open data requires social licence: information mandarins

By Stephen Easton

January 23, 2017


At a conference last year, a prominent public sector technologist said he wished “everyone would stop caring about privacy so much” when asked which barrier to digital transformation he would like to magically remove.

Pedro Harris, who led the NSW whole-of-government data centre project, is no longer in the public sector but the exasperation that lay behind his wave-a-magic-wand scenario remains widespread. Obviously, it’s not going to happen.

The Productivity Commission, in its draft report on data availability and use, proposes extensive reform that aims to unleash more of the latent value in large stores of public and private sector information. It recognises that any overhaul must clarify and strengthen individual rights to privacy.

Two of the Commonwealth’s information custodians, Information and Privacy Commissioner Timothy Pilgrim and National Archives director-general David Fricker, both welcome the opportunity for serious reform. Both also hope the final report in March is significantly revised compared to the draft.

The time is ripe for this reform, which could have far-reaching consequences all over the country. Big data enthusiasts want to go full steam ahead, but issues that hit the mainstream news last year threaten to put the brakes on, in the public sector at least.

The delayed backlash to the changed Census data collection process showed how easy it is for public servants to underestimate public unease about the new era of big data and assume they have the social licence to pursue their newest data-related project, when in fact they do not.

Two re-identification incidents, which saw the Department of Health and the Australian Public Service Commission rethink how to anonymise or de-identify data so that it falls outside the Privacy Act, did not help.

Social licence

Fricker says the PC’s draft recommendations, including a simpler privacy framework and a clear “comprehensive right” for Australians to control data about themselves, are a useful addition to the debate.

“As we move further and further into the information society, and we’re all living in this world with incredible information abundance … it is very important for us to keep revisiting this contract, if you like, between government and the people, to make sure that the information we collect and the way we manage that information protects and respects personal privacy, it maximises government accountability, and it is used in the most efficient way to deliver better government services,” he told The Mandarin.


Timothy Pilgrim has a dual interest as both the privacy commissioner and the information commissioner, responsible for administering the Freedom of Information Act — which, he points out, already recognises government information as a national asset that should be made available publicly for the good of the community wherever possible.

“But at the same time, putting on my privacy commissioner’s hat, where that data is going to be derived from personal information, we must ensure that it is protected and handled in a way that really meets with the broader community expectation about what’s going to happen with their personal information,” he told The Mandarin.

And it’s government agencies that need to work on building that social licence, he explained, not so much the “big G” government.

“I think on the whole, the community is very accepting that this data is very valuable and important … but they want to understand what is happening and they want to be more aware about how that information is going to be used,” said Pilgrim.

“And I think we’ve seen a bit of an inkling about the nervousness or the concern the community will have around this through the recent issues with the Census. Albeit that there was a denial of service attack, if you look around that, there were already issues of community concern around the period for which the identifying information would be held.”

“And although there was the argument put forward by the ABS that they had advised that this was the intention, I think we saw through media articles and general concerns and issues coming to our office [that there was not] a great understanding that that was going to happen.”

Pilgrim’s submission suggests the PC’s draft did too little to explain how governments and their agencies would build social licence for the reforms it proposed. Simply giving people a new comprehensive right to access their information “will not necessarily build support for the government to use and share individuals’ data for secondary, unspecified purposes”, argues Pilgrim:

“A social licence for data use will be built on a number of elements. First, governments must be transparent about their intentions, so that individuals actually understand what the data reforms may mean for their personal information. Second, there must be meaningful consultation with individuals, to find out what uses of data the broader community believes are valuable, and reasonable. Third, governments must respond and take public opinion into account when making decisions.

This may mean, ultimately, that there is community support for only some proposed uses of data — rather than all those that government and business may desire. However, in a democracy, having broad community support for reforms of this nature is essential.”

He firmly disagrees with those who see the Privacy Act as a “blocker” standing in the way of releasing data, and thinks the draft report was too vague on what exactly would replace the current “flexible, technology-neutral and principles-based” regime. His submission adds:

“Most crucially, the draft report does not address what new uses of data should be permissible under the model.”

His submission also argued the PC’s proposed new definition of “consumer data” would duplicate the existing definition of “personal information” in the act. Neither does he support the proposed “comprehensive right” in its draft form.

More recently, however — after Pilgrim spoke to The Mandarin and sent his second submission to the PC last year — The Conversation reported the legal definition of “personal information” in the act had been narrowed significantly by a Federal Court ruling in a case between the privacy commissioner and Telstra.

Pilgrim also told the PC he was concerned that dividing regulatory power between his office and other bodies like the Australian Competition and Consumer Commission would only fragment and overcomplicate the system:

“However, I am broadly supportive of proposals which strengthen individual rights to access information. In my view, the Commission’s proposed expansions to existing access rights can be achieved most efficiently through enhancing the existing framework in the Privacy Act.”

The commissioner believes the cases that come across his desk demonstrate that a significant number of Australians are aware of their rights and how to exercise them — 83% of FOI requests are people trying to get their personal information, and 16% of complaints he receives are about difficulties in doing so.

While broadly supporting the PC’s intention to strengthen citizens’ rights, he can’t see why a new act is required when the existing laws could simply be amended, retaining their well known and understood principles.

“What we want to avoid is having too much of a potential for regulatory overlap, which can translate into increased regulatory burden on organisations and agencies trying to work out where they should go when dealing with data in a particular way,” said the two-hatted commissioner.

De-identification details

One key issue that reared its head last year and isn’t going away is the fact that de-identification of data is not as simple and reliable as some public servants thought — and the recent Federal Court ruling seems to confirm that data which could easily breach a person’s privacy after it is linked with other data is not protected by the Privacy Act.

Pilgrim argues that best practice, a risk-based methodology using appropriate cryptographic methods, needs to be applied more consistently. De-identification must also be recognised not as an absolute assurance but as a trade-off between the utility of the data and the risk to the privacy of the individuals the supposedly anonymised data describes.
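
The linkage risk Pilgrim is pointing to is easiest to see with a toy example. The sketch below uses entirely hypothetical data and field names (it is not any agency’s actual method) to show how a record stripped of names can still be re-identified once quasi-identifiers such as postcode, birth year and sex are matched against an auxiliary dataset.

```python
# Hypothetical illustration of re-identification by linkage.
# "De-identified" health records: names removed, quasi-identifiers retained.
health_records = [
    {"postcode": "2600", "birth_year": 1975, "sex": "F", "condition": "asthma"},
    {"postcode": "2600", "birth_year": 1982, "sex": "M", "condition": "diabetes"},
]

# Publicly available auxiliary data (for example, an electoral roll or a social profile).
public_profiles = [
    {"name": "Jane Citizen", "postcode": "2600", "birth_year": 1975, "sex": "F"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(records, profiles):
    """Link supposedly anonymous records to named profiles via shared quasi-identifiers."""
    matches = []
    for record in records:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in profiles
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match means the record is re-identified
            matches.append((candidates[0]["name"], record["condition"]))
    return matches

print(reidentify(health_records, public_profiles))
# [('Jane Citizen', 'asthma')]  (the "anonymous" record is no longer anonymous)
```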

Last year, the commissioner for both privacy and FOI ran a workshop attached to the GovInnovate conference, which successfully laid bare the considerable differences of opinion between experts and the need for a much better understanding of de-identification in the public sector. In his expansive and detailed submission to the PC, Pilgrim argues the unreliability of de-identification means fully open publication of datasets containing personal information is of limited usefulness:

“In my view, it is unlikely that any high-value datasets containing personal information will be able to be sufficiently de-identified to enable general, open publication (in a manner that also preserves the integrity of that data). These types of datasets require additional controls to be in place to prevent re-identification.

The ‘trusted user’ model proposed in the draft report could instead be used to increase the value and availability of these datasets, while maintaining appropriate access controls.”

The PC needs to develop a robust and accountable way of dishing out “trusted user” status, he adds, and should probably also lower its expectations for the number of organisations that would make the cut.

One model Pilgrim suggests could see a small number of trusted users — government statistical agencies, for example — “do the analytical work for researchers or on behalf of researchers and policymakers and give them back just the raw information they need” with no possibility of re-identification.
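
In practical terms, that model keeps the unit records with the custodian and hands back only safe, aggregated outputs. The sketch below is a rough illustration of the idea, assuming an arbitrary minimum cell size and made-up field names; it is not a prescribed rule or any agency’s actual system.

```python
# Hypothetical sketch of analysis-in-place: the custodian runs the query
# and returns only aggregates, suppressing cells small enough to risk
# re-identification. The threshold here is an assumption for illustration.
from collections import Counter

MIN_CELL_SIZE = 10  # assumed suppression threshold

def aggregate_by(records, field):
    """Return counts per category, suppressing small cells."""
    counts = Counter(r[field] for r in records)
    return {
        category: (n if n >= MIN_CELL_SIZE else "suppressed (<10)")
        for category, n in counts.items()
    }

# The researcher never sees the unit records; only the safe aggregates are returned.
unit_records = [{"condition": "asthma"}] * 25 + [{"condition": "rare disease"}] * 3
print(aggregate_by(unit_records, "condition"))
# {'asthma': 25, 'rare disease': 'suppressed (<10)'}
```

A real scheme would also need rules for who counts as a trusted user and audits of the queries they run, which is the accreditation question Pilgrim wants the PC to answer.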

“I think there is certainly a need for capability building in the APS around the issues of de-identification and data use generally,” he added, “and I think that work is already going on.”

“It’s being commenced under the public data branch that’s running out of the Prime Minister’s department at the moment and there’s a secretaries’ group and a deputy secretaries’ committee that are looking at those issues in particular, and I attend those meetings as well.”

Pilgrim has also been working to update the guidance from his own office and says existing pools of public sector expertise like Data61 need to be drawn on to improve technical proficiency and decision-making.

The Archives’ angle

Over at the National Archives of Australia, David Fricker’s take on the PC’s first draft was similar to Pilgrim’s.


He told The Mandarin the draft report “had quite a few very good observations and recommendations … for us to carry that discussion forward” and the Archives contributed seven recommendations to that conversation ahead of the final report, in a submission that asks for more recognition as a “key institution” in the design of a future framework:

“The Archives reiterates its current role in the designation of Government data(sets), and questions the extent to which established expertise and experience has been considered in the proposed Framework.

“The Commission’s approach overlooks provisions such as Principles on Open Public Sector Information, Australian Government Public Data Policy and the Open Government Partnership – National Action Plan, as well as equivalent policies in the Australian States.”

The Archives seems concerned that the PC’s draft recommendations for a new Data Sharing and Release Act and a National Data Custodian could undermine its role in deciding on the status of all Commonwealth records, and the information management obligations of federal entities.

It suggests there is a risk of the custodian “duplicating efforts of the National Archives” and “recommends any new legislation should be carefully considered with a thorough examination of implications under the Archives Act” — noting a very recent commitment to “a simpler framework for information access laws, policies and procedures” in Australia’s Open Government National Action Plan.

Citing the Belcher Red Tape Review, the Archives hopes any eventual reform “does not increase the burden on Australian Government agencies by making them engage in separate processes for the identification, management and distribution of data assets” — pointing out that datasets given a new designation as “national interest data” would already be protected under the Archives Act.

The NAA also took issue with the PC report’s claim that government datasets need that new designation because public servants currently keep them “merely … for compliance, record-keeping or audit” purposes. The Archives says it already identifies certain important datasets and ensures they are “kept permanently as a national resource, preserved and available for public access”.

It argues that “strengthening the mandate for open material, while leaving the responsibilities to entities with the appropriate expertise” would be the most efficient way forward.

The Archives wants to make sure any new data legislation dovetails with its Digital Continuity 2020 policy, which sets milestones for federal agencies to identify all of their information assets and put plans in place to manage them. By the end of this year, the goal is for all agencies to designate a chief information governance officer, or to add equivalent responsibilities to another senior executive in smaller organisations.

“At the moment I think it’s quite common for organisations to understand the computer hardware [and] software that they have — and they can even put a dollar-value on those sorts of things and they’ll understand how they’re writing those assets off and when they’re being replaced; they’ll be on their balance sheet,” said Fricker.

“But we’ve got to take the same approach to the data because once that technology is gone and becomes obsolete, the data lives on. The data is actually the residual value of all of those investments, and that’s why we need to have that information governance.”

Fricker says agency heads right across the APS all agree that information governance is now a crucial part of the wider corporate governance picture, a view that is also supported by the PC’s recommendations.

In the corporate world, it is fairly easy to understand data as an increasingly valuable commodity. But how does that realisation translate into the public sector?

“Data is absolutely at the heart of the digital transformation of government, because once you lock on to the idea that it’s the data that has enduring value — once you lock on to the idea that technology these days is getting smaller and more temporary, and data is getting bigger and more permanent — then you structure your strategies around that sort of thinking and it makes you much more agile,” said Fricker.

“It means you can move your organisation from one platform to the next, you can enter into much better service delivery arrangements, you can pick up new technology very quickly, and you can engage with industry in much more cost-beneficial ways.

“Because you’re not tying yourself down to sluggish technology, what you’re doing is concentrating on creating good data assets which give you a platform to quickly move on to the next wave of technology and invent new and better public services.”

The Archives acknowledges that its role was established long ago in a “non-digital world” but reminds the PC that it too has been pursuing reform, “to ensure that the [Archives Act] meets the challenges of the digital age”, and it clearly doesn’t want to see that work derailed.
