
Computer Business Review

Parliamentary committee calls for “urgent action” amid algorithm partnering deals

The government has cautiously welcomed proposals by the House of Commons’ Science and Technology Committee to “realise more value” tied up in restricted public data sets, including those of the NHS, and hold them in so-called data trusts.

The Committee’s May 2018 report, “Algorithms in decision-making” [pdf], urged the government to “set out a procurement model for algorithms developed with private sector partners which fully realises the value for the public sector.”

It called for this process to be managed through data trusts.

“The Government could do more to realise some of the great value that is tied up in its databases, including in the NHS, and negotiate for the improved public service delivery it seeks from the arrangements and for transparency, and not simply accept what the developers offer in return for data access…” the report had said.

“The Government should explore how proposed ‘data trusts’ could be fully developed as a forum for striking such algorithm partnering deals. These are urgent requirements because partnership deals are already being struck without the benefit of comprehensive national guidance for this evolving field,” the Committee warned in the report earlier this year.

Its publication came amid concern over the NHS’s sharing of patient data with Google’s DeepMind, with an independent panel set up by the company warning in June that it should “not use its assets or position to seek to extract excessive profits in its dealings with the public sector”.

Public Data Sale: Ye-es… 

In a response published late this morning, the government gave a cautious welcome to the proposal, responding: “The Crown Commercial Service regularly reviews with its customers… where there might be a need for commercial agreements to facilitate procurement of common goods and services and opportunities to maximise public value from working with the private sector.”

It added: “CCS will explore the points raised by the Committee and engage with relevant organisations involved in technology and data science, including the Alan Turing Institute and others, as it develops its category strategies in this area. Data trusts may offer one mechanism to support the development of these arrangements.”

What is a Data Trust, Exactly?

As Open Data Institute (ODI) policy advisor Jack Hardinges notes, that term is a vague one: “The UK AI review defined a data trust as a repeatable framework of terms and mechanisms. It described that data trusts ‘are not a legal entity or institution, but rather a set of relationships underpinned by a repeatable framework, compliant with parties’ obligations’.”

He adds in a recent blog: “This form of data trust seems designed to tackle the challenge of data stewards and prospective data users having to negotiate and establish data sharing agreements on a case-by-case basis. For example, the law firm Fieldfisher suggests that a repeatable framework of terms and mechanisms could be used when an organisation requires access to a dataset to complete processing or analysis on behalf of a client who holds the data.”

Data Needs to be Better… 

“The Government recognises that there is work to be done in order to ensure the quality of published data is of the highest calibre – including that it is in a commonly accessible and machine readable format, and conforms to metadata standards – both of which reduce friction in access and use, including by the AI community,” the government added in the report.

Critics have warned that the “fragmented and outdated nature” of so much of the NHS technology estate (and of the broader public sector) makes deploying machine learning or analytics solutions highly challenging.

As Computer Business Review recently reported, the NHS’s new Counter Fraud Authority is seeking software to do precisely that, across a broad spectrum of data sources, from Oracle databases and Excel spreadsheets to SQL databases.
