DeepMind Health, the healthcare arm of Google’s artificial intelligence (AI) subsidiary DeepMind, has launched a blockchain-like auditing project as part of its work helping clinicians predict, diagnose and prevent illness.
DeepMind Health has been working with London’s Royal Free Hospital to develop kidney-monitoring software, and has faced criticism from patient organizations over data-sharing agreements that they say could give DeepMind too much power over the National Health Service (NHS), according to The Guardian.
Mustafa Suleyman, co-founder and head of applied AI at DeepMind, and Ben Laurie, head of security and transparency, described the “Verifiable Data Audit” project in a recent DeepMind blog.
Verifiable Data Audit
A well-built digital tool will log how it uses data and be able to prove those logs are accurate if challenged, the blog noted. The more secure the audit process, the easier it becomes to have confidence in how data is used.
Verifiable Data Audit is designed to give partner hospitals a real-time, verifiable mechanism to check how their data is processed.
DeepMind’s role is to provide secure data services under the hospitals’ instructions, with the hospitals remaining in full control; the audit verifies that this is what actually happens.
The ledger will have blockchain properties. It will be append-only, so once a record of data use is added, it can’t be erased. The ledger will also make it possible for third parties to verify that nobody has tampered with any entries.
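To make the append-only idea concrete, here is a minimal, hypothetical sketch in Python: each entry records a hash of the previous entry, so any later alteration can be detected by recomputing the hashes. The class and field names are illustrative only, and as the blog explains below, DeepMind’s actual design uses a tree rather than a simple chain.

```python
import hashlib
import json


def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous entry's hash."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


class AppendOnlyLedger:
    def __init__(self):
        self.entries = []          # (entry, hash) pairs; never removed or edited
        self.last_hash = "genesis"

    def append(self, entry: dict) -> str:
        h = entry_hash(entry, self.last_hash)
        self.entries.append((entry, h))
        self.last_hash = h
        return h

    def verify(self) -> bool:
        """A third party can recompute every hash to confirm nothing was tampered with."""
        prev = "genesis"
        for entry, recorded in self.entries:
            if entry_hash(entry, prev) != recorded:
                return False
            prev = recorded
        return True


ledger = AppendOnlyLedger()
ledger.append({"data": "kidney-function results", "purpose": "acute kidney injury alert"})
assert ledger.verify()  # stays True until any recorded entry is altered
```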
How It Differs From Blockchain
But the ledger will differ from blockchain in a few important ways, the blog noted. Blockchain is decentralized, so the verification of a ledger is decided by consensus among a set of participants. To prevent abuse, blockchains require participants to carry out complex calculations, which incur costs.
This is not necessary for a health service, which already has trusted institutions such as hospitals or national bodies that can verify the integrity of ledgers.
The ledger can also be made more efficient by replacing the chain part of the blockchain with a tree-like structure. Every time an entry is added to the ledger, a “cryptographic hash” is created. The hash process summarizes the latest entry as well as all of the previous values in the ledger. Hence, it is effectively impossible for someone to alter one of the entries, as doing so would change the hash value of that entry and of the whole tree.
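The tree idea can be sketched with a Merkle-style root hash: a single value that summarizes every entry in the ledger, so changing any one entry changes the root that auditors hold. The code below is an illustration of that general technique, not DeepMind’s actual implementation.

```python
import hashlib


def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def merkle_root(entries: list[bytes]) -> bytes:
    """Compute a tree hash summarizing all ledger entries."""
    if not entries:
        return sha256(b"")
    level = [sha256(e) for e in entries]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]


entries = [b"entry 1: blood test accessed", b"entry 2: results returned to clinician"]
root_before = merkle_root(entries)

entries[0] = b"entry 1: blood test accessed (edited)"   # any tampering...
assert merkle_root(entries) != root_before               # ...changes the root hash
```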
Verifiable Data Audit will include a dedicated interface that lets authorized staff at partner hospitals examine the audit trail of their data in real time. It will allow continuous verification that systems are working properly and enable partners to query the ledger. Partners will also be invited to run automated queries. In time, partners could be given the option of allowing others, such as individual patients or patient groups, to check how the data is processed.
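As a purely hypothetical example of the kind of automated query a partner hospital might run, the sketch below flags any logged data use whose recorded purpose is not on an agreed list. The entry fields and purposes are invented for illustration; the real interface has not been published.

```python
# Purposes the hospital has authorised (illustrative values only).
AGREED_PURPOSES = {"direct care", "acute kidney injury alert"}


def flag_unexpected_uses(ledger_entries: list[dict]) -> list[dict]:
    """Return audit-trail entries whose purpose the hospital has not authorised."""
    return [e for e in ledger_entries if e.get("purpose") not in AGREED_PURPOSES]


audit_trail = [
    {"timestamp": "2017-03-09T10:02:00Z", "purpose": "direct care"},
    {"timestamp": "2017-03-09T10:05:00Z", "purpose": "model research"},
]
print(flag_unexpected_uses(audit_trail))  # flags the "model research" entry
```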
DeepMind Seeks Public Trust
Suleyman told The Guardian that DeepMind has undertaken a number of projects to build trust, including its founding membership of the Partnership on AI and its creation of a board of independent reviewers for DeepMind Health.
Nicola Perrin, who oversees the Wellcome Trust’s “Understanding Patient Data” task force, supported the Verifiable Data Audit concept.
DeepMind is using technology to deliver an audit trail that tracks what happens to personal data, and in particular how it is used once it leaves the hospital or the NHS, in a way that should be more secure than previous approaches, Perrin said.
The approach could address DeepMind’s challenge of winning over the public, Perrin said. One criticism of DeepMind’s collaboration with the Royal Free was the difficulty of distinguishing between uses of data for direct care and for research. The suggested approach could address that challenge, and it indicates the company is trying to respond to those concerns.
Technological solutions will not be the only answer, Perrin said, but they will help provide trustworthy systems that give people more confidence about how data is used.