The nation’s second-largest health system, Ascension, has agreed to allow the software behemoth Google access to tens of millions of patient records. The partnership, called Project Nightingale, aims to improve how information is used for patient care. Specifically, Ascension and Google are trying to build tools, including artificial intelligence and machine learning, “to make health records more useful, more accessible and more searchable” for doctors.
Ascension did not announce the partnership: The Wall Street Journal first reported it.
Patients and doctors have raised privacy concerns about the plan, chiefly the lack of notice to doctors and the absence of patient consent.
As a public health lawyer, I study the legal and ethical basis for using data to promote public health. Information can be used to identify health threats, understand how diseases spread and decide how to spend resources. But it’s more complicated than that.
The law deals with what can be done with data; this piece focuses on ethics, which asks what should be done.
Beyond Hippocrates
Big-data projects like this one should always be ethically scrutinized. However, data ethics debates are often narrowly focused on consent issues.
In fact, ethical determinations require balancing different, and sometimes competing, ethical principles. Sometimes it might be ethical to collect and use highly sensitive information without getting an individual’s consent.
Public health ethics is a useful framework for evaluating activities that affect population health. A recent report by the World Health Organization (WHO) describes public health ethics in terms of four principles:
- Common Good — Does the activity promote collective benefit?
- Equity — Does the activity reduce unfair burdens or risks to health or opportunity?
- Respect for Persons — Does the activity support individual rights and interests?
- Good Governance — Does the activity have processes for public transparency and accountability?
Public health ethics is an appropriate framework for evaluating Project Nightingale, given its massive scale. But the project also needs to be understood against the current struggles of the health care system.
The system and its struggles
For over a decade, scholars have argued that technological solutions are needed to address three major challenges to how the health system uses information.
First, the health system struggles to integrate new knowledge into patient care. On average, new medical evidence takes 17 years to change clinical practice. The breakneck pace of science makes it hard for doctors to keep up, and applying modern medical knowledge requires them to weigh more factors than is humanly possible.
Second, information is central to preventing many medical errors, the third leading cause of death in America. Communication problems, judgment errors and incorrect diagnosis or treatment decisions can have devastating consequences for patients.
Third, the system does not learn from care. For example, a doctor and patient might try several different medications before finding the right one. One medication might not help, another might cause awful side effects, and finding the best medication might take months or years. The health system does not learn from that care process. Individual providers will gain knowledge over a lifetime, but that knowledge is never aggregated or shared efficiently.
To help address these challenges, the Institute of Medicine in 2007 introduced a vision for a learning health system that would quickly learn from patient care and use that knowledge to improve future care.
The concept is simple, but learning health systems require sophisticated information technology platforms capable of extracting knowledge from the existing evidence and millions of treatment records.
The benefits of Project Nightingale
Project Nightingale appears to align with the learning health system concept.
Systematically improving health care is a clear common good.
Although a learning health system requires sharing patient data, patients stand to benefit from improved health care. Reciprocal data sharing by patients for a collective benefit is a prototypical example of the “common good” principle in public health ethics.
Project Nightingale might also improve health equity. For example, minorities and pregnant women are underrepresented in research studies, raising concerns that some medical knowledge might not be well tailored to these patients. A learning health system would improve understanding of what treatments are effective and safe for these underrepresented populations.
For small-scale activities, respect for persons usually demands giving people an opportunity to make a free and informed decision to participate. However, for activities carried out at the scale of the whole population, it is possible to show respect for persons by engaging the public and inviting them into the decision-making process. It is not clear whether Ascension or Google involved the public or patients in Project Nightingale.
The downsides
Some patients have criticized Project Nightingale because it does not have an “opt-out” for patients who do not want their information shared.
However, opt-out systems raise ethical concerns, too. First, they permit free riders who benefit from the knowledge gained from participants without contributing their own data. Second, the knowledge produced by a learning health system could be biased if enough people opt out. If so, opting out could expose others to riskier health care.
Good governance is critical to support a “common good” activity that conflicts with some individual interests. Transparency and accountability are crucial to keep the parties honest and open to public scrutiny. They also empower people to demand government action against an activity that cannot be ethically justified. There is little, if any, reported evidence that Project Nightingale has sufficient transparency or accountability processes. This is likely to be the project’s biggest ethical challenge.
Issues of consent
Some of the biggest concerns have been about consent. However, public health ethics does not always require consent. One recent WHO ethical guideline says:
“Individuals have an obligation to contribute … when reliable, valid, complete data sets are required and relevant protection is in place. Under these circumstances, informed consent is not ethically required.”
The basic argument is that individuals have a moral obligation to contribute when there is low individual risk and high population benefit.
Currently, the public does not know enough about Project Nightingale to make definitive ethical judgments. However, public health ethics likely provides some support for what Google and Ascension are trying to do. The more critical ethical issue might turn on how Google and Ascension are doing it.
Cason Schmit, Assistant Professor of Public Health, Texas A&M University
This article is republished from The Conversation under a Creative Commons license.