By Rick Wilminko, consultant, netlogx.
The news that Ascension and Google are working together on a system using machine learning, called Project Nightingale, at first seems like a step forward toward better patient care. Because it’s challenging for any healthcare provider to exchange information about patients and patient care, it only makes sense that the healthcare industry would look for technology solutions that could overcome some of these obstacles.
The design of a new software system that could suggest changes in care and make medical records easily accessible to any doctor treating a single patient would help alleviate many of the challenges our healthcare system faces today.
However, it’s important that these two entities move through the process with great care and consideration. Google is no stranger to controversy regarding data privacy, machine learning and ethics. In April, the tech leader disbanded its AI ethics board after a public outcry over board members and the potential misuse of Google’s AI systems. Further, Google has been accused of inappropriately using data to personalize online marketing and advertising. While it is true that Project Nightingale doesn’t violate the Health Insurance Portability and Accountability Act (HIPAA), what is most concerning is the potential misuse, release or breach of the data without patient permission.
As a senior consultant for an IT security and risk management firm, I have spent years guiding government health agencies toward common-sense approaches to data management using technology. To ensure the greatest transparency for Project Nightingale, it is crucial to give patients the option to opt in or out of the program, chiefly because of the risk that the data could be breached or misused.
First, Google and Ascension should be tasked with clearly outlining how this project will progress in the future and who the intended users are. Is it only for healthcare providers? If Google has access to patient data, will that data be used for marketing in the future? While Google may say it will not share or sell user data, we don’t know how Ascension plans to use this data set in the future.
Using the information as part of a medical study that could help improve patient care is much different from using patient data to better market or advertise healthcare solutions or pharmaceutical products. Without this information being publicly available, how are patients and the public supposed to know that safeguards were set up at all? This is one reason so many in the technology industry, including myself, are hesitant to support this project.
Second, any technology is at risk of a data breach, no matter what kind of security is in place. Any device can be hacked, whether by a nefarious organization or a person living in their mother’s basement. Therefore, it is vital to stay a step ahead by anticipating vulnerabilities and risks.
The data we’re discussing comprises personal, intimate details, and patients should have the choice to accept this risk and make informed decisions about where and how their personal health details are used, even if there is no identifying information. This is a key reason the European Union (EU) passed the General Data Protection Regulation, or GDPR: to give consumers ownership of their personal data online.
To get Project Nightingale back on track, Ascension should first increase transparency for patients by offering an opt-out and partnering with patients who choose to participate in order to build on the project’s success. Second, the healthcare organization should properly outline what patient data will be used, how and when, to provide an extra layer of protection in case the information shared between Ascension and Google is breached.
Finally, Ascension should clearly spell out to the public the scope of its partnership with Google in an effort to protect patient data from being marketed or sold for advertising purposes. At the end of the day, these steps are the only way forward for a project that has the potential to transform, disrupt and improve healthcare — or destroy it entirely.