The Limits (and Realities) of Automation in Healthcare

Guest post by Edgar T. Wilson, writer, consultant and analyst.

Is there an unspoken fear among caregivers that the subtext of all this digital disruption is a devaluation of the human element?

In countless industries, workers and analysts alike watch the slow march of technology and innovation and see as inevitable the takeover of human tasks by robots, AI, or other smart systems. We watched as the threat of outsourcing transformed into a reality of automation in industrial sectors, saw drones take on countless new roles in the military and in commerce, and now we hear about how driverless cars, self-checkout kiosks, and even robotic cashiers in restaurants are all waiting in the wings to dive in and displace even more formerly human occupations.

And looking at how EHRs (by virtue of their cumbersome workflows alone, not to mention the documentation burden and the growing emphasis on analytics and records-sharing) are taking flak for burnout and frustration in hospitals across the country, it hardly seems a reach to suggest that America’s caregivers feel not just burdened by technology, but threatened by it.

Digital records are already changing what doctors and nurses do, how they work, and what is expected of them — it must surely be only a matter of time before their roles start getting handed over to the robots and supercomputers … right?

Wrong.

Change, Not Replacement

While some jobs or roles may face elimination through automation, the more common effect is transformation. In healthcare, that may mean many keep the same title (perhaps even the same education and certification standards that go with it) while their actual functions and role in context change.

We see this already with respect to EHRs. The early, primitive documentation workflows and reporting obligations have drawn ire from clinicians who see their autonomy under attack by digital bureaucracy. But this is naturally destined for correction; medicine has advanced through trial and error for centuries, and the 21st century is no different.

The transition and disruption don’t manifest exclusively as growing pains. Consider the role of medical laboratory scientists and technologists: the Obama administration is pushing for a cure for cancer built on advances in personalized medicine; patient-centered care is becoming a priority among caregivers as well as a quality metric in health centers across the country; and genomic testing is seeing growing consumer demand along with applications in a widening array of clinical settings.

All of these trends point to the medical lab as a newly central component of the modern care center, treatment plan, and information hub. The demands all these new technologies and applications put on laboratory professionals require them to do more learning, adapting, and leading than ever before, especially to integrate the latest and greatest devices and tests available.

Simply put, machines are still fallible. They need human assistance to provide critical context, to supplement their ability to read, diagnose, and self-regulate, and to ensure accuracy, consistency, and proper application in the clinical setting.

More Technology Demands More Humanity

“The critical role of the technologist is to determine if the result or identification is rational and appropriate for this patient, disease process or body system,” explain Joel Mortenson and Beth Warning.

Both are professors with the University of Cincinnati’s online medical laboratory science program, and as instructor-practitioners they see how the pace of change in the field is not always matched by commensurate changes in how the country trains, evaluates, and certifies medical professionals.

“Upon review of the ‘Body of Knowledge’ task list from ASCLS for Microbiology, less than 10 percent of the tasks listed are assigned to the section on Molecular methods compared to the section concerning methods for identification for other organisms,” they write. “Few, if any, of these tests are used in routine clinical laboratories today…With very limited exceptions, virtually no classic virology methods are performed in the majority of clinical laboratories. There are definitely exceptions, and very important ones in research and state-level public health laboratories, but this type of training is provided in that setting if it is needed at all.” (emphasis mine).

There are two important takeaways from their comments. First, education, even clinical education, too often lags behind actual practice, in part because certification exams lag behind clinical practice even as they serve as its gatekeepers. Professors can innovate in the classroom and curriculum (as Mortenson and Warning advocate), but that does not change the system students must navigate.

Second, they see the clinical workplace as the best place for remediation and new skill development. That is hardly surprising, given the continuing education requirements inherent in medicine, yet this truism is conspicuously absent from discussions of how emergent technology changes the work of caregiving. Learning new best practices is one thing, apparently, but embracing new IT is grounds for rage, retirement, and general burnout.

But for every angry blog written by an MD opening with “There was a time,” we have a dozen optimistic articles fantasizing about “For the first time,” and exploring new possibilities in medicine, care, survival, and quality of life itself.

The average patient is no more interested in getting robo-handled at his annual exam than the average doctor is thrilled to turn over his rounds to an algorithm. This is a delicate transition for everyone, and it is important that we don’t mistake the technical changes for a disenfranchisement of medical professionals.

Part of that entails a new approach to implementing change: blunt force, in the form of financial penalties and incentives, has been the go-to mechanism for everything from Meaningful Use to value-based care. Yes, doctors have to make a living, so they may eventually come around, but that is hardly wholehearted buy-in.

We All Need an Upgrade

The sheer complexity of modern medicine is even changing the role of the patients themselves. From the bad journalism heralding false breakthroughs, to the constant barrage of pharmaceutical advertising, to the gauntlet of mental and financial abuse that is the insurance and medical payment system, getting well is not as simple as visiting the family doctor any more.

An important thing to keep in mind is that even as we astound ourselves with what we can get our robots and machines to do, the things that come most naturally to us as humans are the hardest for computers to replicate, and virtually impossible for us to program into even our “smartest” machines.

Put differently, it turns out that much of the time, there really is no substitute for the human element. From the ability to correctly assign pronouns in an ambiguous sentence, to picking up on sarcasm or other nuances of communication, computers lag behind even young children, and lack the instinct to make up for their knowledge/programming gaps.

The bottom line is that we all need medical care, and medicine needs humanity. Getting technology to work for us means getting comfortable working with our technology. Schools, educators, administrators, practitioners, and patients all have to face a reality of change, limited automation, and large-scale digital integration.

