Guest post by Edgar T. Wilson, writer, consultant and analyst.
Is there an unspoken fear among caregivers that the subtext of all this digital disruption is a devaluation of the human element?
In countless industries, workers and analysts alike watch the slow march of technology and innovation and see the takeover of human tasks by robots, AI, or other smart systems as inevitable. We watched as the threat of outsourcing transformed into a reality of automation in industrial sectors, saw drones take on countless new roles in the military and in commerce, and now we hear about how driverless cars, self-checkout kiosks, and even robotic cashiers in restaurants are waiting in the wings to step in and displace still more formerly human occupations.
And looking at how EHRs (by virtue of their cumbersome workflows alone, not to mention all the documentation and the growing emphasis on analytics and records-sharing) are taking flak for burnout and frustration in hospitals across the country, it hardly seems a reach to suggest that America's caregivers are feeling not just burdened by technology, but threatened by it.
Digital records are already changing what doctors and nurses do, how they work, and what is expected of them. Surely it is only a matter of time before their roles start getting handed over to the robots and supercomputers ... right?
Change, Not Replacement
While some jobs or roles may face elimination through automation, the more common effect is transformation. In healthcare, that may mean that many people will keep the same title (perhaps even the same education and certification standards that go along with it), but their actual functions and roles in context will be different.
We see this already with respect to EHRs. The early, primitive documentation workflows and reporting obligations have drawn ire from clinicians who see their autonomy under attack by digital bureaucracy. But this is naturally destined for correction; medicine has advanced through trial and error for centuries, and the 21st century is no different.
The transition and disruption don't manifest exclusively as growing pains. Consider the role of medical laboratory scientists and technologists: the Obama administration is pushing for a cure for cancer built on advances in personalized medicine; patient-centered care is becoming a priority among caregivers as well as a quality metric in health centers across the country; and genomic testing is seeing growing demand on the consumer side, along with applications in a widening array of clinical settings.
All of these trends point to the medical lab as a newly central component of the modern care center, treatment plan, and information hub. The demands all these new technologies and applications put on laboratory professionals require them to do more learning, adapting, and leading than ever before, especially to integrate the latest and greatest devices and tests available.
Simply put, machines are still fallible. They require human assistance to provide critical context, to supplement their ability to accurately read, diagnose, and self-regulate, and to ensure accuracy, consistency, and proper application in the clinical setting.