Doctors Need More Brains

Guest post by Edgar T. Wilson, writer, consultant and analyst.


It isn’t that doctors aren’t skilled, intelligent or capable enough—it is that the demands being placed on them are too great.

Time and documentation demands mean that something has to give. As many physicians have pointed out over the years of the HITECH Act’s implementation, the thing that normally “gives” is face time with patients: actual, hands-on delivery of care and attention. Instead, they are driven to input data for documentation, follow prompts on EHR interfaces, ensure their record-keeping practices will facilitate correct coding for billing, and tip-toe around HIPAA and the explosion of security and privacy vulnerabilities opened up by the shift to digital.

The reality of modern medicine—and especially the rate at which it evolves, grows, and becomes outdated—means that doctors need what most every other industry has already integrated: more brains. Not simply in the form of EHRs for record-sharing, or voice-to-text applications as a substitute for transcriptionists, but as memory-supplements, or second brains.

As a species, humans are also evolving away from memory as a critical element of intelligence, because we now have devices—“smart” devices—always on, always on us, and always connected to the ultimate resources of facts and data.

Our smart devices—phones, tablets, etc.—are gateways to the whole of human knowledge: indexes of information, directories of images, libraries, question-and-answer exchanges. In effect, we are increasingly able and willing to offload “thinking” onto these devices.

Supplement or Supplant?

Depending on the context and application, this trend is both helpful and potentially harmful. For those prone to critical thinking and equipped with analytical skills, offloading some elements of memory to these devices is a question of efficiency. Even better, the more they practice using them, the more effective they become at integrating devices into their cognitive tasks. For others (those less prone to think critically), it is a shortcut that reduces cognitive function altogether: rather than a cognitive extension, the devices act as substitutes for thinking. Similarly, increasing over-reliance on the internet and search engines further diminishes already deficient analytical skills.

The standard roadmap for a medical education entails a lot of memorization—of anatomy, of diseases, of incredible volumes of data to facilitate better clinical performance. It isn’t memorization simply for the sake of recitation, though; it is the foundation for critical thinking in a clinical context. As such, medical professionals ought to be leading candidates for integrating smart devices not as crutches, but as amplifiers of cognition.

So far, that has been far from the dominant trend.

Enter the Machine

Integrating computers as tools is one thing, and even that has proven an uphill battle for physicians: the time and learning curve involved in adopting EHRs alone have been a recurring complaint across the stages of Meaningful Use implementation.

Patient engagement—another of the myriad buzzwords proliferating in the healthcare industry lately—is another challenge. Some patients are bigger critics of the new, digitally driven workflows than the most Luddite physicians. On the other hand, some patients are at the bleeding edge of digital integration, and find both care providers and the technology itself moving too slowly.

Physicians, caught between extremes, are held responsible not only for health outcomes (value-based care initiatives overwhelmingly put the burden of proof on clinicians, all but ignoring patient compliance and lifestyle as contributing factors), but also for getting patients engaged digitally. So not only are doctors still doctors, they are also ambassadors for preventative medicine and online patient portals—as well as pillars for the whole EHR industry.

To make technology a help, rather than a hindrance, medical education and practice alike must align with technology as a cognitive extension, a memory device that aids critical thinking and knowledge application. The medical school paradigm has to be updated to fit the needs of the 21st century—and that means making smart devices a standard component of learning and practicing from the very beginning.

Two heads are only better than one if they can work together, rather than getting in each other’s way. It may not necessarily be too late for those with years of analog experience in healthcare to catch up, but it is certainly overdue for schools anchored to analog pedagogy to go digital, and better prepare future generations of clinicians.

