Blurring the Lines of the Scope of Practice

Guest post by Edgar T. Wilson, writer, consultant and analyst.


When we talk about technology disrupting healthcare, we aren’t just referring to changes in the accuracy of health records or the convenience of mobile care; the real disruption comes in the form of fundamental challenges to traditional scopes of practice.

What Should We Do?

Scope of practice, broadly, is determined by a combination of liability and capability. Lead physicians carry greater liability than the bedside nurses assisting in patient care, because the care plan is directed by the lead physician. Likewise, the extra years of education and practice are assumed to increase the capacity of physicians to lead their care teams, make decisions about how the team will go about its work, and parse all of the information provided by the patient, nurses and other specialists involved with each case.

In every other industry, productivity increases come from technology enhancing the ability of individuals and teams to perform work. Email saves time and money by improving communication; industrial robotics standardize manufacturing and raise the scale and quality of output. Every device, app and system allows individuals to scale their contribution, to do more and add more value. Word processing and voice-to-text enable executives to do work that might otherwise have been performed by a secretary or typist. Travel websites allow consumers to find cheap tickets and travel packages that would previously have required a travel agent to acquire.

In healthcare, technology is changing the capacity of the individual caregiver, expanding what can be done, and often how well it can be done. These improvements, along with a growing need for healthcare professionals and services, are challenging traditional notions of scope of practice, for better and worse.

New Beginnings

Some of the changes to scope of practice are positive, necessary, and constructive. For example, technological literacy is necessary at every point in the care continuum, because interoperable EHRs and the vulnerability of digital information mean that everyone must contribute to cybersecurity. In a sense, caregivers at every level must expand their scope of practice to incorporate an awareness of privacy, security, and data management considerations.

By extension, all caregivers are participating as never before in the advancement of clinical research, population health monitoring, and patient empowerment simply by working more closely with digital data and computers. As EHR technology iterates its way toward fulfilling its potential, caregivers and administrators are being forced to have difficult conversations about priorities, values, goals and the nature of the relationship between patient, provider, system, and technology. It is overdue, and foundational to the future of healthcare.

Is There A Nurse in the House?

The trend in healthcare toward prevention and balancing patient-centered care with awareness of population health issues puts primary care in a place of greater importance than ever. This, in turn, is driving a shift in the education of nurses to promote more training, higher levels of certification, and greater specialization to justify relying on nurses to fulfill more primary care roles. They are becoming better generalists and specialists, capable of bolstering teams as well as leading them.


Doctors Need More Brains

Guest post by Edgar T. Wilson, writer, consultant and analyst.


It isn’t that doctors aren’t skilled, intelligent or capable enough—it is that the demands being placed on them are too great.

Time and documentation demands mean that something has to give. As many physicians have pointed out over the years of the HITECH Act's implementation, the thing that normally "gives" is facetime with patients: actual, hands-on delivery of care and attention. Instead, physicians are driven to input data for documentation, follow prompts on EHR interfaces, ensure their record-keeping practices facilitate correct coding for billing, and tiptoe around HIPAA and the explosion of security and privacy vulnerabilities opened up by the shift to digital.

The reality of modern medicine, and especially the rate at which it evolves, grows, and becomes outdated, means that doctors need what nearly every other industry has already integrated: more brains. Not simply in the form of EHRs for record-sharing, or voice-to-text applications as a substitute for transcriptionists, but in the form of memory supplements, or second brains.

As a species, humans are also evolving away from memory as a critical element of intelligence, because we now have devices—“smart” devices—always on, always on us, and always connected to the ultimate resources of facts and data.

Our smart devices (phones, tablets, and the like) are gateways to the whole of human knowledge: indexes of information, directories of images, libraries, and question-and-answer exchanges. In effect, we are increasingly able and willing to offload "thinking" onto these devices.

Supplement or Supplant?

Depending on the context and application, this trend is both helpful and potentially harmful. For those prone to critical thinking and equipped with analytical skills, offloading some elements of memory to these devices is a question of efficiency. Even better, the more they practice using them, the more effective they become at integrating devices into their cognitive tasks. For others, those less prone to think critically, it is a shortcut that reduces cognitive function altogether: rather than acting as a cognitive extension, the devices act as substitutes for thinking. Similarly, increasing over-reliance on the internet and search engines further diminishes already deficient analytical skills.

The standard roadmap for a medical education entails a lot of memorization—of anatomy, of diseases, of incredible volumes of data to facilitate better clinical performance. It isn’t memorization simply for the sake of recitation, though; it is the foundation for critical thinking in a clinical context. As such, medical professionals ought to be leading candidates for integrating smart devices not as crutches, but as amplifiers of cognition.

So far, that has been far from the dominant trend.

Enter the Machine

Integrating computers as tools is one thing, and even that has proven an uphill battle for physicians: the time and learning curve involved in adopting EHRs alone has been a recurring complaint across the stages of Meaningful Use implementation.

Patient engagement, another of the myriad buzzwords proliferating across the healthcare industry lately, is another challenge. Some patients are bigger critics of the new, digitally driven workflows than the most Luddite physicians. On the other hand, some patients are at the bleeding edge of digital integration, and find both care providers and the technology itself moving too slowly.
