The Health Threat of Tech

Guest post by Edgar Wilson.

Edgar Wilson

We put a lot of faith in health technology: to make us better, to save our systems, to revolutionize healthcare. We may be looking at it from the wrong side entirely.

The social determinants of health matter more than our ability to deploy doctors or provide insurance; whether physical or mental, health is always more social than clinical.

But most of the health tech billed as revolutionary is aimed at clinical factors, rather than the social determinants of health. Yes, telehealth can extend reach, but it is still just a matter of touchpoints, not a fundamental change to the lifestyles and cultures that determine health.

The same goes for our EHR systems, which create more ways to record information and quantify patients, and put more emphasis on engagement and quality-based reimbursement. Even genomics and personalized medicine are taking a backseat to soliciting reviews and trying to turn the patient experience into a number. It all puts greater focus on the clinical encounter, on how patients “feel” broadly about each minute aspect of their time in the medical facility.

A Digital Disease

As politicians trade blows on minimum wages and the ACA, the likelihood grows that insurance benefits and livable incomes (and lifestyles) will get pushed further out of reach for more people.

Modern work is tech-centric, which means lots of sitting and easy snacking with little physical exertion, a double whammy that prevents employment or higher incomes from leading to healthier choices. For less-skilled workers, normally accessible jobs are in the sights of automation and disruption. While tech is taking over medicine and opening up new possibilities, it is also transforming the labor market and closing countless doors to workers.

By extension, technology is changing the social framework that determines public health. Income inequality is growing, wage growth is stagnant, and no amount of awareness can change these front-of-mind concerns for people who may well want to eat better and exercise more, or even commit to seeing the doctor more often and following his or her advice to the letter.

Poor people can’t necessarily eat better as a simple matter of choice or doctor’s orders. Planning meals and purchasing healthful foods taxes limited resources–time as well as money. Working three jobs to pay the bills, many lower-income individuals also don’t have time to exercise. And more likely than not, even those working high-paying jobs are sitting all day, sapping their bodies of energy and resilience, undoing the good of their intentions and smart devices alike through attrition.

The Destruction of Primary Care

Guest post by Edgar T. Wilson.

Edgar Wilson

Economists, especially today, like to talk about creative destruction.

Schumpeter considered it the “essential fact about capitalism,” that things have to fall apart so better things can take their place. The familiar is violently displaced by the unfamiliar, but superior, alternative.

Buggy whip makers are sent out of business as car makers take over the transportation space. Typists go extinct as word processing becomes cheap and ubiquitous. Blockbuster goes bankrupt, so Netflix and all its streaming peers can take over the space. The notion that the New can mean bad news for the Old is nothing unique to our modern era, though perhaps the speed and distribution of change thanks to globalization and digital technology means we see this more and more.

Well, 2017 may well be the beginning of the end for primary care as we once knew it.

The “Who’s on First” of Healthcare

As with any other example of creative destruction, the signs in primary care have been there for anyone to read, though perhaps the conclusion they point to hasn’t been quite as clear as the contributing forces.

Nursing, as a profession, has been on a long arc over the last century or so, transforming patient care as well as clinical organization and even leadership. Nurses have evolved from subordinates of doctors to, in some cases, their replacements–notably in primary care clinics, especially in critical access hospitals and areas where patients might not otherwise see a doctor outside of an emergency room.

Primary care provider shortages aren’t strictly limited to rural or remote areas. Thanks to demographic trends, more people are living longer and managing more chronic conditions. Keeping this swell of aging patients from charging into Emergency Departments en masse was part of the logic behind elements of the Affordable Care Act shifting resources to clinics run by NPs as opposed to MDs. While nurses face a shortage of their own, they have still been tagged as a key element of preserving and expanding access to primary care. In 2007, the shift in nursing toward a more central leadership role was codified by the American Association of Colleges of Nursing with its designation of the Clinical Nurse Leader as a new official role for nursing professionals.

Simply put, consistent access to primary care supports prevention strategies, which are altogether cheaper and more effective than sending everyone through an ED or into a long-term care clinic. While many–notably the American Academy of Family Physicians and the American Medical Association–push back against this disruption of scope of practice, the change is one of necessity. Nurses today provide critical care, and lead diverse clinical and professional teams to coordinate whole-person health.

With or without the Affordable Care Act, the shortage in primary care will persist. Expanded access through insurance only exacerbated the underlying issue. As Millennials enter middle age and Boomers carry on retiring and living longer than ever, primary care will be stretched thin. Whatever comes out of the Trump administration or the ongoing scope-of-practice debates, primary care requires providers, and nurses are showing up to work.

Healthcare’s New Mobile Age

Guest post by Edgar T. Wilson, writer, consultant and analyst.

Edgar Wilson

Mobile technology is reshaping every element of American healthcare, from insurance and billing to documentation and caregiving. The truly transformative element of the mobile revolution is not the technology itself, or the way it changes the look and feel of the tasks it affects. Despite complaints about the depersonalizing effect of technology, the ultimate value of mobile in the sector will be how it enhances and encourages communication.

Providers are Going Mobile

Flexibility and functionality have already drawn providers to mobile devices and solutions. Voice-to-text technology and similar automated solutions are in the offing to relieve the documentation burden that has dampened enthusiasm for digitization. Bolstered by these advancements, caregivers will go from subjects of their EHRs to masters of patient encounters.

One of the huge benefits of mobility–as opposed to simply being networked on desktop computers or having a digital health records solution–is the capacity for native customization and app development. Native apps are the currency of the mobile, smart-device world providers are entering. Developers can deliver personal, branded interfaces that let doctors choose precisely how they want their dashboards to look, giving their EHRs a custom touch that has been sorely lacking throughout their implementation.

App-centric development will further reduce the friction of adoption and utilization, giving doctors a sense of empowerment and investment, rather than the bland inertia that has carried digitization thus far.

The personalization of the technology through app development will help boost adoption, and return the focus to what the technology enables, rather than how it looks or what it has replaced. Mobile technology’s strength will be in reconnecting doctors and patients, and creating bridges of data and communication across the continuum of care.

What is Working in Healthcare?

Guest post by Edgar T. Wilson, writer, consultant and analyst.

Edgar Wilson

In virtually every context in which that question might be asked, we struggle to give an honest, accurate answer.

It Works If You Believe It Works

Is the medication working? Difficult to say–it may be the placebo effect, it may be counteracted by other medications, or we may be monitoring the wrong indicators to recognize any effect. Is “working” the same as “having an effect,” or must it be the desired effect?

Alternative medicine confounds the balance of expectations and outcomes even further. Right at the intersection of evidence-based medicine and naturopathy, for instance, we have hyperbaric oxygen therapy, or HBOT. These devices are as much in vogue among emergency departments (to treat embolisms, diabetic foot ulcers, and burns) as among holistic dream salesmen (to prevent aging and cure autism, if you believe the hype). When the metric being tracked is as fluid as the visible effects of aging, answering whether the treatment is working is about as subjective as it gets.

As though the science of pharmaceuticals and clinical medicine weren’t confounding enough, you can hardly go anywhere in healthcare today without politics getting added to the mix. In the wake of Trump’s victory in the 2016 presidential election, you have observers and stakeholders asking of the Affordable Care Act (ACA): is it working?

There’s Something Happening Here

It is definitely doing something. It is measurably active in our tax policy, for instance: 2016 returns are heavily influenced by the incremental growth of the ACA’s financial provisions. Of course, the point of this tax policy (depending on who you ask) is to influence behavior. And on this point, there are some signs that, again, something is happening: among young people, ER visits in general are down, while emergency stays due to mental illness are up. We changed how healthcare is insured, and that changed, in turn, how we access our care. But is it working?

Blurring the Lines of the Scope of Practice

Guest post by Edgar T. Wilson, writer, consultant and analyst.

Edgar T. Wilson

When we talk about technology disrupting healthcare, we aren’t just referring to changes in the accuracy of health records or the convenience of mobile care; the real disruption comes in the form of fundamental challenges to traditional scopes of practice.

What Should We Do?

Scope of practice, broadly, is determined by a combination of liability and capability. Lead physicians carry greater liability than the bedside nurses assisting in patient care, because the care plan is directed by the lead physician. Likewise, the extra years of education and practice are assumed to increase the capacity of physicians to lead their care teams, make decisions about how the team will go about its work, and parse all of the information provided by the patient, nurses and other specialists involved with each case.

In every other industry, productivity increases come from technology enhancing the ability of individuals and teams to perform work. Email saves time and money by improving communication; industrial robotics standardize manufacturing and raise the scale and quality of output. Every device, app and system allows individuals to scale their contribution, to do more and add more value. Word processing and voice-to-text enable executives to do work that might otherwise have been performed by a secretary or typist. Travel websites allow consumers to find cheap tickets and travel packages that would previously have required a travel agent to acquire.

In healthcare, technology is changing the capacity of the individual caregiver, expanding what can be done, and often how well it can be done. These improvements, along with a growing need for healthcare professionals and services, are challenging traditional notions of scope of practice–for good and bad.

New Beginnings

Some of the changes to scope of practice are positive, necessary, and constructive. For example, technological literacy is necessary at every point in the care continuum, because interoperable EHRs and the vulnerability of digital information mean that everyone must contribute to cybersecurity. In a sense, caregivers at every level must expand their scope of practice to incorporate an awareness of privacy, security, and data management considerations.

By extension, all caregivers are participating as never before in the advancement of clinical research, population health monitoring, and patient empowerment simply by working more closely with digital data and computers. As EHR technology iterates its way toward fulfilling its potential, caregivers and administrators are being forced to have difficult conversations about priorities, values, goals and the nature of the relationship between patient, provider, system, and technology. It is overdue, and foundational to the future of healthcare.

Is There A Nurse in the House?

The trend in healthcare toward prevention and balancing patient-centered care with awareness of population health issues puts primary care in a place of greater importance than ever. This, in turn, is driving a shift in the education of nurses to promote more training, higher levels of certification, and greater specialization to justify relying on nurses to fulfill more primary care roles. They are becoming better generalists and specialists, capable of bolstering teams as well as leading them.

Doctors Need More Brains

Guest post by Edgar T. Wilson, writer, consultant and analyst.

Edgar T. Wilson

It isn’t that doctors aren’t skilled, intelligent or capable enough—it is that the demands being placed on them are too great.

Time and documentation demands mean that something has to give. As many physicians have pointed out over the years of the HITECH Act’s implementation, the thing that normally “gives” is facetime with patients: actual, hands-on delivery of care and attention. Instead, they are driven to input data for documentation, follow prompts on EHR interfaces, ensure their record-keeping practices will facilitate correct coding for billing, and tiptoe around HIPAA and the explosion of security and privacy vulnerabilities opened up by the shift to digital.

The reality of modern medicine—and especially the rate at which it evolves, grows, and becomes outdated—means that doctors need what most every other industry has already integrated: more brains. Not simply in the form of EHRs for record-sharing, or voice-to-text applications as a substitute for transcriptionists, but as memory-supplements, or second brains.

As a species, humans are also evolving away from memory as a critical element of intelligence, because we now have devices—“smart” devices—always on, always on us, and always connected to the ultimate resources of facts and data.

Our smart devices—phones, tablets, etc.—are gateways to the whole of human knowledge: indexes of information, directories of images, libraries of question-and-answer exchanges. In effect, we are increasingly able and willing to offload “thinking” onto these devices.

Supplement or Supplant?

Depending on the context and application, this trend is both helpful and potentially harmful. For those equipped with analytical skills and prone to critical thinking, offloading some elements of memory to these devices is a matter of efficiency; the more they practice, the more effectively they integrate the devices into their cognitive tasks. For those less prone to think critically, the devices act not as cognitive extensions but as substitutes for thinking, a shortcut that reduces cognitive function altogether. Similarly, growing over-reliance on the internet and search engines further diminishes already deficient analytical skills.

The standard roadmap for a medical education entails a lot of memorization—of anatomy, of diseases, of incredible volumes of data to facilitate better clinical performance. It isn’t memorization simply for the sake of recitation, though; it is the foundation for critical thinking in a clinical context. As such, medical professionals ought to be leading candidates for integrating smart devices not as crutches, but as amplifiers of cognition.

So far, that has been far from the dominant trend.

Enter the Machine

Integrating computers as tools is one thing, and even that has proven an uphill battle for physicians: the time and learning curve involved in adopting EHRs alone has been a recurring complaint across the stages of Meaningful Use implementation.

Patient engagement—another of the myriad buzzwords proliferating in the healthcare industry—is another challenge. Some patients are bigger critics of the new, digitally driven workflows than the most Luddite physicians. On the other hand, some patients are at the bleeding edge of digital integration, and find both care providers and the technology itself moving too slowly.
