Guest post by Edgar Wilson.
We put a lot of faith in health technology: to make us better, to save our systems, to revolutionize healthcare. We may be looking at it from the wrong side entirely.
The social determinants of health matter more than our ability to deploy doctors or provide insurance; physical or mental, health is always more social than clinical.
But most of our supposedly revolutionary health tech is aimed at clinical factors rather than the social determinants of health. Yes, telehealth can increase reach, but it is still just a matter of touchpoints, not a fundamental change to the lifestyles and cultures that determine health.
The same goes for our EHR systems, which create more ways to record information and quantify patients, putting more emphasis on engagement and quality-based reimbursement. Even genomics and personalized medicine are taking a backseat to soliciting reviews and trying to turn the patient experience into a number. It all puts greater focus on the clinical encounter, on how patients “feel” broadly about each minute aspect of their time in the medical facility.
A Digital Disease
As politicians trade blows on minimum wages and the ACA, the likelihood grows that insurance benefits and livable incomes (and lifestyles) will get pushed further out of reach for more people.
Modern work is tech-centric, which means lots of sitting and plenty of snacking without much physical activity, a double-whammy that prevents employment or higher incomes from leading to healthier choices. For the less skilled, normally accessible jobs are in the sights of automation and disruption. While tech is taking over medicine and opening up new possibilities, it is also transforming the labor market and closing countless doors to workers.
By extension, technology is changing the social framework that determines public health. Income inequality is growing, wage growth is stagnant, and no amount of awareness can change these front-of-mind concerns for people who may well want to eat better and exercise more, or even commit to seeing the doctor more often and following his or her advice to the letter.
Poor people can’t necessarily eat better as a simple matter of choice or doctor’s orders. Planning meals and purchasing healthful foods taxes limited resources, time as well as money. Many lower-income individuals working three jobs to pay the bills also don’t have time to exercise. And more likely than not, even those in high-paying jobs are sitting all day, sapping their bodies of energy and resilience, undoing the good of their intentions and smart devices alike through attrition.
The one bit of technology that shows any promise for changing the social determinants of health is wearables, if only because they are designed to remain with patients. These could have an impact on individual health by encouraging exercise or healthier eating habits, but even then, only if people commit to using their devices and, even more importantly, have the bandwidth and resources to act on their recommendations. So we are right back to low-tech, socially focused solutions.
Between Knowing and Doing
We’ve ascribed a lot of cultural factors to the growth of chronic disease in America, but the clear culprit is a broad social shift to tolerate poor diet as a natural compromise of busy lifestyles and modern conveniences. That new normal has been compounded for a generation by technology’s influence.
Millennials, as a general trend, have grown up more globally aware and more concerned with healthful eating, thanks to a confluence of factors: environmental awareness, local identity, the fashion of food, and an inconsistent yet undeniable anti-corporate streak. But they are also more likely to delay seeing a doctor, and frankly, more likely to carry student debt and have lower overall purchasing power than older generations.
So the youth are becoming more conscious of health inputs and balanced lifestyles, yet have an inhibited capacity to act on this, because of their depressed purchasing power. They have also grown up during an unprecedented golden age for junk food, convenience meals, and generally poor nutrition. The culture they are making may be health-conscious, but the one they were born into (and raised in) was definitely not. Add in a near-dependence on ubiquitous digital technology, and you get a generation suffering alarming rates of chronic pain and chronic depression in tandem.
For all the hope we place in our technology to save us, we’ve already used it to double down on a social system that hurts us.
Back to Society
The health system often operates in spite of the social determinants of health. Anyone can go to the ER and get stabilized; not everyone is able to stay stable once they leave, because our culture and social norms easily overwhelm the discrete reach of clinics. In workplaces and homes, we’ve allowed new technology like cellphones to further entrench poor health habits rather than combat them. Wearables are as yet a mixed effort at changing this.
Now, our healthcare organizations threaten to make a similar mistake, putting increased focus on the digital signals that come out of health encounters, rather than recognizing the social determinants that led patients into them. We might be better off blending primary care with social work than with a government-sponsored Yelp!
Our new tools change the dressing but not the fundamental operations of our healthcare system. Creating more incentives to increase their integration hasn’t changed their ultimate function: facilitating clinical business as usual. When we allow wearables, EHRs, and telemedicine to extend caregiver reach, support patient compliance, and address the social determinants of health rather than just measure known wellness values, we might have a shot at using technology to save ourselves.