In the face of this growing healthcare demand, the supply of medical generalists has been consistently trailing the supply of specialists. By 2030, a study from the Association of American Medical Colleges estimates a shortfall of between 14,800 and 49,300 primary care physicians, as well as a shortage in non-primary care specialties of between 33,800 and 72,700 physicians.
Compounding this issue, the U.S. population is estimated to grow nearly 11 percent by 2030, with those age 65 and older increasing by 50 percent. As physicians themselves begin to retire, the problem will only be exacerbated.
While digital technology has been improving access to healthcare services for quite some time, efficiencies such as virtual care need to be implemented widely in order to address the impending physician shortage and maximize the delivery of quality care.
This implementation will be somewhat natural as patient access and services continue to evolve from live voice interactions to digital solutions. Several healthcare providers have already taken this step toward virtual care and are showing strong results for patient satisfaction.
A virtual visit pilot program conducted by Brigham and Women’s Hospital found a 97 percent satisfaction rate among patients with access to these new communications and care options, with 74 percent stating “that the interaction actually improved their relationship with their provider.” They also found that 87 percent of patients said they would have needed to come into the office to see a provider face to face if it weren’t for their virtual visit.
Kaiser Permanente Northern California (KPNC) has a similar offering, providing a suite of apps enabling members to exchange secure messages with their clinicians, create appointments, refill prescriptions, and view their lab results and medical records. As a result, the number of virtual visits has tripled to 10.5 million over the last six years.
At Valley Health, a tele-ICU has provided a viable solution to reduce mortality rates. During the first year of its implementation, the technology helped save 125 lives, reduce ICU length of stay by 34 percent, and also reduce the sepsis mortality rate.
These examples show how virtual care can aid patients when they first need help, but the care journey does not stop there. It continues with prescriptions, labs, imaging, and referrals to other care providers. In these instances, virtual care can be used as a follow-up and check-in tool so that patients no longer need to visit their physician in person; they can quickly interact with them from the comfort of home.
America’s healthcare system is notoriously disjointed, with patchwork information technology and disparate data. The quest for interoperability as an answer – the ability to easily share health records between sites of care – has had varying levels of success. But it’s a solution that is crucial to healthcare’s value-based transformation and can’t be allowed to fail, especially by going too slow to be meaningful.
The challenges associated with interoperability – from fragmented sources of patient information to data being kept in varied formats, to difficulties using an electronic health record (EHR) as a secure central “home” for a patient’s data – were highlighted in a recent American Hospital Association (AHA) report on “The Hospital Agenda for Interoperability.”
The report highlighted challenges for providers and underscored that a collaborative approach is necessary for improving the lives of families and their caregivers for the long-term. The AHA report not only represents frustrations with EHR and Health Information Exchange (HIE) but calls for extending efforts at digital transformation that logically layer on top of that.
As the debate and progress inch forward toward 2020, it is vital to return to what hospitals are telling us.
Beyond technical challenges
The difficulty in interoperability goes beyond technicalities. For example, while healthcare providers generate clinical data and payers create claims data, their current structures are not conducive for synchronization and insight generation. Payers and providers have different objectives (whether it’s clinical notetaking, billing, clinical decision support, etc.) so there’s more to the underlying friction than just a variety of data formats.
The AHA calls out additional reasons for issues with interoperability and the high costs that result. That includes expensive workarounds, overcomplicated user interface design, lack of documentation consistency, unrealistic expectations for technical solutions, issues with regulatory compliance for data security, privacy and use, and pricing models that “toll” information sharing. It’s also clear that business and technical challenges with interoperability should not be conflated – each technology Band-Aid further burdens healthcare organizations with ad hoc, un-intuitive technology that will cost more over time and fail to solve interoperability challenges.
In the quest to use technology to save money, improving interoperability between healthcare systems and using powerful data analytics to extract insights from systems working in concert appears to be an expensive and elusive goal. But the cost of not committing to a more universal approach is far greater, because it prevents clear and effective insights on where improvements are needed. Insights are only ever as good as the data that feeds them, and if there isn’t a clearly defined data analytics and data governance strategy aligned with industry best practices, the results will be half-baked.
The need for leadership
Leaders who have not been exposed to technology outside of healthcare may not be aware of open source, cloud-based tools that can rapidly and more cost-effectively meet interoperability needs. As they implement piecemeal or outdated, point-to-point solutions, technology costs rise significantly, while perception of technology’s efficacy goes out the window. Worse, paying to ship data to third-party vendors instead of focusing on an internal, overarching connectivity strategy involving every vertical means opportunities to maximize their data’s potential will be lost.
In other words, healthcare organizations might find that they are ironically spending immense amounts of money on technology intended to reduce costs under a value-based payment system. But they might not be spending it wisely. And as other industries steam ahead with movements akin to interoperability on a national scale, leaders in healthcare need to stay focused on the larger task of consistency so as not to fall even further behind. Unlike industries like retail, the quality of life for patients and their caregivers is at stake.
Technology – and business – challenge
As the AHA report emphasizes, the challenges involved go beyond technology and land firmly in the realm of business. Healthcare organizations need to be open to making health data available, whether it’s a secure transfer to another provider, or to the patients. CMS recently renamed “meaningful use” to “promoting interoperability” in efforts to provide a model and further incentives for “advancing care information.”
In the United States, there are approximately 40 million patients with kidney disease; 450,000 of these patients are on dialysis. However, dialysis treatments can be lengthy, expensive, and not fully covered by insurance – leading to many complications for kidney patients. In addition, dialysis is not a good long-term solution for kidney disease, and it can contribute to the development of several other diseases and conditions. Why do kidney failure patients need dialysis, and how can we reduce the prevalence of kidney complications in the United States?
What is dialysis?
Our kidneys are the organs our body uses to filter waste from foods, medications, and toxic substances that may enter our system. They make urine to remove wastes from our blood and extra water from our body to help us stay healthy. Kidneys are also important in maintaining overall fluid balance and regulating the minerals that circulate throughout the body. They also produce helpful hormones that control red blood cell production, bone health, and blood pressure regulation.
When your kidneys shut down due to kidney complications, you cannot maintain these key bodily functions. More than 8 million Americans suffer from chronic kidney disease, according to the nonprofit Wait List Zero. Of these patients, approximately 450,000 are undergoing dialysis, the only treatment that can keep them alive. Dialysis is a treatment that replaces some kidney functions by:
Regulating blood pressure
Removing waste, salt, and excess water to prevent buildup in your body
Regulating levels of certain chemicals in your blood, such as bicarbonate, potassium, and sodium
The current cost of dialysis
Dialysis is a lengthy and intensive procedure, with many patients attending three sessions a week at four hours each. It only performs about 10 percent of a kidney’s function, making it a short-term life-saving solution that is not suitable for long-term treatment. Only 33 percent of dialysis patients live past five years. One in four patients passes away within 12 months.
We live in a world where medical errors are the third leading cause of death behind cancer and cardiac disease, leading to more than 200,000 preventable deaths every year. We have an aging population growing at an unprecedented rate: 8.5 percent of people worldwide (617 million) are aged 65 and over, and this percentage is projected to jump to nearly 17 percent (1.6 billion) by 2050, leading to an anticipated physician shortage of more than 50,000 by 2025.
On top of all of this, healthcare costs are projected to increase to over 25 percent of GDP in the United States by 2025. The convergence of these events is pushing the entire industry to begin leveraging technology more than it has in the past.
Many of these challenges can be remedied by leveraging industrial IoT (IIoT) technology that’s been proven to solve similar challenges in other industries. Could an interoperable, connected healthcare platform that applies the principles of an IIoT connectivity architecture to share data throughout the healthcare system be the cure for our ailing healthcare system?
West Health, now the Center for Medical Interoperability, seems to think so. In 2013 they published a report showing how an interoperable, connected healthcare system could provide nearly $30 billion in industry savings while improving patient outcomes in the process. These connected healthcare platforms provide the foundation for innovation that is needed to make a meaningful data-driven change in healthcare. It’s these platforms that open the door to application developers everywhere to create modality-specific applications using artificial intelligence and machine learning.
So what exactly is a connected health platform, and how does it provide a foundation for transformational change in healthcare? First, a connected health platform consists of hardware (gateways and servers) and embedded software components that are designed to take all of the data from any medical device (clinical or remote) and convert it into a single usable format that gives providers access to a complete data set.
This connected platform will provide a variety of user interfaces, analytics and clinical applications to help users throughout the healthcare ecosystem distill value from this newly-gathered data. The applications range from the early detection of sepsis, to predicting cardiac arrest, to providing business analytics like bed and device utilization.
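The conversion step described above — mapping each device’s vendor-specific output into one common record shape — can be sketched in a few lines. This is a minimal illustration, not a real platform API; the vendor names and field mappings below are hypothetical.

```python
def normalize_reading(raw: dict, source: str) -> dict:
    """Map a vendor-specific device reading into one common record shape.

    The vendor identifiers and field names here are invented examples;
    a real platform would carry far richer metadata (units, device IDs,
    calibration status, etc.).
    """
    field_maps = {
        "vendor_a": {"pt": "patient_id", "hr": "heart_rate", "t": "timestamp"},
        "vendor_b": {"patientId": "patient_id", "pulse": "heart_rate", "time": "timestamp"},
    }
    mapping = field_maps[source]
    # Rename each vendor field to its common name.
    record = {common: raw[vendor] for vendor, common in mapping.items()}
    record["source"] = source  # keep provenance for auditing
    return record

# Two readings from different devices end up with identical keys,
# so downstream analytics see one complete, uniform data set.
a = normalize_reading({"pt": "p1", "hr": 72, "t": "2019-01-01T10:00:00Z"}, "vendor_a")
b = normalize_reading({"patientId": "p1", "pulse": 75, "time": "2019-01-01T10:05:00Z"}, "vendor_b")
```

Once every reading shares the same schema, applications such as sepsis detection or utilization analytics can be written once against the common format rather than per device.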
The connected health platform will become the center of an ecosystem for further application development, similar to that of an online app store — but with built-in medical-grade safety and security. The connected health platform must ensure data security and patient privacy by aligning to guidance provided by the FDA on cybersecurity, and meeting the standards defined by HIPAA.
However, these connected health platforms are only as effective as the data they capture, which is determined by the connectivity frameworks they are built upon. Many of the currently deployed platforms are not platforms at all, but a collection of disparate systems that provide silos of individual device data. These legacy systems have been built using internally-developed, proprietary, message-based communication technology.
As the first step towards the development of a connected health platform, modern web services-based communication has been deployed on top of the legacy technology to begin integrating all of the disparate data streams via onsite data centers or the cloud. Although this is a step in the right direction, these platforms are far from complete. Because of the legacy communications infrastructure they are built upon, they can aggregate only a portion of the data, making these systems a poor fit for true near-patient, real-time clinical decision support – the key to efficiently providing improved patient outcomes.
The EMEA market for electronic health record (EHR) IT is estimated to have been worth $3.7B in 2018 (both acute and ambulatory applications) according to Signify Research’s 2019 global EMR market report. However, only one vendor, Cerner, is estimated to have had a double-digit revenue share in 2018. In terms of suppliers the EMEA market is highly fragmented with a mixture of local and international vendors addressing individual countries with few vendors having a truly region-wide footprint.
The table below shows estimated revenue shares in 2018 for the acute & health system EHR market in EMEA (excluding revenues for ambulatory only solutions), alongside the countries/sub-regions where each vendor had a significant share of its EMEA business.
Acute/Network EHR Estimated Revenue Share – EMEA – 2018

| 2018 Rank | Company | 2018 Revenue Share | Key Countries/Sub-Regions |
|---|---|---|---|
| 1 | Cerner | 16% | DACH, UK/Eire, Middle East |
| 2 | Agfa Health | 8% | DACH, France |
| 3 | Asseco | 5% | Eastern Europe |
| 4 | CompuGroup Medical | 5% | DACH, Nordics, Eastern Europe |
| 5 | InterSystems | 4% | UK/Eire, Middle East, Spain/Portugal |
| 6 | Chipsoft | 4% | Benelux |
| 7 | DXC Technology | 3% | Benelux, UK/Eire, Spain/Portugal |
| 8 | Tieto | 3% | Nordics |
| 9 | Dedalus | 3% | Italy, France, Spain/Portugal |
| 10 | Nexus | 3% | DACH, Benelux |
| 11 | Engineering Ingegneria | 3% | Italy |
| 12 | Systematic | 2% | Nordics |
| 13 | Epic | 2% | Benelux, Nordics, UK/Eire, Middle East |
| 14 | Telekom Health | 2% | DACH |
| 15 | Maincare Solutions | 2% | France |
| – | Others | 39% | – |

Source: Signify Research
Note: Does not include ambulatory-only revenue/vendors
In a drive for standardization, the proposed U.S. Core Data for Interoperability, mandated by the 21st Century Cures Act, calls for the identification and refinement of a basic set of clinical data to be required for all electronic health records. The American Medical Informatics Association, however, recommends that healthcare researchers and providers concentrate on sharing data as quickly as possible, deferring the creation of standards until sometime in the future. “This is a false choice and a distraction,” said James D’Arezzo, CEO of Condusiv Technologies.
D’Arezzo, whose company is the world leader in I/O reduction and SQL database performance, adds, “What the healthcare industry really needs to focus on is enabling its heavily overburdened IT infrastructure to do the job it’s being asked to do.”
The industry’s preoccupation with interoperability and standardization, notes D’Arezzo, is perfectly understandable. Turf wars over proprietary interfaces and protocols are having a major impact on healthcare IT budgets. Incompatible electronic health records contribute significantly to the fact that computerized record keeping consumes more than 50 percent of the average physician’s workday, which now stretches to more than 11 hours. Healthcare organizations struggling to process this tsunami of data are frustrated by the number and variety of analytics tools they are forced to use.
Supporting all this activity, however—unnoticed and, says D’Arezzo, dangerously neglected—is the basic computational machinery itself. Data analytics requires a computer system to access multiple and often far-flung databases, pulling information together through millions of individual input-output (I/O) operations. The system’s analytic capability is dependent on the efficiency of those operations, which in turn is dependent on the efficiency of the computer’s operating environment.
According to experts, the most widely used operating system, Microsoft Windows, is in many ways the least efficient. In any storage environment, from multi-cloud to a PC hard drive, Windows penalizes optimum performance because of server inefficiencies in the handoff of data to storage. This is a problem that, left untreated, worsens with time. The average Windows-based system pays a 30 to 40 percent penalty in overall throughput capability because of I/O degradation.
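Because analytic capability ultimately rests on the efficiency of individual I/O operations, it can be instructive to measure raw throughput directly. Below is a rough, standard-library-only sketch that times block-sized sequential reads of a scratch file; the absolute numbers it reports depend heavily on OS caching, the filesystem, and hardware, so treat it as an illustration of the measurement, not a benchmark.

```python
import os
import tempfile
import time

def measure_read_throughput(size_mb: int = 8, block_kb: int = 64) -> float:
    """Write a scratch file, then time block-sized sequential reads of it.

    Returns approximate read throughput in MB/s. Results vary widely with
    OS page cache, filesystem, and storage hardware.
    """
    block = block_kb * 1024
    payload = os.urandom(block) * (size_mb * 1024 // block_kb)
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(payload)
        path = f.name
    try:
        start = time.perf_counter()
        total = 0
        with open(path, "rb") as f:
            while chunk := f.read(block):
                total += len(chunk)
        elapsed = time.perf_counter() - start
    finally:
        os.unlink(path)  # clean up the scratch file
    return (total / (1024 * 1024)) / elapsed
```

Running the same measurement periodically on a system under load is one simple way to observe the kind of gradual I/O degradation the article describes.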
By Karen Way, global practice lead for data and intelligence, NTT DATA Services.
A recent study conducted by NTT DATA Services and Oxford Economics highlighted the top three challenges identified by healthcare executives and consumers: standardizing and sharing of data across the healthcare spectrum, preparing for and adapting to regulatory changes and recruiting or retaining the right resources. It’s understandable that these challenges rise to the top of the list, as trying to meet rising consumer demands for access to their healthcare data while maintaining regulatory compliance with limited resources is a bit like juggling raw eggs. If your timing or skills are just a bit off, you end up with egg on your face.
How can a healthcare organization address these challenges? First, it’s important to understand exactly which challenges are present within your own organization, and how they are impacting the patient experience.
Data Sharing
Study results showed that only 24 percent of healthcare organizations share data across the business. Why? There are several reasons:
Interoperability – Even though this concern has been expressed since the inception of the EHR/EMRs, there are still barriers to being able to communicate data between different systems in a consumable, usable format. For example, several years ago, I had a CT scan due to the sudden onset of a continuous migraine headache. After the scan, I was referred to a neurologist who practiced out of another hospital system. I took a copy of the scan (on a DVD) to my specialist appointment, but the doctor was unable to view it on her system. As a result, I had to have another CT scan for the neurologist to see what may have been happening. Data already collected could not be used due to lack of standardization across systems. As noted in Dr. Eric Topol’s book Deep Medicine, “Your ATM card works in Outer Mongolia, but your electronic health record can’t be used in a different hospital across the street.”
Data Volume – The volume of data being generated daily in the healthcare industry has been estimated to be approximately 30 percent of the worldwide total. With estimated global data generation rates of approximately 2.5 quintillion (2.5 × 10¹⁸) bytes per day, that’s a lot of healthcare-related data. Healthcare organizations aren’t even beginning to tap the depths of this data, simply due to the data volume.
Disparate Data – Like patients, healthcare data comes in many different shapes, sizes and languages. Even if interoperability issues didn’t exist, sharing of data across the healthcare business is hard because of these differences. Data can be an image, a PDF, a written note, a prescription label, etc. Historically, each type of data requires different mechanisms for managing it, often using different tools or systems.
These three factors combined can be overwhelming for healthcare organizations whose main goal is to provide the best healthcare possible.
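One small way to illustrate the disparate-data problem above: each data type (image, PDF, note, prescription label) has its own structure, but all can be wrapped in a common envelope so they are at least addressable and shareable in a uniform way. The handler names and fields below are purely illustrative, not any real healthcare standard.

```python
def to_envelope(kind: str, payload, patient_id: str) -> dict:
    """Wrap heterogeneous healthcare data items in one common envelope.

    Each kind gets its own small handler that extracts a uniform body.
    Kinds, handlers, and field names are invented for illustration.
    """
    handlers = {
        "note": lambda p: {"text": p},
        "prescription": lambda p: {"drug": p["drug"], "dose": p["dose"]},
        "image": lambda p: {"format": p["format"], "size_bytes": len(p["data"])},
    }
    return {
        "patient_id": patient_id,
        "kind": kind,
        "body": handlers[kind](payload),
    }

# Very different payloads now share one outer shape:
note = to_envelope("note", "Patient reports continuous migraine.", "p1")
rx = to_envelope("prescription", {"drug": "sumatriptan", "dose": "50mg"}, "p1")
```

The point of the sketch is the design choice: tools downstream only need to understand the envelope, while type-specific logic stays isolated in the handlers.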
Resource Management
Another leading challenge identified in the study is that of recruiting and retaining skilled resources. Of the healthcare executives surveyed, only 51 percent stated that they were able to recruit and retain resources with the required skills and knowledge. There are two components to this issue:
Upskilling resources – With the continual advancements in technology, it is important to ensure that resources can take advantage of training opportunities and professional growth. This is often a delicate balance for organizations; time spent in training is often seen as time away from projects with deadlines.
Healthcare experience – While there may be a pool of qualified resources with the required technical skills, it can be hard to find resources with knowledge and/or experience in the healthcare sector. For example, several years ago, a client brought in resources to support their enterprise data warehouse that had extensive experience in data warehousing. There were high expectations of new and improved functionality due to the technical depth of these resources. Unfortunately, the project did not deliver as expected. Why? None of the resources had knowledge of healthcare data or business processes. One resource posed the question to the client: “what is a healthcare claim?”
Just as important as reviewing the patient experience for ways technology can solve the problem, it helps to treat your workforce like your customer and improve the experience of transforming the organization into a digital-first enterprise. A recent article in the Wall Street Journal highlights the difficulty in recruiting resources and approaches to solving this challenge. The article also reinforces that in today’s data economy, it is no longer enough that a resource be technically skilled, they must also have knowledge of the business environment in which they are applying the technology.
By Jessica Kojima, vice president of professional services, AdvancedMD.
Medical providers often battle inefficient and piecemeal software configurations that constitute the technological lifelines of their practices. Even if current solutions aren’t integrated to meet their needs, they sometimes feel it is easier to stick with what they’ve got instead of considering the possibility of upheaval.
Staying current—or better yet, striving for innovation—often comes at a price of temporary disruption for providers and staff. But if a new electronic health record (EHR) and practice management (PM) system are implemented with proper planning and stakeholder buy-in, the practice can realize true efficiency without substantial disruption.
Here are three best practices for implementing a new EHR or PM system for medical practices to minimize disruption.
Consider the big picture
Before the EHR/PM implementation process kicks off, the practice should assess the effectiveness of its clinical and administrative workflows. A practice that wants to cram a new EHR or PM into an existing workflow that isn’t working is thinking short-term. It’s important to consider “big picture” goals and aim to improve bad processes and optimize productivity. An effective approach will realize efficiencies across an entire practice versus one function or role. By considering the importance of productivity optimization across the whole operation, a holistic and integrated approach will enable growth and improvement for both patients and staff.
Preparing for payer migration
While considering the practice’s overall goals, the practice must also perform a payer analysis. A new PM system will require new Electronic Data Interchange (EDI) agreement enrollments to electronically submit and receive claims, remittance, eligibility, and claims status, which differ from payer to payer. This is often a forgotten aspect of moving to a new practice management system, and is arguably the most critical change. The practice should have the following items ready well before implementation:
List of payer IDs as they are credentialed. This information can be found by reviewing how the practice submits and receives electronic transactions currently, or by contacting the payer directly.
The official name as credentialed.
Understand whether the providers are credentialed as a group or as individual providers.
Tax ID.
National provider identification (NPI) number.
Many payers also require online registrations, voided checks, letters on specific letterhead—to name a few—to process an agreement. This demands timely attention by a designated individual in the practice for a seamless enrollment process. The practice should also set a “go-live” date to start submitting transactions through the new system. Preparing adequately in the weeks ahead of implementation will enable a timely kickoff of the new system.
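The per-payer checklist above lends itself to a simple data structure, so gaps surface well before the go-live date. This is a hedged sketch only; the field names are illustrative and not a payer-mandated schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class PayerEnrollment:
    """Items a practice should gather per payer before EDI enrollment.

    Field names are illustrative, mirroring the checklist in the text,
    not any real clearinghouse or payer schema.
    """
    payer_id: str            # payer ID as credentialed
    credentialed_name: str   # official name as credentialed
    group_enrollment: bool   # True if credentialed as a group, False if individual
    tax_id: str
    npi: str                 # National Provider Identifier

def missing_fields(enrollment: PayerEnrollment) -> list:
    """Return the names of any empty checklist items for this payer."""
    return [name for name, value in asdict(enrollment).items()
            if value in ("", None)]
```

A designated staff member could build one `PayerEnrollment` per payer and run `missing_fields` on each in the weeks before implementation, turning the "forgotten aspect" of payer migration into an explicit, checkable list.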
Focus on people
Instinctively, an EHR and PM implementation might seem to be all about the software technology. Not so: it’s about the people. It’s vital that a medical practice identifies an implementation champion who will be responsible for both supporting and managing the transition. Depending on how aggressive the go-live goal is, the practice may need a practice management and a clinical champion to ensure all of the configuration and training occurs simultaneously. The champion’s role is to listen to every staff member affected by the transition and communicate openly with the team about why the switch is being made and how the leadership will support them. Ideally, everyone agrees on the approach, objectives, and expected outcomes.