By Bobby Sherwood, vice president of product development, GuidingCare.
A lack of interoperability permeates U.S. healthcare. Despite the rapid adoption of new technologies, we have failed to fully realize some of the most impactful opportunities they present. Data silos that hinder collaboration, efficiency, and innovation stubbornly persist across the industry. For health plans, embracing digital transformation to digitize processes and improve the member experience pays dividends, but it can come with difficult integration and interoperability challenges if not done properly.
There has been a recent spotlight on government initiatives and regulations that address these growing concerns. Take the new CMS proposed rule on interoperability and prior authorization, which would require payers to implement an electronic prior authorization process, shorten the time frames for payers to respond to prior authorization requests, and establish policies that make the prior authorization process more efficient and transparent.
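To make the rule’s “electronic prior authorization process” more concrete, here is a minimal sketch of what such a request can look like in FHIR terms, loosely following the shape of HL7’s Da Vinci work in this space. The endpoint, identifiers, and codes below are hypothetical, not taken from the rule itself.

```python
# Minimal sketch of an electronic prior authorization request as a FHIR
# Claim resource with use = "preauthorization" (loosely following the
# HL7 Da Vinci approach). Endpoint, IDs, and codes are hypothetical.
import json
import urllib.request

prior_auth_request = {
    "resourceType": "Claim",
    "status": "active",
    "type": {"coding": [{"system": "http://terminology.hl7.org/CodeSystem/claim-type",
                         "code": "professional"}]},
    "use": "preauthorization",  # distinguishes this from a claim for payment
    "created": "2023-01-15",
    "patient": {"reference": "Patient/example-member-123"},
    "provider": {"reference": "Organization/example-clinic"},
    "priority": {"coding": [{"system": "http://terminology.hl7.org/CodeSystem/processpriority",
                             "code": "normal"}]},
    "insurance": [{"sequence": 1, "focal": True,
                   "coverage": {"reference": "Coverage/example-plan"}}],
    "item": [{"sequence": 1,
              "productOrService": {"coding": [{
                  "system": "http://www.ama-assn.org/go/cpt",
                  "code": "70551",  # illustrative CPT code (MRI of the brain)
                  "display": "MRI brain without contrast"}]}}],
}

req = urllib.request.Request(
    "https://payer.example.com/fhir/Claim/$submit",  # hypothetical endpoint
    data=json.dumps(prior_auth_request).encode(),
    headers={"Content-Type": "application/fhir+json"},
)
# response = urllib.request.urlopen(req)  # payer would return a ClaimResponse
```

The point of the sketch is the round trip: a structured request goes out, and an adjudication decision comes back in minutes rather than via days of faxes and phone calls.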
In a world where nearly anything can be instantaneously ordered from your mobile phone or laptop and delivered overnight, it seems inconceivable that prior authorization – something so critical to member and population health – is managed by an antiquated system. This seamless exchange of data would reduce provider abrasion, improve the member experience and potentially members’ health outcomes, and ultimately decrease the cost of care, as the manual effort and time tied to prior authorizations markedly decreases.
As we execute on the year ahead, interoperability remains top of mind for stakeholders: a new report suggests that barriers such as poor data quality and information sharing remain a challenge for over 60% of healthcare executives. For health plans prioritizing interoperability, three areas of focus stand out.
By Joey Cavanaugh, RN, chief operations officer, Zotec Partners.
The American healthcare system has long been burdened by interoperability issues that prevent easy access to and sharing of important patient health data. Amid the ongoing COVID-19 pandemic, those issues have created additional challenges for physicians, administrators, and other industry partners. If these problems persist, they could negatively impact provider business models.
Increased consolidation among physician groups during the pandemic has resulted in a corresponding increase in coding operations for many practices. Given the gap between the demand for coders and the trained talent available to meet that demand, organizations have increasingly shifted toward outsourcing to fill critical technical roles. The process of outsourcing these skills, combined with a surge in the number of labor hours needed to meet organizational objectives, could increase the time to code or decrease the quality of output, ultimately creating revenue cycle issues.
If organizations are not careful, staffing issues can cause fluctuations in data quality. With fewer personnel available to ensure the correct information is entered into the correct fields, some organizations have found it difficult to fully harness the power of healthcare solutions to streamline revenue management and operations. Moreover, understaffed facilities may struggle to make the changes to technologies and internal processes that would equip them to take advantage of government programs providing reimbursements for COVID patient care. New CPT codes for COVID have also required insurance companies to update their processes during the pandemic, adding complexity to how quickly providers can exchange medical information.
From a clinical perspective, the pandemic’s far-reaching impact on the healthcare system has manifested in the form of lost productivity, resource deprivation, HIPAA breaches, and other, often severe, consequences. However, the strain it has put on payers and revenue cycle management systems has been somewhat less visible to the public eye. In the concerted effort to support clinicians and mitigate the pandemic’s effects on frontline workers, getting the right data into the right hands so that services could be paid for in a timely manner was temporarily deprioritized. Unless the industry addresses these interoperability challenges, the cost of healthcare will continue to rise, as will the clunky experiences for both providers and patients.
By Lisa Esch, chief of strategy, innovation and provider industry solutions, NTT DATA Services.
Our healthcare system is in disarray. Healthcare organizations are overworked and understaffed as they deal with the ongoing pandemic, with half of all healthcare workers reporting they’ve experienced burnout during this time.
Technology has the potential to solve these challenges, and as more digital health options become available, healthcare practitioners are adopting tools that allow them to work more quickly and serve patients better.
Unfortunately, most health technology is developed in a vacuum, creating silos of critical information that are inaccessible and disconnected from patient care. Disjointed services mean key pieces of information are not available at the time and place they are required, and productivity suffers when healthcare practitioners have to navigate multiple source systems to retrieve data. In turn, this limits the number of patients who can be seen in a given period and can put human lives on the line if medically critical data is inaccessible in an emergency.
The healthcare community is beginning to embrace a solution to this problem: interoperability. Let’s explore what this means for healthcare providers and why it’s so important in the disorganized, digital healthcare system of 2022.
What’s healthcare interoperability? Why does it matter?
Interoperability services and tools bridge the gap between incompatible systems and data sets, providing a more seamless experience for both patient and provider. Interoperability has two components: the ability of systems to exchange health information, and the ability of those systems to interpret and use the information they receive.
As COVID-19 reshaped American healthcare, interoperability showed real progress, with care providers using shared health intelligence more than ever to make care better, safer, and more cost-effective, according to the Surescripts 2021 National Progress Report. The report shows how the Surescripts network helped inform billions of healthcare decisions—making prescriptions more affordable, boosting medication adherence, simplifying the specialty medication experience, and fortifying care management processes.
“This year’s National Progress Report demonstrates nationwide momentum toward interoperable, digital health intelligence sharing,” explained Tom Skelton, chief executive officer of Surescripts. “By leveraging the Surescripts network, healthcare professionals of all kinds are getting clinical intelligence at the right time, in the right place, so that they have the trusted insights they need to serve patients.”
The healthcare industry is under intense pressure to improve its efficiency. However, interoperability between technologies and various integrated systems presents many challenges that keep health facilities from being fully connected and productive.
We have known for years that healthcare needs solutions that artificial intelligence can provide. But the initial proofs of concept have taken too long to materialize. Without clear boundaries and use cases showing how AI in healthcare can work, leadership teams are unable to collaborate horizontally.
How AI in Healthcare Could Solve Interoperability Problems
Technology has the potential to transform the way healthcare works for patients, but right now, interoperability is difficult to attain. Despite industry standards such as Fast Healthcare Interoperability Resources (FHIR), data is still a messy business. Data is stored in different ways and in different silos — and not every facility has the ability to read and understand the information contained within those silos and make it actionable.
This has a heavy impact on how practitioners work with technology. A radiologist reading film and a doctor making a diagnosis for a chronic pain patient each have access only to their own siloed expertise. With AI solutions in healthcare, data can be drawn from different disciplines, and diagnosis can become faster and smarter.
When used in conjunction with AI, blockchain technology has the power to help practitioners and organizations work together without security risks. Because the blockchain represents a transparent, single source of information that cannot be changed, it can store data from multiple sources and create a harmonized picture of truth that different users can access without bias. In addition, limits can be put in place as to who has access to the data.
This helps healthcare experts form a central hub where the very best knowledge, therapies, and drug research can be pooled, therefore helping target diseases more effectively while keeping patient and research data absolutely secure and private.
It’s clear that leaders at healthcare organizations need to remove the siloed approach and develop an atmosphere of increased collaboration. But how, exactly?
How Blockchain, AI, and Healthcare Can Work Together
Blockchain technology in healthcare helps fulfill all four kinds of interoperability defined by the Healthcare Information and Management Systems Society: foundational, structural, semantic, and organizational. Blockchain’s uses in healthcare create a basis — a structure — where data can live safely and transparently. Then, blockchain can enable a rendering that helps different kinds of readers see and understand the data.
Two aspects of blockchain technology that are especially interesting to the healthcare industry are permissioned blockchains and smart contracts. A permissioned blockchain maintains the privacy of data, knows all the stakeholders, and makes data viewable only by actors on the network who are authorized to see it. Smart contracts are “instructions” on the blockchain that are executed automatically once all necessary conditions or events are met. This means decisions can be made automatically, without human intervention. That’s where the power of AI’s uses in healthcare really materializes. This harmonized dataset — coupled with safe and secure automation — means that AI can be used to make faster, better, and more predictive decisions.
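As a toy illustration of those two ideas (a teaching sketch, not a production blockchain), the snippet below pairs a hash-chained, append-only log that only permissioned parties may read with a simple contract rule that executes automatically once its condition is met. All names and the trigger condition are hypothetical.

```python
# Toy sketch: a permissioned, hash-chained log with a simple
# "smart contract" rule. Illustrative only, not a real blockchain.
import hashlib
import json
import time

class PermissionedLedger:
    def __init__(self, authorized_readers):
        self.authorized = set(authorized_readers)
        self.chain = []  # each block links to the previous block's hash

    def append(self, record):
        prev_hash = self.chain[-1]["hash"] if self.chain else "genesis"
        body = {"record": record, "prev": prev_hash, "ts": time.time()}
        # Hash the block contents; tampering with any block breaks the chain
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.chain.append(body)
        self._run_contracts(record)

    def read(self, reader):
        # Permissioned access: only known, authorized actors may view data
        if reader not in self.authorized:
            raise PermissionError(f"{reader} is not authorized")
        return list(self.chain)

    def _run_contracts(self, record):
        # "Smart contract": fires automatically when its condition is met,
        # e.g., release a dataset once all required signoffs have arrived
        if record.get("signoffs", 0) >= 3:
            print("Contract triggered: releasing de-identified dataset")

ledger = PermissionedLedger({"hospital_a", "lab_b"})
ledger.append({"event": "consent_recorded", "patient": "p-001"})
ledger.append({"event": "trial_result", "signoffs": 3})
print(len(ledger.read("hospital_a")), "blocks visible to hospital_a")
```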
Data is the engine behind AI, but it’s also becoming the engine behind healthcare systems and how doctors diagnose and treat patients. If we can aggregate and translate vast amounts of data into streamlined workflows, AI can be used to efficiently diagnose and monitor patients, detect illness, accelerate drug development, and seamlessly run clinical trials.
The ingredients for interoperability are all there, but it’s now up to operators and developers to find ways to work together. The benefits of AI in healthcare are massively transformative — as long as we can find ways to solve problematic perceptions of blockchain and data privacy and get human beings to open up their silos.
No one technology will save the future of healthcare interoperability. It will take collaboration between developers, operators, academics, drug researchers, and an interwoven stack of technologies to bring together a universe of data and put it to good use.
The Cures Act Final Rule’s technical requirements call for radical changes in the exchange of electronic patient health information (ePHI). Care providers must adhere to the Conditions of Participation (CoP) requirements for patient event notifications (ADT notifications) and the real-time exchange of ePHI through APIs in 2021. In addition, payer organizations must facilitate the electronic exchange of ePHI between other payers and healthcare providers through a patient access API. They must also give patients a list of care providers to choose from for medical services by publishing a provider directory API.
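From a client’s perspective, a patient access API call might look roughly like the sketch below. The base URL, token, and member ID are placeholders, and real deployments gate access behind SMART on FHIR / OAuth 2.0 consent flows.

```python
# Hypothetical client calls against a payer's Patient Access API and
# provider directory API (FHIR R4). Base URL, token, and IDs are
# placeholders; real access requires an OAuth 2.0 consent flow.
import json
import urllib.request

BASE = "https://payer.example.com/fhir"        # hypothetical endpoint
TOKEN = "replace-with-oauth2-access-token"     # obtained via consent flow

def fetch(resource_query):
    req = urllib.request.Request(
        f"{BASE}/{resource_query}",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/fhir+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Adjudicated claims for one member, exposed via the patient access API:
# eobs = fetch("ExplanationOfBenefit?patient=Patient/member-123")

# Provider directory lookup, so members can choose a care provider:
# clinics = fetch("Organization?type=prov&address-city=Denver")
```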
These technical requirements are driven by the CMS’s pursuit of seamless semantic interoperability of healthcare systems and the ONC’s 2015 Edition certification requirements for Certified Electronic Health Record Technology (CEHRT). While they affect care providers and payers, health IT developers (HIT vendors) are the catalyst that makes patient-centric care possible.
HIT vendors must swing into action to meet their own regulatory requirements and, in the process, enable providers and payers to meet theirs. The stiff competition already upon them only raises the bar for innovation and response time. HIT software development requires specialized skill sets and exhaustive processes that escalate costs. In a bid to rein in these costs and adhere to regulatory requirements, HIT developers risk diluting their competitive edge.
While interoperability has always been one of healthcare’s greatest pain points, the last year or so has amplified these challenges amid rising demand for data integration and information sharing. The pandemic has required high volumes of data integration, and it’s been difficult for organizations to adapt and respond in an effective and efficient way.
These challenges were further compounded this year with the impending ONC/CMS information blocking rules. With the previous administration’s focus on improving interoperability coinciding with a global health emergency, healthcare organizations had more on their plate than ever. As we look to the future of healthcare in a post-COVID environment, and to the new administration and its healthcare goals, what can healthcare organizations expect?
Healthcare organizations must remain flexible and optimize for adaptability. In our interview with Ivan, we explore what healthcare organizations should know about the information blocking rules and the new administration, what is really at the root of the healthcare interoperability problem, and the best practices healthcare leaders can employ to set their organizations up for success now and in the future.
How would you define the healthcare interoperability problem?
Interoperability is an evergreen problem across the healthcare industry. As we continue to innovate new capabilities and concepts, we are also constantly expanding our interoperability needs. In a way, interoperability isn’t a problem to be solved. It’s an ongoing practice that has to evolve alongside our other capabilities. For example, there was a time not long ago when social determinants of health (SDoH) were not on anyone’s radar, but as SDoH became more important to healthcare practitioners, it was clear we needed not only to track and store SDoH-related data but also to exchange that data across different software systems and organizations. The goal of HL7’s Gravity Project is to build out the standards for exchanging SDoH data using FHIR.
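To make that concrete, here is a minimal sketch of how one SDoH screening answer could travel between systems as a FHIR Observation. The codes are illustrative of the Gravity Project’s direction rather than copied from a published profile.

```python
# Minimal sketch: a food-insecurity screening answer as a FHIR Observation.
# Codes are illustrative; a conformant implementation would follow the
# Gravity Project's published SDOH Clinical Care profiles.
sdoh_observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{"coding": [{
        "system": "http://example.org/sdoh-categories",  # illustrative system
        "code": "food-insecurity"}]}],
    "code": {"coding": [{
        "system": "http://loinc.org",
        "code": "88122-7",  # Hunger Vital Sign screening question (illustrative)
        "display": "Worried food would run out in past 12 months"}]},
    "subject": {"reference": "Patient/example-123"},
    "valueCodeableConcept": {"coding": [{
        "system": "http://loinc.org",
        "code": "LA28397-0",  # "Often true" answer code (illustrative)
        "display": "Often true"}]},
}
```

Because the question, the answer, and the patient are all coded rather than free text, any system that speaks FHIR can store, query, and act on the result.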
2020 was a tough year in healthcare. Demand for data integration surged, exposing the dire need for better information sharing across the healthcare ecosystem. In a world where interoperability wasn’t an issue, how could the pandemic have looked different?
The bad news is that we live in a world where the most reliable COVID vaccination records are stored on paper cards and interoperability is achieved by the patient themselves carrying the card from place to place. In an ideal world, the vaccination would come with an electronic record that the patient could capture on their mobile device and upload to their doctor’s EHR system, their employer’s HR system, and any other third party that needed to see proof of vaccination.
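A minimal electronic equivalent of that paper card might be a FHIR Immunization resource the patient holds digitally, roughly the kind of payload a SMART Health Card carries as signed JSON. All details below are illustrative, not real records.

```python
# Illustrative electronic equivalent of a paper vaccination card: a FHIR
# Immunization resource. Codes, dates, and identifiers are examples only.
covid_immunization = {
    "resourceType": "Immunization",
    "status": "completed",
    "vaccineCode": {"coding": [{
        "system": "http://hl7.org/fhir/sid/cvx",
        "code": "208",  # CVX code for a COVID-19 mRNA vaccine (illustrative)
        "display": "COVID-19 mRNA vaccine"}]},
    "patient": {"reference": "Patient/example-123"},
    "occurrenceDateTime": "2021-04-15",
    "lotNumber": "EXAMPLE-LOT-001",
    "performer": [{"actor": {"display": "Example County Vaccination Site"}}],
}
# A patient app could present this same payload to a doctor's EHR, an
# employer HR system, or any verifier, instead of a paper card.
```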
Although we’ve fallen far short of the ideal state, there are some interoperability bright spots to be happy about. For example, we’ve been able to onboard many new sources of lab result data and integrate that data into public health departments’ systems. This has not always been easy, but because of the ONC’s prior work on the Promoting Interoperability program, we already had agreed-upon standards and an infrastructure in place to move the data from location to location.
By Matthew Oldham, vice president of engineering, Graphium Health.
Over the course of my career, working in a variety of industries, I have developed certain data modeling design patterns that guide my approach to tackling a new data domain. One simple example is how I choose the right data type for a given value an application will capture.
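A small, hypothetical illustration of that decision: a value like a medication dose is better captured as a typed quantity than as free text, because the type itself then enforces meaning.

```python
# Hypothetical example of choosing the right data type: capturing a dose
# as a typed quantity rather than a free-text string like "2 g IV".
from dataclasses import dataclass
from enum import Enum

class DoseUnit(Enum):
    MG = "mg"
    G = "g"
    ML = "mL"

@dataclass
class Dose:
    amount: float   # numeric, so it can be validated and aggregated
    unit: DoseUnit  # constrained vocabulary, not free text

dose = Dose(amount=2.0, unit=DoseUnit.G)  # unambiguous
# versus: dose = "2 g IV"  # a string the application cannot reason about
```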
While it may sound straightforward, interesting nuances can quickly surface during the data modeling step that necessitate a shared language and vocabulary between the functional experts and the software engineers. In other words, we need to figure out how to work together and speak the same language in order to solve the problem well.
The importance of nuanced semantics may be illustrated with the example of how an anesthesiologist documents the administration of an antibiotic. The type and timing of antibiotic administration is a key metric that anesthesia providers have historically had to report to the Centers for Medicare &amp; Medicaid Services (CMS), since it correlates with both patient outcomes and healthcare costs.
As I analyzed the paper anesthesia record in use, I noticed an “antibiotics” checkbox, accompanied by an antibiotic name, an amount, a unit of measure, and the route of administration. These all made sense to me, and I proceeded to incorporate these concepts into my data model. For the antibiotics checkbox, I naively interpreted it as a simple boolean value, and I named it Antibiotics Administered Indicator. In my mind, it simply indicated that the antibiotic denoted on the form was either administered (true) or not administered (false).
During a review of the model, I learned that a clinician interprets this checkbox to mean an “indication for antibiotics”; in other words, antibiotics were or were not determined as a necessary course of action given other clinical conditions. A true value didn’t mean that antibiotics were administered, only that they were indicated, and thus needed to be given. That is obviously a completely different understanding than the one at which I had arrived. Needless to say, this was eye opening for me, even having been down the road of developing a functional understanding of data domains many times before.
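In code, the difference between my naive reading and the clinical meaning is the difference between one boolean and two. A hypothetical before-and-after of the model might look like this:

```python
# Hypothetical before-and-after of the antibiotics checkbox model.
from dataclasses import dataclass
from typing import Optional

# Naive model: the checkbox read as "antibiotics were administered".
@dataclass
class AnesthesiaRecordV1:
    antibiotics_administered: bool

# Corrected model: the checkbox means "antibiotics were indicated";
# administration is a separate fact with its own supporting fields.
@dataclass
class AnesthesiaRecordV2:
    antibiotics_indicated: bool            # clinical determination
    antibiotic_name: Optional[str] = None  # populated only if administered
    amount: Optional[float] = None
    unit: Optional[str] = None
    route: Optional[str] = None

    @property
    def antibiotics_administered(self) -> bool:
        # Administration is inferred from documented details, not the checkbox
        return self.antibiotic_name is not None
```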
This illustration highlights the importance of having both functional (i.e., clinician) and technical perspectives present and engaged during software design. A purely technical survey of a subject area will certainly be valuable, and in some cases may provide decent coverage in terms of establishing a foundational understanding of that domain. In most cases, however, a functional perspective will also be required to complete the picture and add the insight necessary to create an accurate and intuitive user experience.
In fact, healthcare may serve as the poster child for just how challenging, complex, and unforgiving software design can be. Clinician dissatisfaction and fatigue with existing electronic health record software is well documented, and the explanations are plentiful: failed interoperability, difficult user experience, inefficiency with simple tasks, onerous data capture burden, etc. Perhaps the common denominator is a failed understanding of complex and poorly defined clinical workflows being interpreted and standardized in software by technical experts working in isolation. The real issue is that foundational errors propagate as the software evolves, and there is no easy way to reverse course once construction begins.