Any patient matching improvement strategy must look beyond technology and emphasize the people and processes that play a critical, yet often overlooked, role in ensuring data integrity. That was the message Just Associates, Inc., shared with the Centers for Medicare and Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC) in response to requests for information on patient matching in conjunction with proposed rules to advance interoperability.
In its letter to the ONC, Just Associates noted that while it supports the focus on interoperability and usability and agrees with the importance of accurate patient matching, it does not believe that the concept of a “technology alone” solution is realistic. Any improvement strategy must also include data standardization and promote a more consistent, comprehensive collection of patient data at all entry points.
“Every technology has its flaws when it comes to patient matching, and the importance of training staff, developing and maintaining comprehensive data governance policies, ensuring executive support for data governance, and vigilant efforts on measuring and reporting data quality are critical. We cannot ignore the ‘people and process’ aspects of obtaining high levels of data quality,” the letter stated.
Just Associates also provided feedback to the ONC on the importance of consistently defined and used format constraints and identified key issues that must be addressed to accurately measure algorithm performance. The suggestion was also made to align with the Children’s Hospital Association’s temporary demographic conventions for newborns to address the unique challenges with pediatric matching.
In its letter to CMS, Just Associates concurred with the suggestion that more standardized data elements be used across all appropriate programs to immediately enhance matching rates, noting that “data collection standards and their consistent application by health plans, providers and exchange organizations are critical for matching accuracy.
“Equally important,” the letter continued, “is the development of data definitions for these elements to ensure common understanding of exactly what data is being collected and in what format.”
Other advice offered by Just Associates in response to CMS suggestions included avoiding mandating the use of specific matching algorithms, data sources or software solutions, a move that would likely be premature and overly prescriptive. The firm also stated its support for implementation of a CMS-wide identifier, noting its potential to enhance accuracy and assist in duplicate record reconciliation and verification processes.
By Beth Haenke Just, MBA, RHIA, FAHIMA, founder and CEO, and Karen Proffitt, MHIIM, RHIA, CHP, vice president of industry relations/CPO, Just Associates, Inc.
Overlays can slip into a medical record system so subtly that they often go unnoticed until one causes an adverse event, HIPAA violation or billing error, making them a primary source of patient errors, expenses and lost revenues in hospitals today.
An overlay is created when the information of two patients is commingled within one medical record. The dangers of overlays have intensified with the proliferation of electronic health record (EHR) systems, which accelerate the rate at which multiple internal and external systems can be infected with dirty data. Compounding the problem is an overreliance on technology-centric solutions to resolve possible duplicates.
The American Health Information Management Association (AHIMA) puts the average duplicate rate at between 8 percent and 12 percent. A more recent survey by Black Book found an average of 18 percent. Meanwhile, an analysis of EMPI cleanup projects Just Associates completed between 2012 and 2016 showed that as many as 1.3 percent of these possible duplicates are in fact potential overlaid records.
When it comes to overlays, there are three challenges facing health information management (HIM) professionals tasked with maintaining the integrity of patient records: 1) identifying and resolving existing overlaid records, 2) determining the root cause(s) and 3) implementing policies and procedures that will prevent the creation of new ones.
The birth of an overlay
Overlays are most commonly created at the time of registration, when an incorrect patient record is selected, core demographic information is changed, and a new visit is added. Occasionally, the records of two different patients are erroneously merged during the duplicate resolution process.
Overlay creation can also be traced back to multiple departments. A study in the Journal of AHIMA involving an eight-hospital, multi-state healthcare organization found that most of the errors happened in the emergency department (ED) and, to a lesser extent, in registration, scheduling and ancillary areas such as lab and radiology.
The hospital system that was the subject of the study had been tracking and keeping detailed statistics on overlay errors for five years, beginning with the implementation of an EHR system. This provided researchers with the rare opportunity to analyze a considerable sample size of 555 errors, from which they determined an error rate of one in every 10,734 admissions. That is the equivalent of more than nine errors per month, 97.5 percent of which were caused by user oversight. The study also identified an upward trend in overlays, attributed to growth of the health system and higher utilization of error identification tools that reveal more issues than manual methods do.
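The study's figures can be sanity-checked with simple arithmetic. A minimal sketch, assuming the 555 errors accumulated over the full five-year tracking window:

```python
# Back-of-envelope check of the study's overlay statistics.
# Assumptions: 555 errors over a five-year window, one error per 10,734 admissions.
errors = 555
months = 5 * 12
rate = 1 / 10_734  # errors per admission

errors_per_month = errors / months        # 555 / 60 = 9.25, i.e. "more than nine"
implied_admissions = errors / rate        # admissions needed to produce 555 errors

print(f"{errors_per_month:.2f} errors per month")
print(f"{implied_admissions:,.0f} admissions implied over the study period")
```

The implied admission volume, roughly 5.96 million over five years, is the kind of figure that only a large multi-hospital system would generate, which is consistent with the eight-hospital organization the study describes.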
For example, 54 percent of overlays were found by registration users, while data integrity change reports that made use of EHR tools found 31 percent. Clinicians were a distant third, identifying just 6 percent of errors. Patients also found overlay errors via patient portals, which could have allowed them inappropriate access to highly sensitive protected health information (PHI), access that could lead to HIPAA violations.
Proactive EHR tools found most overlays within 10 days of their occurrence, and most were corrected within 30 days. This is important because the longer an overlay goes undetected, the less likely it is to be found, and an older overlaid record is much more time-consuming and expensive to correct.
The high cost of overlays
To determine just how costly overlays are, it is necessary to cast a wide net, as few studies have been done to establish industry averages. Factors contributing to the full financial impact of an overlay include denied and delayed claims, lost revenues and resources required to identify and correct the error.
Time is a huge factor in the costs associated with overlay correction. Correcting a paper-based overlay can take between 60 and 100 hours, while EHR-based errors can take months to resolve, depending upon system complexity. A survey by the College of Healthcare Information Management Executives (CHIME) further found that respondents typically had at least two people dedicated to “data cleansing,” including overlay correction.