Measles continues to spread in the United States as health officials seek to stem the worst outbreak of the disease in decades. More than 700 cases have now been reported, about half of them involving children under the age of five.1
James D’Arezzo, CEO of Condusiv Technologies, says, “It is the job of our healthcare database networks to map a situation like this in order to help caregivers control it.” D’Arezzo, whose company is the world leader in I/O reduction and SQL database performance, adds, “Unfortunately, some pieces of this network are missing, and a number of others don’t work very well.”
Experts in the field agree. According to a recent report by a team of scientists led by the National Institutes of Health, while analysis of data derived from electronic health records, social media and other sources has the potential to provide more timely and detailed information on infectious disease outbreaks than traditional methods, there are significant challenges to be overcome. Big data offers a “tantalizing opportunity” to predict and track infectious outbreaks, but healthcare’s ability to use it for such purposes is decades behind that of fields like climatology and marketing.2
Nonetheless, progress in data sharing has been made. State, local, and territorial health departments now have access to healthcare-associated infection data reported in their jurisdictions to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN). Thirty-three states and the District of Columbia now use NHSN for that purpose.3
However, D’Arezzo notes, this data has its origins in a multiplicity of far-flung healthcare organization IT systems. To be usable, it must be pulled together through millions of individual input-output (I/O) operations. The system’s analytic capability is dependent on the efficiency of those operations, which in turn is dependent on the efficiency of the computer’s operating environment. The most widely used operating system, Microsoft Windows, is in many ways the least efficient; the average Windows-based system pays a 30 percent to 40 percent penalty in overall throughput capability due to I/O degradation.4
As health IT continues to mature and providers continue to adopt technologies like electronic health records, the data collected through their use in the care setting has become the most compelling reason so much energy is being put into getting practices to implement the systems.
Judy Hanover, research director of IDC Health Insights, recently told me, though, that one of the biggest challenges faced by ambulatory and hospital leaders is that the data entering the electronic systems, in most cases, is unstructured, which makes it almost useless from an analytics standpoint.
Without structured data, Hanover said, quantitative analysis across a population is complicated, and little can be compared to gain an accurate picture of what’s actually taking place in the market. Analytics is greatly compromised, and the information gained can be analyzed only from a single, siloed location.
“There must be synergy between the data collected,” Hanover said. “We’re entering the period of structured data where we’re now seeing the benefits of structured data but still need to manage unstructured data.”
In many cases, critical elements of data collected — like medications, vitals, allergies and health condition — are difficult to reconcile between multiple data sources, reducing the quality of the data, she said. Unstructured data proves less useful for tracking care outcomes of a population’s health with traditional analytics.
For example, tax information and census data are collected the same way across their respective spectrums. All the fields in their respective forms are the same and can be measured against each other. This is not the case with data entering an EHR. Each practice, and even each user of the system, may collect data differently, in whatever manner is most comfortable for the person entering it. And as long as practices continue to forgo official data-entry policies requiring data to be entered according to a structured model, the quality of the data coming out will be a reflection of the information going in.
Lack of quality going in means lack of quality coming out.
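The reconciliation problem Hanover describes can be made concrete with a small sketch. The snippet below uses hypothetical free-text vitals entries (the field names and formats are illustrative assumptions, not drawn from any actual EHR) to show how inconsistent data entry forces an analytics pipeline to discard records it cannot normalize into a structured form:

```python
import re

# Hypothetical blood-pressure entries, recorded differently by different users.
raw_entries = [
    "BP 120/80",
    "blood pressure: 118 over 76",
    "120/80 mmHg",
    "pt refused",
]

# Accepts "systolic/diastolic" or "systolic over diastolic".
BP_PATTERN = re.compile(r"(\d{2,3})\s*(?:/|over)\s*(\d{2,3})")

def normalize_bp(text):
    """Map a free-text entry to a structured record, or None if unusable."""
    match = BP_PATTERN.search(text)
    if match is None:
        return None  # entry cannot be reconciled for analytics
    return {"systolic": int(match.group(1)), "diastolic": int(match.group(2))}

structured = [normalize_bp(entry) for entry in raw_entries]
usable = [record for record in structured if record is not None]
print(f"{len(usable)} of {len(raw_entries)} entries usable for analytics")
```

Even this toy normalizer silently drops the fourth entry; real clinical text is far messier, which is why unstructured entries so often fall out of population-level analysis entirely.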
“In many cases, structured data is not as useful for analytics as we’d hoped,” Hanover said. “There are inconsistencies in the fields of data being entered into the systems, and that affects data quality as well as the results of analytics.
“As we move into the post-EHR era, how we choose to leverage the data collected is what will matter,” she said. “We’ll examine cost outcomes, optimize the setting of care and view the technology’s impact.”
As a foundational technology, EHRs are enabling meaningful use, but once reform is fully in place, the focus will shift to analytics, outcomes and the benefits of the care provided.
Currently, electronic health records define healthcare, but health information exchanges (HIEs) will cause a dramatic shift in the market, leading to further automation of care delivery and changing how location-based services and clinical decision making are viewed.
Though some practices are clearly leveraging their current data, others are not. For them, EHRs are nothing more than a computer system that replaced their paper records and qualified them for incentives.
In the very near term, the technology will have to do more than simply serve as a repository for the information collected; it will become a database of reference material to be drawn upon rather than simply housed.
“Health reform is the end game,” Hanover said. “And there can be no successful reform without EHRs. They are the foundational technology for accountable care.”
The data collected in this manner will lead to a stronger accountable care model, which will once again connect the practice of care with the payment for care.
Evidence-based approaches will continue to dominate care where the data suggests certain protocols are required, which means insurers will feel they are working to control costs.
Unfortunately, all of this regulation comes at an obvious cost to the technology and its vendors, said Hanover. EHR innovation continues to suffer under the aggressive push for reform through meaningful use, as vendors scramble to keep up with requirements.
“There’s little or no innovation because all of the vendors are being hemmed in by meaningful use and certification requirements,” she said.
Product standardization means there are far fewer products that actually stand out in the market.
More innovation will likely come only after market consolidation, in which only the strong will survive. Hanover suggests that in this scenario, survivors will focus on innovative product research and development and will take a leadership role in moving the market forward.
Though vendors will suffer, users of the systems will likely face major setbacks and upheavals as the market shifts and settles. Especially as consolidation occurs and suppliers disappear or change ownership, the practices and physicians using these systems face the toughest road: they’ll be forced to find new solutions to meet their needs, learn those systems, and try to get back to where they were in a relatively short period of time.
Deciding which system to implement may well bear just as much weight as deciding how to use it.