Hardly a day goes by without some new revelation of a US IT mess, an endless round of the old radio show joke contest "Can You Top This," except that increasingly the joke is on us. The stories range from nuclear weapons systems updated with floppy disks to needless medical deaths, many of which are still caused by preventable interoperability and communication errors, as has been the case for decades.
In a report released to Congress, the Government Accountability Office (GAO) found that the US government last year spent 75 percent of its technology budget maintaining aging computer systems, including some that still rely on floppy disks and one system for US nuclear forces that is more than 50 years old. A previous GAO report is equally alarming: it concerns the healthcare of millions of Americans and could be the smoking gun behind a British Medical Journal study citing medical errors as the third leading cause of death in the United States, after heart disease and cancer.
The GAO interoperability report, requested by Congressional leaders, described the status of efforts to develop infrastructure that could lead to nationwide interoperability of health information. It found a variety of efforts underway to facilitate interoperability, but most remain "works in progress." The GAO also identified five barriers to interoperability:
Insufficiencies in health data standards
Variation in state privacy rules
Difficulty in accurately matching all the right records to the right patient
The costs involved in achieving the goals
The need for governance and trust among entities to facilitate sharing health information
CMS Pushing for “Plug and Play” Interoperability Tools that Already Exist
Meanwhile, in a meeting with the Massachusetts Medical Society, Andy Slavitt, Acting Administrator of the Centers for Medicare & Medicaid Services (CMS), acknowledged of the CMS interoperability effort that "we are not sending a man to the moon."
"We are actually expecting (healthcare) technology to do the things that it already does for us every day. So there must be other reasons why technology and information aren't flowing in ways that match patient care," Slavitt stated. "Partly, I believe some of the reasons are actually due to bad business practices. But I think some of the technology will improve through the better use of standards and compliance. And I think we'll make significant progress through the implementation of APIs in the next version of electronic health records (EHRs), which will spur innovation by allowing for plug-and-play capability. The private sector has to essentially change or evolve their business practices so that they don't subvert this intent. If you are a customer of a piece of technology that doesn't do what you want, it's time to raise your voice."
He says CMS has "very few higher priorities" other than interoperability. It is also striking that two different government entities point their fingers at interoperability even though "plug and play" API solutions have been available through middleware integration for years, the same approach used successfully in the retail, banking and hospitality industries. As a sign of growing healthcare middleware popularity, Black Book Research recently named the top ten middleware providers as Zoeticx, HealthMark, Arcadia Healthcare Solutions, Extension Healthcare, Solace Systems, Oracle, Catavolt, Microsoft, SAP and Kidozen.
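Concretely, the plug-and-play capability Slavitt describes is what middleware vendors sell: an adapter layer that normalizes each vendor's proprietary record format into one common schema, so downstream applications never touch vendor-specific code. A minimal sketch of the pattern (the vendor names, field names, and record layouts are all hypothetical, invented for illustration):

```python
# Middleware-style adapter sketch: normalize records from two hypothetical
# EHR vendors into one common schema so downstream code can "plug and play."

def from_vendor_a(record):
    # "Vendor A" (hypothetical) uses flat, abbreviated keys.
    return {
        "patient_id": record["pid"],
        "family_name": record["lname"],
        "given_name": record["fname"],
        "birth_date": record["dob"],
    }

def from_vendor_b(record):
    # "Vendor B" (hypothetical) nests demographics under "demo".
    demo = record["demo"]
    return {
        "patient_id": record["id"],
        "family_name": demo["name"]["family"],
        "given_name": demo["name"]["given"],
        "birth_date": demo["birthDate"],
    }

# One registry of adapters; adding a vendor means adding one function,
# not rewriting every consumer of the data.
ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalize(source, record):
    """Route a raw record through the adapter for its source system."""
    return ADAPTERS[source](record)
```

The design point is that each proprietary format is handled exactly once, at the edge, which is what lets retail and banking systems integrate new partners without bespoke projects.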
Medical Errors Third Leading Cause of Death in US
The British Medical Journal recently reported that medical error is the third leading cause of death in the United States, after heart disease and cancer. As such, medical errors should be a top priority for research and resources, say authors Martin Makary, MD, MPH, professor of surgery, and research fellow Michael Daniel, from Johns Hopkins University School of Medicine. However, accurate, transparent information about errors is not captured on death certificates, the documents the Centers for Disease Control and Prevention (CDC) uses to rank causes of death and set health priorities. Death certificates depend on International Classification of Diseases (ICD) codes for cause of death, but causes such as human and EHR errors are not recorded on them.
According to the World Health Organization (WHO), 117 countries code their mortality statistics using the ICD system. The authors call for better reporting to help capture the scale of the problem and create strategies for reducing it. "Top-ranked causes of death as reported by the CDC inform our country's research funding and public health priorities," says Makary in a press release. "Right now, cancer and heart disease get a ton of attention, but since medical errors don't appear on the list, the problem doesn't get the funding and attention it deserves. It boils down to people dying from the care that they receive rather than the disease for which they are seeking care."
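The mechanism Makary describes is easy to illustrate: mortality rankings are essentially tallies of the underlying-cause ICD code on each death certificate, so any cause that never receives a code never appears in the ranking. A toy sketch (the codes shown are real ICD-10 categories, such as I21 for acute myocardial infarction and C34 for lung cancer, but the certificate data is invented):

```python
from collections import Counter

# Each death certificate carries one underlying-cause ICD-10 code.
# I21 = acute myocardial infarction, C34 = lung cancer, C50 = breast cancer.
# The certificates themselves are invented for illustration.
certificates = ["I21", "C34", "I21", "C34", "I21", "C50"]

def rank_causes(codes):
    """Rank causes of death by certificate count, the way CDC tables do."""
    return Counter(codes).most_common()

# A death actually caused by a medication or EHR error is still coded to
# the disease being treated, so "medical error" never enters this tally,
# and therefore never appears in the resulting ranking.
```

This is why the authors argue for changing what is captured on the certificate rather than how the tally is computed.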
The Root Cause of Many Patient Errors
Better coding and reporting is a no-brainer and should be required to get to the bottom of these errors so they can be identified and resolved. But beyond unreported causes of death, other roadblocks contribute to this frighteningly sad statistic, chief among them the lack of EHR interoperability. The vast majority of medical devices, EHRs and other healthcare IT components lack interoperability, meaning they have no built-in or integrated way to exchange information across vendors, settings and device types.
Various systems and equipment are typically purchased from different manufacturers, each with its own proprietary interface technology, reminiscent of the days before client and server ever met. Hospitals often must invest in separate systems just to pull these disparate pieces of technology together and feed data from bedside devices to EHR systems, data warehouses, and other applications that aid clinical decision making, research and analytics. Many bedside devices, especially older ones, don't connect at all and require manual reading and data entry.
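Much of that device-to-EHR plumbing still speaks HL7 v2, a pipe-delimited message format in which each interface partner tends to populate fields slightly differently, which is one reason every new connection becomes a custom project. A minimal parse of a PID (patient identification) segment shows the format; the field positions follow the HL7 v2 standard, while the example message itself is invented:

```python
# HL7 v2 messages are lines of text ("segments"); fields within a segment
# are split on "|", and components within a field on "^".
# The PID segment carries patient identification.

def parse_pid(segment):
    fields = segment.split("|")
    if fields[0] != "PID":
        raise ValueError("not a PID segment")
    family, given = fields[5].split("^")[:2]  # PID-5: patient name
    return {
        "patient_id": fields[3],   # PID-3: patient identifier list
        "family_name": family,
        "given_name": given,
        "birth_date": fields[7],   # PID-7: date of birth (YYYYMMDD)
    }

# Invented example message for illustration:
pid = parse_pid("PID|1||12345||Doe^Jane||19700101|F")
```

The parsing is trivial; the interoperability problem is that two systems rarely agree on which optional fields are filled in and how, so each interface must be negotiated by hand.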
Healthcare providers are sometimes forced to hold various pieces of information in their heads to draw conclusions, which is time consuming and error-prone. This cognitive load, especially in high-stress situations, increases the risk of errors such as accessing the wrong patient's information, performing the wrong action or placing the wrong order. Because information can be entered into various areas of the EHR, it can be duplicated or omitted. Physicians are often presented with a list of documentation spread across different folders that can run many screens long, and information can be missed.
The nation's largest health systems employ thousands of people dedicated to dealing with "non-interoperability." The abundance of proprietary protocols and interfaces that restrict healthcare data exchange takes a huge toll on productivity. Beyond the technical shortcomings of EHRs themselves, tactics such as data blocking and hospital IT contracts that prevent data sharing by EHR vendors are also used to block interoperability. Healthcare overall has experienced negative productivity in this area over the past decade.
Guest post by Alexandra Roden, content editor, Connexica.
Just a few years ago, big data and the Internet of Things (IoT) were terms generally unheard of. This year they continue to revolutionize technology and the ways in which we acquire and process data, but what do they mean for the healthcare industry?
Xenon Health describes IoT as "a phenomenon through which the operational aspects of the physical world become increasingly integrated with digital platforms, enabling information to move seamlessly toward the computational resources that are able to make sense of it." Essentially, IoT goes hand-in-hand with the mobile age and the diversity of data that is currently being retrieved from agile and mobile locations.
Big data is a related concept: it addresses the ever-increasing amounts of data created every second of every day and recognizes that these figures will only continue to grow. In the "social media minute," for example, 277,000 tweets are sent, WhatsApp users share 347,222 photos and Google receives more than 4,000,000 search queries. These figures are remarkable even for those of us caught up in the social media hype, and most striking of all is that the global Internet population now numbers 2.4 billion people. That's a lot of people creating a lot of data; the question now is how we can utilize this data in a meaningful way.
IoT has revolutionized many industries and will continue to do so in the foreseeable future, but what about healthcare? Organisations within this industry tend to adopt new technologies slowly, relying upon solid evidence and demonstrable impact and efficiency before committing to any such change. The shift towards IoT is, however, beginning to take place, and increasing amounts of available patient data are beginning to inform decision making processes within this sector.
By Darin M. Vercillo, MD, chief medical officer and co-founder, Central Logic.
Healthcare has been changing rapidly for the last 60 years and advances have now reached record speed, including in the realm of data intelligence. In trying to keep pace as well as to protect and advance their own businesses, many processes and systems have understandably been organized into silos. That era must come to a close.
Care coordination teams need rich data collaboration and must now be connected. Hospitals, clinics, home health care workers, primary care physicians, vendors, and others must speak with each other, in the same language, and completely share patient data with an open, collaborative attitude. The industry is all abuzz with this uncharted territory called interoperability. It is clear that data warehouses, now bursting with valuable information, must be streamlined for three very simple reasons: patient safety, cost-effective healthcare delivery and overall population health management. A happy byproduct when data intelligence becomes actionable and systems work collaboratively is a financial benefit, but as a physician, I believe excellent patient care always wins the day, and should be the driving factor.
At the risk of this being looked at as “just a financial issue,” consider also that hospitalization is generally a marker for severe illness. Our goal is a healthier population. As we (patients and providers) succeed collectively with hospital treatment and post-acute care, then re-admissions will naturally decrease, and patients will live healthier, more satisfied, lives. Ultimately, this is our goal.
Appropriate, timely sharing of vital patient information will not only address readmission rates that have clearly become egregious; improved data collaboration is also needed to better inform decision making at the point of care. Without a keen eye to patient safety and success, it is too easy for details to slip through the cracks. All too often, history has demonstrated that hand-off points are the riskiest for failures in patient care.
Nearly everyone has a story where the current system has failed patients — just ask Jennifer Holmes, our CEO. Her father's healthcare team made a medication error that ultimately cost him his life. Similar medication errors can be avoided, and duplicate testing reduced, when a patient's entire care coordination team has visibility into the data, all the data, to improve care efficiencies and diagnoses.
But all this sharing and playing nice in the sandbox is easier said than done.
The lack of EHR interoperability continues to pose a serious threat to healthcare initiatives, according to a recent report published by the American Hospital Association (AHA). The report discusses the various aspects of the healthcare industry and care delivery that are negatively impacted by a lack of interoperability.
The report notes that the exchange of health information is critical to care coordination. When patients receive care from multiple providers, each physician should be able to securely send relevant patient information to the treating physician. That tends not to happen, however, because EHR systems are not interoperable and cannot exchange information.
Last year, the ECRI Institute released a survey outlining the Top Ten Safety Concerns for Healthcare Organizations in 2015. The second-highest concern was incorrect or missing data in EHRs and other health IT systems caused by interoperability failures. It was the second year in a row that EHR data was identified as a concern.
The Partnership for Health IT Patient Safety, a branch of the ECRI Institute, has released safe practice recommendations for the copy-and-paste function in EHRs, whose misuse can adversely affect patient safety, for example by overpopulating records with data and making relevant information difficult to locate, according to the partnership's announcement.
Meanwhile, a survey of 68 accountable care organizations conducted by Premier, Inc. and the eHealth Initiative found that despite steep investments in health information technology, they still face interoperability challenges that make it difficult to integrate data across the healthcare continuum.
The survey found that integrating data from out-of-network providers was the top HIT challenge for ACOs, cited by almost 80 percent of respondents. Nearly 70 percent reported high levels of difficulty integrating data from specialists, particularly those that are out-of-network.
User Frustration Over Lack of HIE and Interoperability Standards
The Office of the National Coordinator for Health Information Technology (ONC) is once again asking the healthcare community for its thoughts on establishing metrics to determine whether, and to what extent, electronic health records are interoperable. The push to achieve interoperability responds to last year's mandate from Congress, contained in the Medicare Access and CHIP Reauthorization Act (MACRA). Among the law's provisions is a requirement to achieve "widespread" interoperability of health information by the end of 2018.
When it comes to how Health Information Exchanges (HIEs) handle the challenges associated with interoperability, a recent Government Accountability Office (GAO) report cites the following barriers: insufficient health data standards, variations in state privacy rules, difficulty in accurately matching the right records to the right patient, the costs and resources necessary to achieve interoperability goals, and the need for governance and trust among entities to facilitate sharing health information.
In its annual interoperability survey of hospital and health system executives, physician administrators and payer organization IT leaders released in April 2016, Black Book Research found growing HIE user frustration over the lack of standardization and readiness of unprepared providers and payers.
Of hospitals and hospital systems, 63 percent report they are in the active stages of replacing their current HIE system while nearly 94 percent of payers surveyed intend to totally abandon their involvement with public HIEs. Focused, private HIEs also mitigate the absence of a reliable Master Patient Index (MPI) and the continued lack of trust in the accuracy of current records exchange.
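The MPI problem mentioned above is concrete: without a reliable way to decide that two records describe the same person, exchanged data cannot be trusted. A toy deterministic matching key illustrates the idea and its limits; real MPIs use probabilistic matching across many more fields, and the patient data here is invented:

```python
# A toy deterministic patient-matching key of the kind a Master Patient
# Index improves upon. Normalizing name casing and whitespace catches
# simple formatting differences between systems, but not typos, nicknames,
# or name changes, which is why production MPIs match probabilistically.

def match_key(family, given, birth_date):
    """Build a normalized identity key from demographic fields."""
    return (family.strip().lower(), given.strip().lower(), birth_date)

def same_patient(record_a, record_b):
    """True when two (family, given, birth_date) tuples normalize equal."""
    return match_key(*record_a) == match_key(*record_b)
```

Even this crude key shows why exchange without a shared index fails: any system that records "Janet" where another recorded "Jane" silently splits one patient into two.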
Seventy-nine percent of providers viewed public HIEs and EHR-dependent HIEs as disenfranchising payers from data exchange efforts and as failing to treat payers as partners with their own distinct data needs and revenue models. Progressive payers are moving rapidly into the pay-for-value new world order and require extensive data analytics capabilities and interoperability to launch accountable care initiatives.
Those looking at touted standards such as Fast Healthcare Interoperability Resources (FHIR) point out that it is only capable of connecting one medical facility to another, and requires specific endpoint interfaces even to do that. For every additional facility, a customized interface must be built. At the end of the day, FHIR in practice is a point-to-point customized interface that requires extra steps, ties developers to specific hospitals or EHRs, and offers no universal access.
“Progressive FHIR standards can allow EHRs to talk to other EHRs should standard definitions develop on enough actionable data points as we enter a hectic period of HIE replacements, centering on the capabilities of open network alliances, mobile EHR, middleware and population health analytics as possible answers to standard HIE,” said Doug Brown, managing partner of Black Book.
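For all the caveats above, FHIR does at least standardize the data format: resources are plain JSON with defined element names, so once a connection exists, reading the payload is straightforward. A sketch of consuming a FHIR Patient resource (the resource structure follows the FHIR specification's JSON layout; the patient values are invented):

```python
import json

# A FHIR Patient resource, laid out per the FHIR specification.
# The values themselves are invented for illustration.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Doe", "given": ["Jane"]}],
  "birthDate": "1970-01-01"
}
"""

def summarize_patient(raw):
    """Pull a few standard elements out of a FHIR Patient resource."""
    resource = json.loads(raw)
    if resource["resourceType"] != "Patient":
        raise ValueError("expected a Patient resource")
    name = resource["name"][0]
    return {
        "id": resource["id"],
        "name": f'{name["given"][0]} {name["family"]}',
        "birth_date": resource["birthDate"],
    }
```

The article's complaint, in other words, is not about the payload format but about connectivity: each pair of endpoints must still be wired together individually.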
Guest post by Richard A. Royer, MBA, chief executive officer, Primaris.
It has been several years since Medicare began introducing payment changes aimed at driving the healthcare industry away from volume-based payments and toward value-based reimbursements. One of the main purposes of the payment system’s overhaul is to improve the quality of care that healthcare providers deliver to patients. Of course, the other main goal is to keep costs in check. In simple terms, the shift to value-based incentives rewards providers that deliver on cost, quality and patient outcome measures. What many providers have learned along the way is that technology plays an important role in the transition to value-based care, and meaningful use of electronic health records is necessary for success under value-based incentive programs.
Value-Based Payment Basics
For healthcare providers that are working to adapt to new payment models and are just beginning to make adjustments, understanding the basics of value-based care is the first step to success. Some of the key points healthcare providers need to recognize about value-based reimbursements are:
The value model rewards performance. That can mean a number of different things, for example, achieving high quality and patient satisfaction scores or making improvements to care over time. The point is, providers must focus on meeting certain standards for care and cost in order to be eligible to earn financial incentives and to avoid penalties.
Value-based care models are extremely data driven. Providers need to measure and report performance outcomes in order to assess their efforts internally, and also so they can earn reimbursements from external payers. As a result, healthcare providers need to continuously measure and analyze patient data, not just collect it.
Collaboration is an important success factor under value. Patients – particularly those with chronic health conditions – receive care from multiple providers as they move across the care continuum. To ensure that treatments, medications, and care plans are safe and effective, and that patient outcomes (which impact reimbursements) are the best they can be, providers need to communicate with each other and work to coordinate care. Value-based programs demand coordinated care.
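As a concrete instance of the "data driven" point above, many value-based measures reduce to simple rates computed over patient-level data, such as a 30-day readmission rate. A minimal sketch (the encounter data is invented, and real measure specifications add exclusions and risk adjustment):

```python
from datetime import date

# Invented discharge/readmission pairs: (discharge_date, readmit_date or None).
encounters = [
    (date(2016, 1, 10), date(2016, 1, 25)),  # readmitted within 30 days
    (date(2016, 2, 1), None),                # no readmission
    (date(2016, 3, 5), date(2016, 5, 1)),    # readmitted, but after 30 days
    (date(2016, 4, 2), date(2016, 4, 20)),   # readmitted within 30 days
]

def readmission_rate(pairs, window_days=30):
    """Share of discharges followed by a readmission within the window."""
    hits = sum(
        1 for discharged, readmit in pairs
        if readmit is not None and (readmit - discharged).days <= window_days
    )
    return hits / len(pairs)
```

The measurement itself is trivial; the hard part, as the section notes, is continuously collecting accurate discharge and readmission data across every provider a patient touches.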
Guest post by Michael Leonard, director of product management, healthcare, Commvault.
Once a year, the healthcare community gathers to discuss the hottest healthcare trends. This year, the event took place in Sin City, and the turnout was staggering. Topics of choice at the show ranged from EHR best practices to the rising need for telehealth services.
Now that I’ve had a chance to step back and digest, there are a few key moments that jumped out from the event. Here are my top two:
The HIMSS survey showed healthcare organizations are ready for telehealth.
During the show, HIMSS released a survey that had some exciting results around connected technology in the healthcare field. The results showed that 52 percent of hospitals are currently using three or more connected health technologies. Technologies being used by that group that stood out to me include mobile optimized patient portals (58 percent), remote patient monitoring (37 percent) and patient generated health data (32 percent). It’s fascinating to see these results, and important for healthcare and health IT professionals to know that the telehealth wave is here to stay.
The U.S. Department of Health and Human Services’ (HHS) made a key interoperability announcement.
At the show, HHS Secretary Sylvia M. Burwell made a major announcement around interoperability that was backed by the majority of the top electronic health record (EHR) vendors and is supported by many of the leading providers. This news will enhance healthcare services and allow doctors and patients to make better-informed decisions. It certainly has the potential to catapult the industry forward, allowing healthcare partners to increase accessibility by improving their clinical data management solutions.
As always, the conversation at HIMSS was engaging and educational, and I left with some great takeaways and predictions for the future of health IT.
Guest post by Jeff Kaplan, chief strategy officer, ZirMed.
The 40,000 healthcare and healthcare IT professionals who gathered at the Sands Expo in Vegas brought a different vibe to HIMSS 2016. The halls buzzed with activity and an overall optimism that belied any of the potential causes for uncertainty—politics, a down stock market, increases in uncompensated care, the movement toward fee-for-value, or the staggering shift in patient responsibility.
For those who attended HIMSS 2015 in Chicago, the difference was visible in vendor messaging and audible in conversations during the conference. Among all attendees the optimism seemed well founded, grounded in reality. We all see significant opportunity to drive improvement in healthcare for our generation and generations to come. That’s why we came to HIMSS – we’ve placed our bets.
In that spirit, let’s talk about where healthcare is doubling down, where it’s hit a perfect blackjack, and which trends pushed as providers look for the next deal.
Double Down – Data Interoperability
Out of the gate at HIMSS 2016, there was increased focus and emphasis on the importance of data interoperability and integration. From booth signage to the increase in dedicated vendors to industry veterans evangelizing on the topic, you couldn't miss the tells from all players—everyone wants to show they have a strong hand when it comes to interoperability. Epic's Judy Faulkner made a play that Epic wasn't just the leader of the interoperability movement, but in fact the originator (see her interview with Healthcare IT News).
Of course, wander off into other parts of the exhibition hall and it wasn’t long before you heard the all-too-familiar complaints about closed-system platforms – that they limit innovation by outside companies and technologists who can build applications to add additional value. In the era of Salesforce.com and other open platform successes, many HIMSS attendees spoke of their hope that companies like Cerner and Epic will follow suit.
Blackjack – Data Analytics
Over the last year, vendors heard providers loud and clear: healthcare providers need hard ROI on any new initiative, especially as many carry EHR/HCIS sunk costs in the tens of millions of dollars. They need a sure thing—and the changes evident at HIMSS 2016 reflected that shift. Buzzwords like "Big Data" thankfully fell by the wayside, replaced by meaningful discussion of data analytics and data warehousing. Providers know they've amassed a wealth of clinical and financial data—now they're looking for ways to increase the quality of patient care while driving down costs.
With the yearly bluster and promise of HIMSS, I still find there have been few strides in solving interoperability. Many speakers extol the next big thing in healthcare system connectivity, large EHR vendors swear their one size fits all, and with the wave of a video demo, interoperability is declared cured. Long live proprietary solutions; down with system integration and collaboration. Healthcare IT is reborn into the latest vendor initiative, at a cost of billions of dollars and who knows how many thousands of lives.
Physicians’ satisfaction with electronic health record (EHR) systems has declined by nearly 30 percentage points over the last five years, according to a 2015 survey of 940 physicians conducted by the American Medical Association (AMA) and American EHR Partners. The survey found 34 percent of respondents said they were satisfied or very satisfied with their EHR systems, compared with 61 percent of respondents in a similar survey conducted five years ago.
Specifically, the survey found:
42 percent of respondents said it was difficult or very difficult for their EHR system to improve efficiency;
43 percent of respondents said they were still addressing productivity challenges related to their EHR system;
54 percent of respondents said their EHR system increased total operating costs; and
72 percent of respondents said it was difficult or very difficult for their EHR system to decrease their workload.
Whether in the presidential election campaign or at HIMSS, outside the convention center hype our abilities are confined by real-world facts. Widespread implementation of EHRs has been driven by physician and hospital incentives from the HITECH Act, with the laudable goals of improving quality, reducing costs, and engaging patients in their healthcare decisions. All of these goals depend on readily available access to patient information.
Whether access is required by a health professional or by a computer algorithm generating alerts about data, potential adverse events, medication interactions or routine health screenings, healthcare systems have been designed to connect various health data stores. The design and connection of those databases can become the limiting factor for patient safety, efficiency and user experience in EHR systems.
Healthcare, with the increasing amount of data being collected to manage individuals as well as patient populations, is a complex and evolving specialty of medicine. The health information systems used to manage the flow of patient data add further complexity, with no one system or implementation being the single best solution for any given physician or hospital. Even within the same EHR, implementation decisions shape how professional workflow and care delivery are restructured to meet the constraints and demands of these data systems.
Physicians and nurses have long encountered the limitations and barriers EHRs bring to the trenches of clinical care. Cumbersome interfaces, limited choices for data entry and implementation decisions have increased clinical workloads and introduced numerous warnings that can lead to alert fatigue. Concerns have also been raised for patient safety when critical patient information cannot be located in a timely fashion.
Solving these challenges and developing expansive solutions to improve healthcare delivery, quality and efficiency depends on accessing and connecting data that resides in numerous, often disconnected health data systems, whether located within a single office or spanning geographically distributed care locations, including patients' homes. As reimbursement shifts from a pay-for-procedure to a pay-for-performance model, understanding technical solutions and their implementation directly affects quality, finances, engagement and patient satisfaction.
Guest post by Adam Hawkins, vice president client services, CynergisTek.
HIMSS 2016 is right around the corner, and I’m sure everyone is excited about the prospects of conferencing in Las Vegas. This location certainly has a lot going on to keep everyone busy, on and off the exhibit floor. There should be many new healthcare technology players to see and learn about, and it is always interesting to visit the innovation area. Hopefully, we’ll get to hear what folks like KLAS, HIMSS Analytics and other research organizations are working on in 2016 as well.
For instance, KLAS is continuing its work toward including security vendors as its own category, and has a new study underway to look at service providers in this space. That study won’t be completed in time for HIMSS, but they should be able to preview what they hope to accomplish with the study and what its report will include. I think it will be an important read for everyone in our industry.
Interoperability is a huge area of concentration in healthcare at the moment, with the Office of the National Coordinator, Health & Human Services and HIMSS all deeply involved in the discussion. There are sure to be several presentations on this and related topics. Hopefully we will hear how security and privacy will be addressed, as they are critical components of the many health initiatives that rely heavily on interoperability for success.
Guest post by Linda Lockwood, solutions director and service line owner, health solutions, CTG.
With HIMSS 2016 fast approaching, the hunt for the perfect Population Health tool will be underway. Whether you’re a HIMSS veteran or a first-time attendee, expect to be caught in a jungle of vendors, each promising the latest and greatest Population Health tools.
HIMSS seems to grow each year, and with so many vendors, solutions and offerings, and the buzz happening during the event, it can be a challenge to carefully evaluate Population Health tools to help inform a decision.
HIMSS can make you excited for the future of your organization, but can also be overwhelming with so many Population Health options to consider. These six tips can help you separate fact from fiction and select a tool that best meets the population health needs of your organization:
Identify organizational goals for population health and match your tool choice to those goals: It's important to understand your organizational goals, as they will drive the selection of tools. If you have not entered into risk-bearing agreements but want to be prepared, you may want to start with a tool that supports development of registries and profiles physician performance. You will also want to identify your high-risk, high-cost patients and be sure you can track this performance over time. This information may be available from your financial systems, but you also need the ability to drill down to the device and supply level, including the use of medications and supplies such as blood products, to identify opportunities for improvement.
How does joining an ACO impact your decision? If you plan to join an ACO, your needs may include the ability to perform Care Management, Care Coordination and Patient Engagement. You will want to be sure there is interoperability between the hospital, physician offices and care managers, as well as the payers. Reporting becomes critical with an ACO, as certain metrics must be reported on a regular basis. As you evaluate tools, ask whether they have pre-built reports that include the standard measures an MSSP and CMS require.
Think about mergers and acquisitions: If you are in the process of a merger or are acquiring physicians, ensure that whatever tool you choose can aggregate data from multiple EHRs, and formulate a plan to support interoperability for sharing and exchanging key data. If you are self-insured, your organization will have access to data about your population. If you are focusing on wellness and prevention, you will want tools that support patient engagement, health and wellness. Alternatively, if you have high-risk patients, you need Population Health tools that support care coordination, outreach, pharmacy and lab adherence, and wellness reminders.
Make data quality a priority: The ability to have accurate, reliable data is crucial with any Population Health or reporting tool. If a data governance system is in place, it’s important to understand what source data you will need to populate the tool. Be sure you know where key data is entered in the system and the common values for that data. In tandem with this, the organization should identify data stewards and business owners. Data governance must have organization-wide commitment, and business owners who are actively engaged.