Despite the yearly bluster and promise of HIMSS, I still find there have been few strides in solving interoperability. Many speakers will extol the next big thing in healthcare system connectivity, large EHR vendors will swear their one size fits all and, with the wave of a video demo, interoperability is declared cured. Long live proprietary solutions; down with system integration and collaboration. Healthcare IT, reborn into the latest vendor initiative, costing billions of dollars and who knows how many thousands of lives.
Physicians’ satisfaction with electronic health record (EHR) systems has declined by nearly 30 percentage points over the last five years, according to a 2015 survey of 940 physicians conducted by the American Medical Association (AMA) and American EHR Partners. The survey found 34 percent of respondents said they were satisfied or very satisfied with their EHR systems, compared with 61 percent of respondents in a similar survey conducted five years ago.
Specifically, the survey found:
42 percent of respondents described their EHR system’s ability to improve efficiency as difficult or very difficult;
43 percent of respondents said they were still addressing productivity challenges related to their EHR system;
54 percent of respondents said their EHR system increased total operating costs; and
72 percent of respondents described their EHR system’s ability to decrease workload as difficult or very difficult.
Whether in the presidential election campaign or at HIMSS, outside of the convention center hype, our abilities are confined by real world facts. Widespread implementation of EHRs has been driven by physician and hospital incentives from the HITECH Act with the laudable goals of improving quality, reducing costs, and engaging patients in their healthcare decisions. All of these goals are dependent on readily available access to patient information.
Whether the access is required by a health professional or a computer's algorithm generating alerts concerning data, potential adverse events, medication interactions or routine health screenings, healthcare systems have been designed to connect various health data stores. The design and connection of various databases can become the limiting factor for patient safety, efficiency and user experiences in EHR systems.
Healthcare, and the increasing amount of data being collected to manage the individual, as well as patient populations, is a complex and evolving specialty of medicine. The health information systems used to manage the flow of patient data add additional complexity, with no one system or implementation being the single best solution for any given physician or hospital. Even within the same EHR, implementation decisions impact how healthcare professional workflow and care delivery are restructured to meet the constraints and demands of these data systems.
Physicians and nurses have long uncovered the limitations and barriers EHRs have brought to the trenches of clinical care. Cumbersome interfaces, limited choices for data entry and implementation decisions have increased clinical workloads and added numerous additional warnings which can lead to alert fatigue. Concerns have also been raised for patient safety when critical patient information cannot be located in a timely fashion.
Solving these challenges and developing expansive solutions to improve healthcare delivery, quality and efficiency depends on accessing and connecting data that resides in numerous, often disconnected health data systems located within a single office or spanning across geographically distributed care locations including patients’ homes. With changes in reimbursement from a pay for procedure to a pay for performance model, an understanding of technical solutions and their implementation impacts quality, finances, engagement and patient satisfaction.
Guest post by Adam Hawkins, vice president client services, CynergisTek.
HIMSS 2016 is right around the corner, and I’m sure everyone is excited about the prospects of conferencing in Las Vegas. This location certainly has a lot going on to keep everyone busy, on and off the exhibit floor. There should be many new healthcare technology players to see and learn about, and it is always interesting to visit the innovation area. Hopefully, we’ll get to hear what folks like KLAS, HIMSS Analytics and other research organizations are working on in 2016 as well.
For instance, KLAS is continuing its work toward including security vendors as its own category, and has a new study underway to look at service providers in this space. That study won’t be completed in time for HIMSS, but they should be able to preview what they hope to accomplish with the study and what its report will include. I think it will be an important read for everyone in our industry.
Interoperability is a huge area of concentration in healthcare at the moment with the Office of the National Coordinator, Health & Human Services and HIMSS all very much involved in this discussion. There are sure to be several presentations on this and related topics. Hopefully we will hear how security and privacy will be addressed, as they are critical components of making many of our health initiatives successful and rely heavily on interoperability for success.
Guest post by Linda Lockwood, solutions director and service line owner, health solutions, CTG.
With HIMSS 2016 fast approaching, the hunt for the perfect Population Health tool will be underway. Whether you’re a HIMSS veteran or a first-time attendee, expect to be caught in a jungle of vendors, each promising the latest and greatest Population Health tools.
HIMSS seems to grow each year, and with so many vendors, solutions and offerings, and the buzz happening during the event, it can be a challenge to carefully evaluate Population Health tools to help inform a decision.
HIMSS can make you excited for the future of your organization, but can also be overwhelming with so many Population Health options to consider. These six tips can help you separate fact from fiction and select a tool that best meets the population health needs of your organization:
Identify organizational goals for population health and match your tool choice to those goals: It’s important to understand what your organizational goals are, as they will drive the selection of tools. If you have not entered into risk-bearing agreements, but want to be prepared, perhaps you may want to start off with a tool that supports development of registries and profiles physician performance. You will also want to identify your high-risk, high-cost patients, and be sure you have the ability to track this performance over time. This information may be available from your financial systems, but you also will need to have the ability to drill down to the device and supply level—as well as use of medications and supplies including blood products—to identify opportunities for improvement.
How does joining an ACO impact your decision? If you have plans to join an ACO, your needs may include the ability to perform care management, care coordination and patient engagement. You will want to be sure that there is interoperability between the hospital, physician offices and care managers as well as the payers. Reporting becomes critical with an ACO, as certain metrics must be reported on a regular basis. As you evaluate tools, ask if they have pre-built reports that include some of the standard measures that an MSSP and CMS require.
Think about mergers and acquisitions: If you are in the process of a merger or acquiring physicians, you must ensure whatever tool you include has the ability to aggregate data from multiple EHRs, and formulate a plan to support interoperability for sharing and exchanging key data. If you are self-insured, your organization will have access to data about your population. If you are focusing on wellness and prevention, you will want tools to support patient engagement, health and wellness. Alternately, if you have high-risk patients, you require Population Health tools to support care coordination, outreach, pharmacy and lab adherence and wellness reminders.
Make data quality a priority: The ability to have accurate, reliable data is crucial with any Population Health or reporting tool. If a data governance system is in place, it’s important to understand what source data you will need to populate the tool. Be sure you know where key data is entered in the system and the common values for that data. In tandem with this, the organization should identify data stewards and business owners. Data governance must have organization-wide commitment, and business owners who are actively engaged.
Guest post by Drew Ivan, director of business technology, Orion Health.
With such an enormous cross-section of the healthcare industry in attendance, the HIMSS Conference and Exhibition represents a comprehensive snapshot of the state of the healthcare industry and a perfect trendspotting opportunity. Here’s a preview of what I expect will be this year’s conference highlights.
Care coordination and population health is tied with process improvement, workflow and change management as the most popular category, with 29 educational sessions focused on each.
Representing 22 percent of the total number of sessions, this is clearly a focus area for the year’s conference, and it’s easy to see why. Changes in healthcare payment models are now well underway, and they are impacting payer and provider operations where healthcare is delivered, managed and documented.
Providers and payers alike are seeking information about how best to operationalize business processes and provide high quality care under new payment models, but it may be even more interesting to visit the Exhibition Hall to see what innovations vendors are bringing to the market to meet these needs.
Another topic related to changes in healthcare delivery is clinical informatics and clinician engagement, which is all about how new technologies, such as big data and precision medicine, can impact care decisions. The ability to make data-driven clinical decisions is one of the many dividends of widely adopted electronic health records. This is likely to be an important area for many years to come.
With 100 million medical records hacked last year, privacy and security is a hot topic at this year’s conference. The number of educational sessions in this category nearly doubled from 13 last year to 25 this year.
While preventing unauthorized access to records is the top priority, security will be a simpler problem to solve than privacy. As more sources of clinical data go from paper to electronic systems and more types of users have legitimate access to patient data, the problem of providing appropriate, fine-grained access in accordance with patient preferences, clinical settings and laws that differ across jurisdictions becomes very difficult to untangle.
Privacy and security concerns will need to be addressed with a combination of open standards and vendor products that implement them. Technologies from other industries, like banking, are likely to start making their way into healthcare.
This year, health information exchange (HIE) and interoperability educational sessions are combined into a single category, reflecting the fact that interoperability within a single institution is, at this point, more or less a solved problem. The next frontier is to enable interoperability across institutions to support improved transitions of care.
HIEs have a role to play when it comes to moving data between organizations; however, many HIEs are struggling or disappearing because of sustainability challenges. This year’s conference will provide an opportunity to learn best practices from the most successful HIEs. It will also be interesting to see what strategies HIE vendors will pursue as their customer base consolidates. In the Orion Health booth alone, we will have executives from HIEs talking about these same issues.
Over the past few years, healthcare technology has seen many advances. We’ve achieved mass-market adoption of EHRs, many organizations are making meaningful progress on data aggregation and warehousing information from multiple diverse systems, and wearables and other sensors show much potential to unlock personal information about each patient. The pace of change in healthcare is quickening, with each new technology or initiative sending off a chain of reactions across the entire ecosystem, ultimately improving patient care.
I see three trends driving the industry toward change:
Analytics will help predict population health management
One of the persistent industry challenges is the “datafication” of healthcare. We’re amassing more data now than ever before. And new sources (like wearable devices) and new health factors (like DNA) will contribute even more. This data explosion is putting increased pressure on healthcare organizations to make this data useful by delivering efficiency gains, improving quality of care and reducing overall healthcare costs.
Navigating this digitized healthcare environment will require increasingly sophisticated tools to help handle the influx of data and make the overload of healthcare information useful. In 2016, the industry will begin to take concrete steps to transition to a world where every clinician will see a snapshot of each of their patients to help them synthesize the critical clinical information they need to make a care decision. Moreover, hyper-complex algorithms will allow providers not only to know their patients, but to accurately predict their healthcare trajectories. By giving providers insights into how each patient is trending, clinicians will be able to make better-informed, precise decisions in real-time.
Consolidation leads to new healthcare models, improved outcomes
New models for effective population health management continue to drive change across healthcare systems. These models incentivize stakeholders to optimize costs, identify organizational efficiencies and improve decision-making processes to deliver better care at a lower cost through an emphasis on care coordination and collaboration.
As president and COO of a leading electronic health record (EHR) and practice management (PM) provider, part of my job is to be in constant communication with providers about health IT. They tell me and my team what works for them and what doesn’t work; what brings joy to their practice and what keeps them up at night. All this insight helps polish my crystal ball, making it clear what we can expect to see in 2016:
EHR systems will pivot from regulatory compliance to physician productivity. EHRs are generally blamed for fueling the professional dissatisfaction of physicians. A few software vendors are looking at the problem-oriented medical record (POMR), a more intuitive approach that works similarly to the way a doctor thinks. It organizes clinical records and practice workflows around specific patient problems, making it faster and more satisfying for physicians to use.
The problem list not only delivers a “table of contents” to clinically relevant issues, but also gives a provider a longitudinal view of a patient’s healthcare over time. This intuitive method of information organization makes it easier for provider and patient to set the agenda at the start of the exam. During the exam, the POMR supports the nonlinear nature of a patient encounter.
The POMR also helps reduce cognitive overload, which can lead to medical mistakes such as misdiagnosis and other potentially life-threatening errors. Providers can see “bits” of data like lab results associated with a specific problem, thus easing the number of mental connections required to make sound medical decisions.
Chronic care management (CCM) will grow quickly because it makes sense for both patients and providers. Our healthcare system is changing to address the needs of an aging population with chronic illnesses like hypertension, diabetes, heart disease, and more. To promote the effective care coordination and management of patients with multiple chronic illnesses, the Centers for Medicare and Medicaid Services (CMS) introduced CPT code 99490. This code reimburses providers for remote, inter-visit outreach, such as telephone conversations, medication reconciliation, and coordination among caregivers.
The reimbursement for CCM services is an average of $42 per month for Medicare beneficiaries. New levels of technology integration will enable clinicians to complete CCM reporting of remote care from inside their EHR system.
Guest post by Scott Jordan, co-founder and chief innovation officer, Central Logic
Gone are the days when IT department gurus ran lengthy reports, sifting through numbers and analyzing data until the wee hours of the morning, all in the quest of fancy profit center reports to impress the C-Suite. Especially in hospital settings where lives are on the line, data in 2016 must be delivered in real time, and even more importantly, must be relevant, connected, and able to be understood, interpreted and acted upon immediately by a myriad of users.
Data That’s Right
Today, having the right data intelligence that is actionable is paramount. It’s no longer enough for analytics to only interpret information from the past to make the right predictions and decisions. With the changing healthcare landscape, it’s increasingly important that data intelligence must also be relevant and the tools agile enough to provide an accurate assessment of current events and reliably point to process and behavior changes for improved outcomes … in real time.
A tall order for any IT solution, much less one in healthcare, where robust security parameters, patient satisfaction concerns and HIPAA regulations are just the tip of the iceberg a health system must consider.
Data That’s Connected
The good news is that data technology tools now exist that offer interoperability features – from inside and outside a hospital’s four walls – allowing providers to exchange and process electronic health information easily, quickly, intuitively and accurately, with reliably replicable solutions.
When users can see the full complement of a patient’s health record, they can more accurately improve care coordination and save lives. Specifically, connected patient records can:
avoid duplication of diagnostic procedures,
properly evaluate test results and treatment outcomes, regardless of where care was delivered,
share basic patient data during referrals and get information after specialist visits,
view medications, regardless of where prescribed, avoiding drug interactions, medication abuse, etc., and
view allergy and pre-existing condition information, especially valuable to Emergency Department transfers.
Guest post by Susan Kanvik, healthcare senior director, Point B.
The goals of data governance have long been clear outside of the healthcare industry. Organizations want to enable better decision-making; reduce operational friction; protect the needs of data stakeholders; train management and staff to adopt common approaches to data issues; build standard, repeatable processes; reduce costs and increase effectiveness through coordination of efforts; and ensure transparency of processes.
That’s a tall order. And one that’s coming to healthcare this year for several reasons:
Data regulatory mandates will increase the need for data transparency. For example, the requirement for distribution of clinical studies data for public consumption requires pharma and biotech companies to publish clinical study information publicly. This public exposure shines a light on the data, requiring it to be accurate and with consistent definitions.
As patients move toward becoming data consumers, the need for accurate data with consistent definitions becomes even more important. Higher deductible coverage is driving patients to research and buy healthcare services based on published data regarding a healthcare organization’s cost and quality. Additionally, patients are managing their health correspondence via patient portals which, again, requires absolute precision with data accuracy and definition.
The migration to enterprise data solutions, with its less clear ownership of data, is becoming commonplace. While many healthcare organizations were well on their way to implementing enterprise EHRs, moving away from siloed applications, the health IT incentives included in the Patient Protection and Affordable Care Act (PPACA) have pushed a much higher number of healthcare organizations to enterprise solutions. With an enterprise EHR clinical solution, where data is less siloed, data “ownership” is less clear. In the not so distant past, individual departments would manage their own data. The move toward the adoption of enterprise systems requires enterprise-level governance, which will increase in importance during the coming year and beyond.
Reporting requirements in healthcare have increased in scope and complexity. Having a data dictionary, a common data governance tool, facilitates common definitions, which is critical in supporting reporting requirements.
Sharing of data across healthcare organizations heightens the need for increased IT security. As healthcare organizations have grown through acquisitions and partnerships, many are finding it hard to share data within their organizations due to a lack of interoperability across different EHRs. This makes it even harder to share data beyond the organization, and healthcare organizations will need to carefully manage both how data is shared and the increased security risks that come with that sharing.
Last year, 2015, was a year of buildup, anticipation, and finally some bold moves to propel healthcare technologies forward, specifically regarding interoperability of data. The Office of the National Coordinator, under the auspices of the Department of Health and Human Services, released the long-awaited and much-debated meaningful use Stage 3 requirements in October. All the players in the health tech space were awaiting the final verdict on how Application Programming Interface (API) technology was placed into the regulations, and the wait was worth it, regardless of which side of the fence you were on. Before we get into the predictions, though, a little background knowledge about these technologies, and their benefits, will be helpful.
An API is a programmatic method that allows for the exchange of data with an application. Modern APIs are typically web-based and usually take advantage of XML or JSON formats. If you are reading this article, you almost inevitably have used apps that exchange data using an API. For example, an application for your smartphone that collects data from your Facebook account will use an API to obtain this data. Weather apps on phones also utilize an API to collect data.
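To make this concrete, here is a minimal sketch of what consuming a web API amounts to in practice: the client receives a JSON body and decodes it into a usable structure. The payload below is a made-up weather-style example, not any specific vendor's API.

```python
import json

# A response body like the one a weather API might return (hypothetical payload).
raw = '{"city": "Las Vegas", "temp_f": 68, "conditions": "clear"}'

def parse_response(body: str) -> dict:
    """Decode a JSON API response into a plain Python dict."""
    return json.loads(body)

report = parse_response(raw)
print(report["city"], report["temp_f"])  # the app can now use the data directly
```

In a real app, the `raw` string would arrive over HTTPS from the service's endpoint; the decoding step is the same.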
Next let’s take a look at the history of interoperability of healthcare data. HL7 2.x is a long-standing method to exchange healthcare data in a transactional model. The system is based on TCP/IP principles and typically operates over the Lower Layer Protocol (LLP), which allows for rapid communication of small delimited messages. The standard defines both the communications protocol and the message content format. No doubt about it, HL7 2.x is incredibly effective for transactional processing of data, but it has been limited in two key areas:
A pioneering developer of a successful HL7 interface engine once said: “Once you have developed one HL7 interface … you have developed one HL7 interface.” The standard exists, but there is nowhere near enough conformity to allow this to be plug-and-play. For example, a patient’s ethnicity is supposed to be in a specific location and there is a defined industry standard list of values (code set) to represent ethnicity. In reality, the ethnicity field is not always populated and if it is, it rarely follows the defined code set.
HL7 is an unsolicited push method, which means when a connection is made, messages simply flow from one system to another. If you are attempting to build a collection of cumulative data over time, this is a mostly sufficient method, but what you cannot do is ask a question and receive a response. Although some query/response methods have existed for years, their adoption has been very sparse in the industry.
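The first limitation above can be seen in a few lines of code. Below is a rough sketch (with a fabricated ADT message, and nowhere near production-grade; real interfaces need a proper HL7 library and per-sender tweaks) that splits a 2.x message into pipe-delimited fields and then looks for PID-22, the ethnic group field:

```python
# Hypothetical HL7 2.x ADT message: segments separated by carriage returns,
# fields separated by pipes.
msg = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|20160301||ADT^A01|MSG00001|P|2.3",
    "PID|1||12345^^^MRN||Doe^John||19700101|M",
])

def parse_hl7(message: str) -> dict:
    """Index each segment by its three-letter ID; keep fields as a list."""
    segments = {}
    for seg in message.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

segments = parse_hl7(msg)
pid = segments["PID"]
# PID-22 (ethnic group) is simply absent from this message -- exactly the
# kind of gap that makes "plug-and-play" HL7 integration elusive.
ethnicity = pid[22] if len(pid) > 22 else ""
print(ethnicity or "<not populated>")
```

Every sender truncates, omits, or repurposes fields differently, which is why each new connection becomes its own integration project.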
2016: Year of the Healthcare API
If you are a physician with an electronic health record (EHR) system and you accept Medicare patients, you likely have gone through the process of becoming meaningful use (MU) certified, which means you have purchased an EHR software solution certified by the ONC. This EHR must follow guidelines of technical features, and physicians must ensure they utilize those features in some manner. In October 2015, the ONC released MU Stage 3 criteria (optional requirement in 2017, mandatory in 2018) which includes this game changer: A patient has a right to their electronic health information via an API.
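The Stage 3 rule does not name a particular standard, but HL7's FHIR specification is the leading candidate for patient-facing APIs: resources are fetched RESTfully by type and ID and returned as JSON. The sketch below assumes a hypothetical server URL and a trimmed-down Patient resource purely for illustration.

```python
import json

BASE = "https://ehr.example.com/fhir"  # hypothetical FHIR endpoint

def patient_url(base: str, patient_id: str) -> str:
    """Build the RESTful URL for a single FHIR Patient resource."""
    return "{0}/Patient/{1}".format(base, patient_id)

# A trimmed-down FHIR Patient resource, as such an API might return it.
sample = ('{"resourceType": "Patient", "id": "12345", '
          '"name": [{"family": "Doe", "given": ["John"]}]}')
patient = json.loads(sample)
print(patient_url(BASE, patient["id"]))
```

The significance is the query/response shape: unlike HL7 2.x's unsolicited push, an app acting on the patient's behalf can ask for exactly the record it needs.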
Guest post by Ben Weber, managing director, Greythorn.
This is the time of year when people are looking into their crystal ball, and telling all of us what they see happening in the next 12 months. Some of these predictions will be wild (aliens will cure cancer!) and some will be obvious (more health apps in 2016!). But how many will be helpful?
As I gaze into my own crystal ball, I have to admit I’m also peeking at my email (I like to multi-task). I can’t really say if it’s inspired by the swirling lights of the magic orb on my desk, or if it’s because of the inquiries from clients, messages from my management team and RFPs from various hospital systems … but I also have a prediction for the New Year: 2016 will be the year of migration for Epic and Cerner consultants.
The United States healthcare industry has made great progress in EHR implementation—to the point where implementation is no longer the primary conversation we’re having. Now we’re discussing interoperability, whether we’re using ICD-10 codes correctly, how and whether we should integrate the data collected from wearable fitness technology, and more. Those discussions—and the decisions made as a result—will continue to require human intelligence and power, but in 2016 there will be a decreased demand for consultants on these projects. Healthcare IT professionals who have grown accustomed to this kind of work will either have to settle into full-time employment—or turn their nomadic hearts north to Canada.
Our neighbors on the other side of the 49th parallel are ramping up their EHR implementations, which is good news for consultants interested in using their passports. Implementations in the US are slowing down, and while there is still work available, it is not as constant and may not command the same hourly rates as in years past. Meanwhile, several leading Canadian healthcare IT organizations have already warned of a looming talent shortage in their country (source), the effects of which are beginning to be felt.
Epic and Cerner specialists are particularly in demand, as there is a dearth of experienced talent. Out of the Canadian healthcare IT professionals who have worked with an EMR, 28 percent report familiarity with MEDITECH, 13 percent with Cerner, and 7 percent with McKesson. Only 4 percent have worked with Epic, according to the 2015 Canadian Healthcare HI & IT Market Report.