Interoperability, as it was envisioned, should be built on transparency and connectivity, allowing a patient’s critical health information to be easily accessible, regardless of where treatment is being administered. By creating an infrastructure that supports the sharing of patient data along the care continuum, hospitals, skilled nursing facilities (SNF) and long-term post-acute care (LTPAC) facilities can offer the best care possible. As a result, organizations that participate in interoperability best practices are positioned to become preferred providers.
Unfortunately, interoperability is still a work in progress for many organizations. While more than 95 percent of hospitals and 90 percent of office-based physicians are now utilizing electronic health record (EHR) platforms, many struggle with — or have reservations around — sharing information outside of their facility. As such, data silos remain a major barrier to fully realized interoperability.
The current data gap can drastically impact care. For example, a patient experiences a serious medical incident — such as a fall or stroke — and arrives at the hospital where staff may not have access to existing patient data which could inform the best delivery of care. Or perhaps they’re able to access that data, but not right away. Care is now delayed, which can be additionally concerning depending on the time-sensitivity of the patient’s condition.
Taking this example a step further, let’s explore what happens after care at the hospital has concluded. The patient requires rehabilitation, and a continuity of care document (CCD) is issued to a post-acute care facility. From there, the patient’s information is transferred by less-than-foolproof methods such as fax, for example. A glitch as simple as a jammed paper feed could prevent critical information from reaching the appropriate caregiver.
As value-based care and payment models move toward the forefront, blind handoffs of patient information are no longer viable, as they drastically increase the financial risks hospitals and payer groups are subject to — not to mention the clear detriment such a system poses to the delivery of care.
Closing the gap
The larger question is how does the industry get from Point A to Point B? The easy answer is to liberate the data through a cloud-based infrastructure that supports an efficient, easy-to-access data exchange between all caregivers. An integrated solution would connect stakeholders across the care continuum, providing accurate insights when needed, eliminating data silos between care partners, and enabling more confident decision-making.
These systems would promote:
Optimized transitions: Data needs to travel with the patient — or before movement — discretely across all systems.
Patient visibility: Data should reflect the most current ADT information, identifying and sharing where a patient is and from where they’ve been discharged.
Central view of LTPAC patients: This facility-agnostic feature should offer automated updates of a patient’s functional progress.
Ongoing status and monitoring: Maintaining continued care is facilitated through alerts and notifications to caregivers regarding any change to their status or well-being and meaningful feedback on care pathway progress.
Facility performance: Beyond understanding a patient’s status, it’s also helpful to understand how facilities in and out of their preferred provider network (PPN) have performed.
The concept of interoperability, in some ways, seems contradictory to traditional best practices. Healthcare organizations are charged with protecting patient data at all costs, and the idea of sharing data in a way that opens access to a wider group of stakeholders could give pause. Regulatory infractions for data loss in the healthcare industry can be steep, and the number of well-publicized data breaches in recent years reinforces how valuable health records are to both the organizations who keep them and those who try to steal them.
So, it should go without saying that an EHR “superhighway” must be developed with security in its DNA, taking stringent regulatory requirements into account. The good news is that the newest breed of information exchange platforms is being built with security roles in mind, drastically reducing the possibility of data loss.
Decreasing inpatient admission volumes, shifts in the reimbursement mix from higher-margin commercial payers to lower-margin public payers, and pressures resulting from value-based care have been persistent trends during the past several years. Thus, it was not surprising that a Moody’s Investors Service report released in August portrayed the current condition of finances for not-for-profit hospitals as troubling.
According to Moody’s, the median annual expense growth rate slowed from 7.1 percent in 2016 to 5.7 percent in 2017 because of hospitals’ continued control of labor and supply costs. But annual revenue growth fell faster, from 6.1 percent in 2016 to 4.6 percent in 2017, the second straight year that expense growth exceeded revenue growth, a trend that is expected to continue through 2019. Moody’s concluded that not-for-profit hospitals are on an “unsustainable path.”
Consequently, median operating margins dropped to an all-time low of 1.6 percent in 2017. More than 28 percent of hospitals posted operating losses last year, up from 16.5 percent in 2016. Of course, operating losses cannot be sustained forever; if they persist for multiple years, closure of the hospital frequently results. Earlier this year, Morgan Stanley concluded that 18 percent of U.S. hospitals are either at risk of closure or weak financially: approximately 8 percent (roughly 450 facilities) are presently at risk of closing, and another 10 percent suffer from weak finances. To put that figure in perspective, during the past five years, only 2.5 percent (150 hospitals) have closed.
Various factors account for not-for-profit hospitals’ financial difficulties.
Because the vast majority of net patient revenue came from fee-for-service based payment models—such as DRG payment, fee schedule, percentage of the chargemaster, or list price—overall reduced payment rates adversely impacted revenue in 2017. To be clear, nominal payment rates did not decline—e.g., Medicare’s Inpatient Prospective Payment System and Outpatient Prospective Payment System both incorporated nominal year-to-year increases in 2017—but the revenue mix for hospitals did shift from higher-margin commercial payers to lower-margin public payers. Median Medicare and Medicaid payments as a percentage of gross revenue rose to 45.6 percent and 15.5 percent, respectively, in 2017. Furthermore, continuing a five-year trend, public payers’ share of hospital revenue is projected to increase for the foreseeable future, as more of the baby boomers—an obviously large demographic group—reach retirement age and an increasing number of them incur the sizable costs of the last year of life.
In addition, hospital finances were adversely impacted by the continued shift from inpatient to outpatient care, a trend driven by greater competition from ambulatory facilities, such as physician offices and ambulatory surgery centers. Moody’s reported that median outpatient growth rates exceeded inpatient growth rates for the fifth straight year. In her July 25 address to the Commonwealth Club, Seema Verma, administrator of the Centers for Medicare & Medicaid Services, supported the inpatient-to-outpatient shift, stating that Medicare is seeking to avoid “downstream” expenses, such as emergency department (ED) visits and hospital admissions.
Faced with these financial challenges, not-for-profit hospitals have pursued a number of approaches.
Most commonly, they have tried to improve their management of labor and supply costs. However, this strategy—while certainly logical—may be reaching a point of diminishing returns. Lyndean Brick, president and CEO of the Advis Group, a healthcare consulting firm, has concluded: “This is no longer solely about expense reduction. If not-for-profits just focus on that, they will be out of business in the next few years” (Modern Healthcare, Aug. 29, 2018).
Another strategic response has been consolidation—in which small hospitals join a larger health system—to gain more leverage with payers, to accomplish greater economies of scale, to get access to lower-cost capital, and to enhance access to talent.
By Poornima Venkatesan, senior consultant, Virtusa.
In today’s value-based care environment, patient engagement is a vital key to success in clinical outcomes. This is especially true for chronic diseases such as arthritis, where continuous care is necessary because of the disease’s physical, emotional and economic impact on patients. Although the advent of specialty drugs in the past decade has made disease control possible, clinicians still face challenges in patient care because patients’ preferences about therapy aren’t often considered.
Understanding patient goals and expectations
While a clinician’s goal is to achieve remission, a patient’s goal could be clinical or nonclinical and varies depending on their individual characteristics and demographics.
Patients from low-income countries such as Morocco expect access to primary care (never mind rheumatologists), support services and education about the disease. The high expenses related to rheumatoid arthritis (RA) in such countries result in poor treatment compliance, school absenteeism in children and deterioration in quality of life. Even in the United States, with its extensive health insurance system, one in six adults with RA reduces medication use because of high out-of-pocket costs. Most patients expect cost-effective care. In wealthier countries like the United Kingdom, patients expect increased social connectedness and family support.
Elderly patients expect reduced pain, fatigue and side effects, whereas young adults expect independence and normalcy from their treatments. Women, who are most affected by RA, might expect a lesser impact on family life and childrearing.
If such multidimensional expectations are not met, patients tend to discontinue their treatment. As new biologics and non-biological complex drugs (NBCDs) are developed, patient adherence is essential in determining both therapeutic and potential adverse effects. Studies reveal that frustration with the method of drug administration (like self-injection) also impacts adherence. In the U.S. alone, the total cost of non-adherence is estimated between $100 billion and $289 billion annually.
Therefore, it is important for the patient and the physician to trust each other and have open discussions about treatment strategies and expectations to ensure better alignment and cooperation.
Measuring patient engagement
The first step towards patient engagement is awareness of their current engagement levels. The patient activation measure (PAM) tool is helpful here. PAM measures the attitude and knowledge of patients about the disease and treatments. Studies have proven that highly activated patients have better outcomes via increased medication adherence, resulting in lower healthcare costs through fewer ED visits, hospital admissions and re-admissions. By continuously monitoring activation levels, providers can measure sustained changes in patient behavior and personalize their care programs.
We can also measure engagement levels by taking advantage of data. Data derived from direct [electronic health records (EHR), claims] and indirect sources (wearables) provide a holistic view of an individual patient. Simple analytics applied to population data can predict patient behavior. For example, analytics can help providers know which patients are likely to miss their appointments, which patients will fill their prescriptions on time, and so on. Detailed patient-based data could also lead to better and more accurate diagnoses and treatments.
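As a concrete illustration of the kind of simple analytics described above, the sketch below scores patients for no-show risk using a toy logistic model. The feature names and weights are illustrative assumptions, not drawn from the article or any validated model:

```python
import math

def no_show_risk(prior_no_shows, lead_time_days, age):
    """Return a 0-1 probability-style score that a patient misses an appointment.

    Illustrative logistic model: more prior no-shows and longer scheduling
    lead times raise risk; in this toy example, older patients show up more.
    """
    z = -1.5 + 0.8 * prior_no_shows + 0.03 * lead_time_days - 0.01 * age
    return 1 / (1 + math.exp(-z))

# Rank a small synthetic panel so outreach staff can call the riskiest first.
panel = [
    {"id": "P1", "prior_no_shows": 0, "lead_time_days": 3,  "age": 70},
    {"id": "P2", "prior_no_shows": 3, "lead_time_days": 30, "age": 45},
]
ranked = sorted(panel, key=lambda p: no_show_risk(
    p["prior_no_shows"], p["lead_time_days"], p["age"]), reverse=True)
print([p["id"] for p in ranked])  # higher-risk patient first
```

In practice the weights would be fit on historical claims and scheduling data rather than hand-set, but the ranking workflow is the same.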
By Richard A. Royer, chief executive officer, Primaris.
Back in the day – the late 1960s, when social norms and the face of America were rapidly changing – a familiar public service announcement began preceding the nightly newscast: “It’s 10 p.m. Do you know where your children are?”
Today, as the healthcare landscape changes rapidly with a seismic shift from the fee-for-service payment model to value-based care models, there’s a similar but new clarion call for quality healthcare: “It’s 2018. Do you know where your data is?”
Compliance with the increasingly complex alphabet soup of quality reporting and reimbursement rules – indeed, the fuel for the engine driving value-based care – is strongly dependent on data. The promising benefits of the age of digital health, from electronic health records (EHRs) to wearable technology and other bells and whistles, will occur only as the result of accurate, reliable, actionable data. Providers and healthcare systems that master the data and then use it to improve quality of care for better population health and at less cost will benefit from financial incentives. Those who do not connect their data to quality improvement will suffer the consequences.
As for the alphabet soup? For starters, we’re as familiar now with these acronyms as we are with our own birth dates: MACRA (the Medicare Access and CHIP Reauthorization Act of 2015), which created the QPP (Quality Payment Program), which birthed MIPS (Merit-based Incentive Payment System).
The colorful acronyms are deeply rooted in data. As a result, understanding the data life cycle of quality reporting for MACRA and MIPS, along with myriad registries, core measures, and others, is crucial for both compliance and optimal reimbursement. There is a lot at stake. The Hospital Readmissions Reduction Program (HRRP), for example, has changed how hospitals manage their patients: for the 2017 fiscal year, around half of the hospitals in the United States were dinged with readmission penalties, losing an estimated $528 million.
The key to achieving new financial incentives (with red-ink consequences increasingly in play) is data that is reliable, accurate and actionable. Now, more than ever, it is crucial to understand the data life cycle and how it affects healthcare organizations. The list below varies slightly in order and emphasis compared with other data life cycle charts.
Find the data
Capture the data
Normalize the data
Aggregate the data
Report the data
Understand the data
Act upon the data
One additional stage, which is a combination of several, is secure, manage and maintain the data.
Find the data. Where is it located? Paper charts? Electronic health records (EHRs)? Claims systems? Revenue cycle systems? And how many different EHRs are used by providers — from radiology to labs to primary care or specialists’ offices to others providing care? This step is even more crucial now as providers locate the sources of data required for quality and other reporting.
Capture the data. Some data will be available electronically, some can be acquired electronically, but some will require manual abstraction. If a provider, health system or accountable care organization (ACO) outsources that important work, it is imperative that the abstraction partner understand how to get into each EHR or paper-recording system.
And there is structured and unstructured data. A structured item in the EHR like a check box or treatment/diagnosis code can be captured electronically, but a qualitative clinician note must be abstracted manually. A patient presenting with frequent headaches will have details noted on a chart that might be digitally extracted, but the clinician’s note, “Patient was tense because of job situation,” requires manual retrieval.
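The structured/unstructured split above can be sketched in code. The record layout below is a hypothetical example, not a real EHR schema; it only illustrates routing discrete fields to electronic capture while flagging free-text notes for manual abstraction:

```python
# A chart record with both discrete fields and a free-text clinician note.
record = {
    "icd10": "G43.909",            # structured: diagnosis code, capturable electronically
    "bp_systolic": 128,            # structured: discrete numeric field
    "note": "Patient was tense because of job situation.",  # unstructured free text
}

# Discrete fields flow straight into the electronic pipeline...
structured = {k: v for k, v in record.items() if k != "note"}

# ...while free-text notes are queued for a human abstractor.
needs_abstraction = [record["note"]] if record.get("note") else []

print(structured)
print(needs_abstraction)
```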
Normalize the data. Normalization ensures the data can be more than a number or a note but meaningful data that can form the basis for action. One simple example of normalizing data is reconciling formats of the data — for example, reconciling a form that lists patients’ last names first with a chart that lists the patients’ first names first. Are we abstracting data for “Doe, John O.” or “John O. Doe?” Different EHR and other systems will have different ways of recording that information.
Normalization ensures that information is used in the same way. The accuracy and reliability that results from normalization is of paramount importance. Normalization makes the information unambiguous.
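The name-format example above can be sketched as a small normalization routine. This is a minimal illustration of the format flip only; real patient matching also relies on date of birth, medical record number and fuzzy matching, none of which is shown here:

```python
def normalize_name(raw):
    """Reconcile 'Doe, John O.' and 'John O. Doe' into one canonical form."""
    raw = raw.strip()
    if "," in raw:
        # "Last, First M." style
        last, first = [part.strip() for part in raw.split(",", 1)]
    else:
        # "First M. Last" style: treat the final token as the surname
        parts = raw.split()
        first, last = " ".join(parts[:-1]), parts[-1]
    return f"{last.rstrip('.').upper()}, {first}"  # canonical "LAST, First M."

print(normalize_name("Doe, John O."))
print(normalize_name("John O. Doe"))
```

Both inputs normalize to the same canonical string, so downstream abstraction treats them as one patient record format.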
Aggregate the data. This step is crucial for value-based care because it consolidates the data from individual patients to groups or pools of patients. For example, if there is a pool of 100,000 lives, we can list ages, diagnosis, tests, clinical protocols and outcomes for each patient. Aggregating the data is necessary before healthcare providers can analyze the overall impact and performance of the whole pool.
If a healthcare organization has quality and cost responsibilities for a pool of patients, it must be able to identify the patients who most affect the pool’s risk. Aggregation and analysis provide that opportunity.
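The roll-up from individual patients to pool-level measures might look like the following sketch. The field names and figures are illustrative assumptions, not real clinical data:

```python
from collections import defaultdict

# Individual patient records (synthetic).
patients = [
    {"id": "A", "diagnosis": "CHF",      "readmitted": True,  "cost": 18000},
    {"id": "B", "diagnosis": "CHF",      "readmitted": False, "cost": 9000},
    {"id": "C", "diagnosis": "Diabetes", "readmitted": False, "cost": 4000},
]

# Group patients into pools by diagnosis.
pools = defaultdict(list)
for p in patients:
    pools[p["diagnosis"]].append(p)

# Aggregate each pool into the measures a value-based contract cares about.
summary = {
    dx: {
        "n": len(group),
        "readmit_rate": sum(p["readmitted"] for p in group) / len(group),
        "avg_cost": sum(p["cost"] for p in group) / len(group),
    }
    for dx, group in pools.items()
}
print(summary["CHF"])
```

Only after this aggregation step can the organization compare pool-level readmission rates and costs against its contracted targets.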
By Ben Flock, chief healthcare strategist, TEKsystems.
As technology advances, so does the healthcare industry, with technological breakthroughs increasing the ability of healthcare professionals to serve their patients, record and transfer patient data and more efficiently complete other tasks necessary to keep the industry moving. IT services provider TEKsystems recently released the results of a survey that polled almost 200 healthcare IT leaders (e.g., IT directors, chief information officers, IT vice presidents and IT hiring managers) in late 2017/early 2018 on a range of key issues, including technology maturity, workforce planning, critical roles and the top trends shaping healthcare IT today.
The results revealed a shifting focus from IT leaders: healthcare is behind the curve on initiatives that have the potential to shape the industry going forward, including artificial intelligence (AI).
Business demand is driving both the interests of IT leaders and the prioritization of AI in healthcare. Value-based care, regulatory mandates and the consumer push for precision/personalized care are driving the business prioritization of AI. These results indicate that while IT leaders know AI in healthcare is the future, they are currently taking a cautious approach to utilizing the technology. This is very likely rooted in security concerns, as there are federal, state and even local mandates dictating the protection and privacy of patient data.
Although cautious, healthcare organizations are actually proceeding on the AI front. As evidence, survey data shows a high percentage of healthcare organizations are in the implementation, evaluation or refining stage with respect to specific technology applications that leverage AI – digital health systems (75 percent) and telemedicine (51 percent). This pragmatic approach to AI will continue, and healthcare organizations will address this emerging industry imperative by providing IT resources, enabling platform technologies, and building repeatable, secure applications and solutions that leverage artificial intelligence.
To ensure IT employees are aware of the need to be cautious when implementing AI initiatives, organizations must ensure adequate onboarding and ongoing risk and compliance (R&C) training is provided. As an annual “check the box” activity, R&C training isn’t enough to help employees and third parties manage risk appropriately. The best strategy is to implement a risk-based approach, focusing on higher-risk functional areas with direct access to consumers and/or protected health information (PHI) and creating targeted training. Simple education and awareness tactics can dramatically improve compliance when employees and third parties understand how to apply teachings to their area.
By Matthew Fusan, director of customer experience, SA Ignite.
Although the Quality Payment Program (QPP) has been in effect for a year, the program continues to change as CMS evolves it. The new year is an ideal time to reflect on the changes we have experienced to date, as well as to look forward and examine what could happen in 2018 and beyond.
2017: A Year of Regulatory Confusion
As the QPP rolled out, confusion still reigned supreme at both the CMS and HHS levels:
In 2017, CMS ramped up promotion and education for the QPP. Although these efforts have been more aggressive than for previous programs, industry studies like the 5th Annual Health IT Industry Outlook Survey and the KPMG-AMA Survey show that clinicians are struggling to understand the program and what they need to do to be successful. In fact, many expect their employer to provide the information and solutions to manage the program and are not seeking to proactively educate themselves on requirements and improvement strategies.
While clinicians continue to experience confusion, the Department of Health and Human Services (HHS) has not done much to help clarify. CMS Administrator Seema Verma has continued to support the move away from fee-for-service and toward fee-for-value, but has also cancelled two mandatory bundled payment models – the Episode Payment Models and the Cardiac Rehabilitation Incentive Payment Model – and has also removed the mandatory requirement for the Comprehensive Care for Joint Replacement Model.
Although some bundled payment programs have been cut/reduced, the Centers for Medicare and Medicaid Innovation (CMMI) has put out a request for information (RFI) to gather input on patient-centered care and test market-driven reforms. The intent of the initiative is to empower beneficiaries as consumers, provide price transparency, and increase choices and competition. The RFI demonstrates that all models/programs will be watched closely and are subject to change.
2018: More Focus, More Models
While some programs are being cut/reduced, there is still pressure on CMS to accelerate new Advanced Alternative Payment Models (APMs) so they are exploring options during 2018.
The first option is to allow clinicians to use Medicare Advantage plans to meet the criteria for an Advanced APM. Even though this may require a change to the MACRA legislation, CMS has a demonstration project in the 2018 final rule to explore this option.
Another option is the second iteration of the Bundled Payments for Care Improvement (BPCI) program, BPCI Advanced. The risk levels of other Advanced APM options may appear too high for physician practices, so this option may have wider appeal to physician groups.
Other models under consideration include Direct Primary Care, which is based on a non-insurance model, as well as collaborations with private payers.
While these models are all under consideration or in development, it will be interesting to see whether the CMMI RFI will drive additional choice, or whether the proposed changes will consume CMMI for 2018 and reduce its capacity to introduce new models. Either way, CMMI will look very different in 2018 and beyond.
2019: Change is Mandated
In 2019, critical components of MIPS are mandated, including:
The weighting of the MIPS categories is rebalanced so that the Cost category becomes 30 percent of the total MIPS score. While organizations have struggled for years to optimize cost, with limited results, this increase in the Cost category means that 2019 will drive organizations to look closely at cost and create a strategy for measurable improvement.
The performance threshold that determines who receives an incentive versus a penalty will increase to the mean or median of participants’ scores. In the 2018 final rule, CMS estimated that almost 75 percent of clinicians will earn a score greater than 70 points for 2018, so competition going into 2019 will be fierce, with healthcare organizations pitted against each other to earn high scores and financial incentives.
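The category rebalancing described above amounts to a weighted sum. The sketch below uses the 30 percent Cost weight the article describes for 2019, with the remaining weights filled in as illustrative assumptions; actual CMS weights vary by performance year and clinician circumstances:

```python
# Illustrative MIPS category weights (Cost at 30 percent per the article's
# 2019 description; other weights are assumptions for this sketch).
WEIGHTS = {
    "quality": 0.30,
    "cost": 0.30,
    "promoting_interoperability": 0.25,
    "improvement_activities": 0.15,
}

def mips_composite(category_scores):
    """Combine 0-100 category scores into a 0-100 composite score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

scores = {"quality": 85, "cost": 60, "promoting_interoperability": 90,
          "improvement_activities": 100}
print(mips_composite(scores))  # 0.3*85 + 0.3*60 + 0.25*90 + 0.15*100
```

With Cost carrying 30 percent, a weak cost score drags the composite down sharply, which is exactly why organizations need a cost strategy going into 2019.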
Joanna Gorovoy, senior director product and solutions marketing, Axway.
To accelerate the shift toward value-based care, organizations across the healthcare ecosystem must find new ways to unlock value from an ever-expanding array of data sources to create data-rich digital services and experiences that improve patient engagement, enable delivery of more personalized healthcare services, and increase clinical collaboration and care coordination across the patient journey. Developers play a key role in accelerating innovation that will shape the future of healthcare and positively impact patient outcomes. But innovating at the speed of digital is challenging in an industry that has long been plagued by interoperability challenges, a prevalence of legacy, siloed systems and applications, and heightened data privacy and security requirements which hinder digital projects. As a result, there are a few key things developers should keep in mind when designing for today’s healthcare market.
You can’t spell interoperability without A-P-I
The frustrations associated with sharing information have burdened the healthcare industry’s digitization efforts for many years. With application programming interfaces (APIs) taking hold, however, data exchange is now easier to accomplish. APIs are revolutionizing data sharing by making it possible to bridge legacy IT systems of record, such as electronic health records (EHRs), with modern systems of digital engagement, such as mobile apps. Healthcare developers must take an API-first approach and will need to gain knowledge of the latest healthcare interoperability standards – such as FHIR. FHIR (Fast Healthcare Interoperability Resources) is an HL7 standard that simplifies the exchange of healthcare information and promotes the use of APIs to support lightweight integration, facilitating secure data access and interoperability. As healthcare developers increasingly leverage APIs to move beyond the challenges of secure data sharing and to open up proprietary EHR systems, the result will be faster time to market for innovative digital services and experiences.
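To make the FHIR approach concrete, the offline sketch below shows the standard’s RESTful addressing pattern (a resource lives at `[base]/[type]/[id]`) and pulls a display name out of an abbreviated Patient resource. The base URL is a placeholder, and the sample resource is trimmed to a few fields; the `name` and `birthDate` structure follows the published FHIR Patient schema:

```python
# FHIR addresses resources RESTfully: [base]/[resource-type]/[id].
base = "https://ehr.example.com/fhir"   # placeholder endpoint, not a real server
patient_url = f"{base}/Patient/123"     # a real app would GET this URL

# Abbreviated FHIR Patient resource (JSON as a Python dict).
sample_patient = {
    "resourceType": "Patient",
    "id": "123",
    "name": [{"family": "Doe", "given": ["John", "O."]}],
    "birthDate": "1960-05-14",
}

def display_name(patient):
    """Build a human-readable name from a FHIR Patient's first name entry."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

print(patient_url)
print(display_name(sample_patient))
```

Because every FHIR server exposes the same resource shapes over the same REST pattern, a mobile app written against one compliant endpoint can, in principle, read patient data from another with minimal rework.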
Create a sound security strategy
Security must always be top of mind for healthcare developers. Before writing a single line of code, healthcare developers should familiarize themselves with HIPAA regulations, which protect all personal health data transactions and impose hefty penalties for violations. As developers design apps that leverage patient health data from a variety of sources, they need to take the time to understand how this law works and must be mindful of how to mitigate security concerns. Adopting a full-lifecycle API management solution lets developers secure and manage FHIR and other healthcare APIs in a unified way across projects and communities, ensure data security, streamline compliance, and reduce the data security burden through built-in, configurable audit trails and reporting.
Inviting external innovation
Healthcare organizations are increasingly looking to invite open innovation into their organizations as they struggle to keep pace with digital transformation. Organizations such as Kaiser Permanente, Johnson and Johnson and Stanford, for example, have hosted developer challenges and hackathons to stimulate innovation and bring in fresh perspectives from developers outside of their organization/industry to help tackle big problems such as healthcare access and affordability. As the industry struggles with IT modernization challenges, developers who have experience working across multiple industries can provide a fresh point of view and can contribute skills and approaches they have gained developing applications for other industries/use cases to provide value to healthcare.
Guest post by Joanna Gorovoy, senior director product and solutions marketing, Axway.
Healthcare organizations need to unlock the value of their data
In 2018, the healthcare industry will accelerate its shift toward value-based healthcare as the industry struggles to address challenges associated with rising cost burdens, an explosion of data and increased mobility. Along with evolving government policy, organizations across the healthcare ecosystem will face a rise in healthcare consumerism as patients bear more risk, face higher out of pocket costs, and demand more value.
Unlocking the value of a wealth of patient data will be key to improving patient engagement, delivering more personalized healthcare products and services, and improving collaboration and care coordination across the patient journey – all critical to enabling value-based care delivery and improving outcomes.
In 2018 AI goes from science fiction to reality in healthcare
Population health and precision medicine are among the initiatives where AI is expected to have the greatest impact. According to a recent HIMSS study, about 35 percent of healthcare organizations plan to leverage artificial intelligence within two years — and more than half intend to do so within five. Focusing AI investments on population health, clinical decision support, patient diagnosis and precision medicine supports the industry shift toward value-based, personalized care models and reinforces the use of AI to augment the intelligence and skills of physicians and drive efficiency in diagnosis and treatment.
Some current use cases include: enhancing the speed and accuracy of diagnosis in medical imaging; supporting surgeon workflow and decision-making during procedures (e.g., spine implants); virtual assistants that enhance interactions between patients and caregivers to improve the customer experience and reduce physician burnout; and digital verification of insurance and claims information.
Guest post by Abhinav Shashank, CEO and co-founder, Innovaccer.
A personal health record of any patient, whether it is an aging parent, a spouse or a child with a chronic illness, contains a summary of medications, lab results, visit notes, billing information and more, and interoperability makes it easy to manage all these files and documents with just a few clicks.
Every form of health data makes an entry in an EHR today, thanks to the shift toward digitized healthcare in the U.S. Although this has made data entry, storage, retrieval and exchange easier, it has brought with it certain challenges. Integrating and utilizing EHRs is the first baby step; achieving 100 percent EHR interoperability is a summit we have yet to reach.
Physicians want to realize the full potential and promise of EHRs for the simple reason that improved communication between systems will lead to better care. Once all the systems in use nationwide are connected and interacting with each other, patients will find it easier to seek a second opinion, as their health information will reach the physician in a matter of seconds.
How interoperability exists today
Today, various interoperability standards have been developed for the sake of continuous improvement in this realm. Health Level Seven (HL7) has produced the likes of HL7v2, HL7v3 and, most recently, FHIR as competent standards for better streamlining of documentation and care coordination. With the help of FHIR, physicians can access health data on their mobile phones through the various application programming interface (API) functions that FHIR supports. This ease of access to complete and accurate patient data helps in many ways. As providers and health coaches work together on improving people’s health, it is also significant for them to be able to access accurate data from sources other than EHRs. Apart from EHRs, health information exchanges (HIEs) have popped up in various places that allow for the smooth flow of data across the healthcare network.
Ways in which interoperability facilitates healthcare
Physicians can easily access and share medical information with their patients and perform their tasks with greater efficiency, for example by monitoring chronic diseases more effectively. Besides saving time and labor costs, physicians and patients with access to interoperable health information can benefit from higher-quality outcomes. Interoperable EHRs give physicians easy, ongoing access to a patient’s health records. For a doctor, having an updated and detailed medical history of a patient can be life-saving, and it especially helps people who are always on the move, empowering an individual to move across the continuum of care seamlessly with their clinical record.
Doing more with less
As value-based care and reimbursement stepped into healthcare, the US managed to turn the tide toward a more qualitative and equitable delivery of care. This has made physicians more responsible for patient health outcomes than ever before. To manage hospital readmissions and managed care plans, physicians need as much patient information as possible at hand at all times. This is where interoperability comes into play, aggregating data from disparate sources and bringing it onto a single platform.
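The aggregation step described above can be sketched very simply: records for the same patient held in separate systems are merged into one unified view keyed by patient ID. The record contents and field names below are illustrative assumptions, not any particular vendor's schema.

```python
# Toy records for the same patient held in two separate systems
# (illustrative field names, not a real EHR schema).
hospital_ehr = {"patient_id": "P001", "allergies": ["penicillin"],
                "last_admission": "2024-01-12"}
clinic_ehr = {"patient_id": "P001", "medications": ["metformin"],
              "last_visit": "2024-03-02"}

def aggregate(records):
    """Merge per-source records for one patient into a single view."""
    merged = {}
    for rec in records:
        merged.update(rec)  # later sources overwrite duplicate keys
    return merged

unified = aggregate([hospital_ehr, clinic_ehr])
print(unified["allergies"], unified["medications"])
```

A real platform would also have to match patients across systems and resolve conflicting values, but the principle is the same: one consolidated record instead of several partial ones.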
For a secure data exchange to take place between healthcare organizations and patients, both parties must be willing and equally involved in the sharing process. This leads to shared decision making, and it lets the physician make quick, informed decisions. The ultimate aim is a complete understanding of a patient's health status, helping them navigate their health journey effectively for a better patient experience.
Patient-centric interoperability is the direction in which healthcare is slowly moving, and there is much we can do with the data it makes available. Ongoing monitoring of patient data can better support the ongoing management of that patient's health, and the physician can intervene where necessary. Patients, too, can track their progress and work toward improving their health hand-in-hand with their physicians.
Challenges that interoperability is yet to solve
One of the issues interoperability contends with today is the vast, disjointed patient data scattered across regional HIEs and independent, transactional databases such as EHRs. Patient privacy concerns and consent are additional risk factors to consider when working with protected health information. A lack of common standards, differing state policy rules, workflow and policy differences, and the need for incentives remain barriers on the way to achieving 100 percent interoperability.
Guest post by Abhinav Shashank, CEO and co-founder, Innovaccer.
The world of healthcare analytics is vast and encompasses a wide range of data with incredible potential to tell stories about health and healthcare delivery, from individual patients to entire populations. Having numbers and an easy-to-use visualization at hand gives providers and caregivers the power not only to look into the lives of individual patients but also to track ongoing activities across their organizations. Simply showing visualizations is not enough, however; to fully realize their value, healthcare organizations have to take a few steps beyond basic graphs.
The Case for Data Visualization
In the words of Edward O. Wilson, the father of sociobiology:
“You teach me, I forget.
You show me, I remember.
You involve me, I understand.”
There are many disparate data sources healthcare providers have to deal with: EHRs, departmental data, claims data, resource utilization, administrative data, etc. Consolidating the data and spreading it out in a visually adaptive manner offers a more agile approach to managing complex population health data.
Data visualization was developed to make it easier to gain actionable insights from volumes of information and to improve health programs, clinical care delivery, and post-episode care management. Visualization provides real value in learning from disparate data sources, finding outliers, bringing hidden trends to the fore, and delivering better health outcomes.
Streamlining Different Data Sources into a Single Source of Truth
Since the data pertaining to a patient’s health comes in from various sources, it is vital to pool all the data sets and obtain an aggregated, standard format of data every authorized person can view and manipulate.
Data in the healthcare industry can broadly be categorized into two sources:
Claims data: comes from payers and contains extremely uniform, regularly updated data about the care patients receive and how they are billed for it. This data is usually structured and holds all the information required for provider reimbursement.
Clinical data: comes from the providers' end and contains valuable information about diagnoses, treatments, and medical history. While this data isn't always structured, it incorporates data elements critical to analyzing a patient's health in every time frame.
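Pooling these two categories into a single source of truth can be as simple as merging both feeds into one chronological timeline per patient. The rows below are illustrative toy data; the field names (`code` for a CPT-style billing code, `note` for a clinical note) are assumptions, not a standard schema.

```python
from datetime import date

# Illustrative feeds: structured claims rows from the payer side,
# and clinical events from the provider side (toy data only).
claims = [
    {"patient_id": "P001", "date": date(2024, 1, 12),
     "code": "99213", "billed": 120.00},
]
clinical = [
    {"patient_id": "P001", "date": date(2024, 1, 12),
     "note": "Type 2 diabetes follow-up"},
]

# Merge both feeds into one chronological timeline per patient,
# so a viewer sees billing and clinical context side by side.
timeline = sorted(claims + clinical, key=lambda e: e["date"])
for event in timeline:
    print(event["date"], event.get("code", ""), event.get("note", ""))
```

The structured claims rows sort and aggregate cleanly, while the clinical notes carry the context a claims code alone cannot, which is exactly why the two sources are more valuable combined than apart.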
Fine-tuning Real-Time Visualization
The amount of data healthcare institutions aggregate is enormous: by 2012 it was estimated at a whopping 150 exabytes (150 × 10^18 bytes), growing at roughly 48 percent per year. As the volume grows, healthcare organizations need state-of-the-art, real-time analytical capabilities to improve care quality and effectiveness. Real-time analytics can turn the tables in more ways than one:
Monitoring end-to-end care delivery across a wide range of facilities.
Observing the progress of clinical decision support systems.
Identifying overhead cost drivers and detecting care or documentation gaps.
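The kind of lightweight, per-metric statistic a real-time monitoring view might maintain can be sketched in a few lines. The emergency-department wait-time stream below is simulated data, and the window size is an arbitrary assumption.

```python
from collections import deque

class RollingMean:
    """Moving average over the last `window` readings -- the kind of
    lightweight statistic a real-time dashboard keeps per metric."""
    def __init__(self, window):
        self.readings = deque(maxlen=window)  # old readings drop off

    def add(self, value):
        self.readings.append(value)
        return sum(self.readings) / len(self.readings)

# Simulated ER wait times (minutes) arriving as a stream.
wait_times = RollingMean(window=3)
for minutes in [30, 45, 60, 90]:
    current = wait_times.add(minutes)

print(round(current, 1))  # mean of the last 3 readings: 65.0
```

Because the deque discards old readings automatically, memory stays constant no matter how long the stream runs, which is what makes this pattern suitable for continuous monitoring.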
Since data visualization offers a great advantage in understanding the goings-on in an organization in real time, here are some key elements that count as best practices for data visualization:
Customized reports: Each set of users in healthcare requires different metrics, presented in different ways. Offering customized reports with specific visualizations provides actionable insights and can answer specific questions about the organization's risks, rewards, and successes.
Visually adaptive: Data presented on dashboards should come complete with functional and visual features that improve cognition and allow quick interpretation. Data listed in a color-coded manner gives physicians functional cues and real-time alerts.
Create actionable insights: A dashboard or any other visualization tool provides clinicians with data, but unless someone looks at it, critical findings can go unnoticed. Users should be shown how to review the dashboard, drill down to each underlying level, and initiate corrective actions.
The end user's ultimate need: It's paramount that end users can communicate their needs and demands, and even more important that those demands and performance indicators are incorporated well in advance of structuring the report.
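The color-coding practice above usually comes down to mapping each metric onto a small set of status colors via thresholds. The readmission-rate cutoffs below are hypothetical placeholders; a real dashboard would set them per metric and per organization.

```python
# Hypothetical thresholds for illustration only; real dashboards
# would tune these per metric and per organization.
def status_color(readmission_rate):
    """Map a 30-day readmission rate onto a dashboard color cue."""
    if readmission_rate < 0.10:
        return "green"   # within target
    elif readmission_rate < 0.15:
        return "yellow"  # trending toward trouble
    return "red"         # needs immediate attention

print(status_color(0.08), status_color(0.12), status_color(0.21))
```

Keeping the mapping this simple is deliberate: a clinician glancing at the board should be able to read the state of a unit without interpreting raw numbers.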
Wrap-up with Healthcare IT
By leveraging healthcare IT, organizations can get their hands on simple but effective visualizations and surface important information that might be difficult to notice in tabular form. Here are some ways healthcare IT can drive real-time data visualization to success:
Immediate access and sharing: Putting bidirectional interoperability to use, providers can access and share relevant data across the network, despite technological barriers.
Clear data visualization: Graphic, color-coded cues help physicians swiftly spot the areas that need performance improvement or track their organization's growth.
Drilling down: To learn the reason behind a shortfall, physicians can drill down and narrow their area of focus to pinpoint the anomaly and take quick remedial action.
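Drill-down itself is just aggregation at a coarse level followed by filtering to the slice that stands out. The readmission records below are toy data with assumed field names, used only to show the two-step pattern.

```python
from collections import defaultdict

# Toy readmission records tagged by department (illustrative data).
readmissions = [
    {"dept": "cardiology", "days_to_readmit": 12},
    {"dept": "cardiology", "days_to_readmit": 20},
    {"dept": "orthopedics", "days_to_readmit": 25},
]

# Top-level view: readmission counts per department...
by_dept = defaultdict(int)
for r in readmissions:
    by_dept[r["dept"]] += 1

# ...then drill down into the department with the most readmissions
# to inspect the individual cases behind the number.
worst = max(by_dept, key=by_dept.get)
detail = [r for r in readmissions if r["dept"] == worst]
print(worst, len(detail))  # cardiology 2
```

Each drill-down level repeats the same move on a narrower slice, which is why dashboards can offer it interactively without precomputing every possible view.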
Driving Value with Visualization
With healthcare IT now an integral part of the value-based care system, there is little doubt that convenient, real-time data visualization will be heavily used to achieve positive health outcomes. Combining real-time data with advanced analytics will reshape how healthcare IT improves clinical and operational outcomes. Only once physicians move away from long, incomprehensible data dumps to an alternative that helps them instinctively read, isolate, and act on insights can we come one step closer to data-driven, value-based care.