By Heather Annolino, senior director of healthcare practice, Ventiv.
In recent weeks the U.S. has experienced a significant increase in new COVID-19 cases. For healthcare facilities in affected regions, this is a constant reminder that things are not "business as usual," and it has forced administrators to continually monitor the changing COVID-19 response landscape to reduce risks that could affect the quality of patient care.
With this in mind, patient safety is more important than ever. The ability of healthcare organizations to implement predictive analytics and data-discovery tools that identify hidden patterns and trends is essential. These tools allow them to focus on interventions and process changes, detect vulnerabilities, and increase preparedness before, during, and after an incident to further decrease patient harm.
Moving forward, healthcare organizations must embrace a heightened level of risk management to provide an environment free from harm. These new risks, along with gaps in longstanding processes, require better risk management and patient safety systems that can capture, track, and analyze data in real time to enhance processes and mitigate future risks.
Working as a centralized reporting tool, these systems can also remove bias from decision-making, supporting better decisions that continually drive operational efficiency.
Here are three system requirements for an effective, integrated patient safety tool that healthcare leaders need to elevate care, enhance quality, and reduce risk throughout the different phases of the pandemic.
According to a new survey fielded by Definitive Healthcare and sponsored by Dimensional Insight, 90% of hospitals and health systems use the analytics component of their electronic health records (EHRs), with 49% using it exclusively or primarily for analytics. With such widespread use, the technology must be meeting the needs of hospitals and health systems, right?
The survey data shows that although many hospitals use EHR analytics, they are also challenged by the technology and give it middling satisfaction ratings. Let's look at the survey results in more detail and examine where hospitals and health systems go from here.
Hospitals not highly satisfied with EHR analytics
The survey interviewed 108 healthcare leaders on their experience with EHR analytics. It also asked about their experience with analytics-specific platforms and in-house solutions to serve as a comparison point.
Overall, leaders ranked their satisfaction with EHR analytics as a 5.58 (on a scale of 0-10 with 0 being “extremely dissatisfied” and 10 being “extremely satisfied”). In-house solutions received a satisfaction score of 6.51 (17% higher) and analytics-specific platforms received a score of 6.69 (20% higher).
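The relative differences quoted above follow directly from the satisfaction scores; a quick sketch using the survey figures (the rounding convention is an assumption):

```python
# Satisfaction scores from the survey (0-10 scale)
ehr = 5.58        # EHR analytics component
in_house = 6.51   # in-house solutions
platform = 6.69   # analytics-specific platforms

def pct_higher(score, baseline):
    """Relative difference, rounded to the nearest whole percent."""
    return round((score / baseline - 1) * 100)

print(pct_higher(in_house, ehr))   # 17 (percent higher than EHR analytics)
print(pct_higher(platform, ehr))   # 20
```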
Leaders feel challenged by the technology aspects of EHR analytics. Organizations using EHR analytics as their primary analytics tool report the following challenges:
Reporting and querying are difficult/slow (43.4%)
The component is not robust or advanced enough (35.8%)
Visualization is lacking (28.3%)
The user interface is difficult to understand/use (26.4%)
Those that are not using EHR analytics cite similar technology challenges as the reason they are not using the component.
Ever-increasing computational power, advances in artificial intelligence, and the lower cost of computation (thanks to cloud computing services such as Azure and Amazon Web Services) have enabled healthcare systems – often laggards in quality improvement and technology adoption – to rapidly implement analytics systems. Such systems enable enterprises to analyze and model their processes, engage in meaningful quality and process improvement activities, and prepare to succeed in value- and risk-based payment models.
Hewlett Packard Enterprise recently published a piece delineating some of the benefits that enterprises can gain from analytics (specifically the predictive form):
Gauging operating room (OR) demand
Managing supply chains more effectively
Intervening in care pathways before adverse events occur
Enterprises with existing legacy analytics systems – for example, those that mainly work with claims-based data or lack predictive or real-time capabilities – can likewise obtain the above efficiencies. A modern data warehouse must be flexible, SQL-enabled, cloud-based, and highly secure. Snowflake Computing's cloud-based infrastructure is one example; it scales easily and is offered to clients with usage-based pricing. A data warehouse alone, however, is not sufficient. Tools must also be provided to prep, transform, and analyze the data. Alteryx Designer, one such tool, allows analysts to prep and blend data from heterogeneous sources – e.g., CSVs, databases, Excel files – in an efficient and reproducible manner, and, more importantly, it includes spatial and predictive analytics. This enables organizations to move from retrospective, barely actionable data to immediately actionable real-time predictive analytics.
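The prep-and-blend step described above can be sketched in a few lines: pull records from two heterogeneous sources – here a CSV extract and a SQL table – and join them into one analysis-ready record per patient. The source names, fields, and values below are invented for illustration; tools like Alteryx Designer wrap this kind of work in a visual, reproducible workflow.

```python
import csv
import io
import sqlite3

# Source 1: a CSV extract of encounter counts (e.g. exported from an EHR)
csv_extract = io.StringIO("patient_id,encounters\nP1,4\nP2,1\n")
encounters = {row["patient_id"]: int(row["encounters"])
              for row in csv.DictReader(csv_extract)}

# Source 2: a claims table in a SQL database (an in-memory stand-in here)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE claims (patient_id TEXT, paid_amount REAL)")
db.executemany("INSERT INTO claims VALUES (?, ?)",
               [("P1", 1200.0), ("P2", 300.0)])

# Blend: join both sources on patient_id into one record per patient
blended = [
    {"patient_id": pid, "encounters": encounters.get(pid, 0), "paid_amount": paid}
    for pid, paid in db.execute(
        "SELECT patient_id, SUM(paid_amount) FROM claims GROUP BY patient_id")
]
print(blended)
```

A real pipeline would add validation and incremental loads, but the shape – extract, standardize keys, join – is the same.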
Get leadership buy-in – Many people naturally resist change. Ensuring that leadership across the enterprise is committed to the change enables coherent messaging to all stakeholders. There are many strategies for achieving leadership buy-in; one notable framework is the ADKAR model.
Choose an effective partner – Especially for mission-critical or strategically sensitive projects, external help is critical. Talented consultants can fill staff skill shortages and bring critical experience (and lessons learned from other projects).
Be there and integrate training creatively – Project leaders should spend time onsite at the various locations where work occurs to ensure proper training and data conversion. During training, don’t just rely on classroom style training; rather, sit down with users and work through actual day-to-day problems. Consider also setting up open office hours where super users or hired technology partners can guide users through specific day-to-day processes.
Train super users – A successful analytics system empowers users – especially key super users – to work in the application on their own rather than depend on report requests to an analytics department.
Be honest and use humor – Humor can help convince people to give a new system a chance, and honesty builds rapport within an organization, especially during a challenging project. If you are converting from a legacy analytics system to a new one, empathize with users: they have been doing their work on the old system for years, and their apprehension is natural.
Make friends with problem persons, but acknowledge that not everyone will accept change – Try working alongside so-called problem persons. It will help you, as a project leader, determine why they are negative, and it shows that you are empathetic to their concerns and personally invested in their successful transition. Note, however, that a minority of users will refuse to accept the change. For the project to succeed, it may be necessary to move on and hope that they come around once the project proves itself.
Be a warrior and ignore borders – Sometimes it is important to put a stop to delaying tactics, such as an abundance of meetings, and simply move forward. The same assertiveness must be used to modify the scope of the project when necessary to keep the organization functioning.
Guest post by Abhinav Shashank, CEO and co-founder, Innovaccer.
The world of healthcare analytics is vast and encompasses a wide range of data with incredible potential to tell stories about health and healthcare delivery, from individual patients to entire populations. Having numbers and easy-to-use visualizations at hand gives providers and caregivers the power not only to look into the lives of individual patients but also to track the ongoing activities in their organizations. Simply showing visualizations is not enough, however; to fully realize their value, healthcare organizations have to take a few steps beyond basic graphs.
The Case for Data Visualization
In the words of Edward O. Wilson, the father of sociobiology:
“You teach me, I forget.
You show me, I remember.
You involve me, I understand.”
Healthcare providers have to deal with many disparate data sources: EHRs, departmental data, claims data, resource utilization, administrative data, etc. Consolidating the data and laying it out in a visually adaptive manner offers a more agile approach to managing complex population health data.
Data visualization was developed to make it easier to gain actionable insights from volumes of information and to improve health programs, clinical healthcare delivery, and post-episode care management. Visualization provides real value in learning from disparate data sources, finding outliers, bringing hidden trends to the fore, and delivering better health outcomes.
Streamlining Different Data Sources into a Single Source of Truth
Since the data pertaining to a patient’s health comes in from various sources, it is vital to pool all the data sets and obtain an aggregated, standard format of data every authorized person can view and manipulate.
Data in the healthcare industry can broadly be categorized into two sources:
Claims data: comes from payers and contains highly uniform, regularly updated data about the care patients receive and how they are billed for it. This data is usually structured and contains the information required for provider reimbursement.
Clinical data: comes from the providers' end and contains valuable information about diagnoses, treatments, and medical history. While this data often isn't structured, it incorporates data elements critical to analyzing a patient's health over time.
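Pooling these two categories into a single patient view can be sketched simply: key every record on a shared patient identifier and collect both feeds under it. The field names and records below are illustrative, not a real data standard.

```python
# Hypothetical claims feed (payer side): structured billing records
claims = [
    {"patient_id": "P1", "code": "E11.9", "paid": 540.0},
]

# Hypothetical clinical feed (provider side): chart-derived data
clinical = [
    {"patient_id": "P1", "diagnosis": "type 2 diabetes", "a1c": 7.9},
]

def single_source_of_truth(claims, clinical):
    """Aggregate both feeds into one record per patient, keyed on patient_id."""
    patients = {}
    for rec in claims:
        patients.setdefault(rec["patient_id"],
                            {"claims": [], "clinical": []})["claims"].append(rec)
    for rec in clinical:
        patients.setdefault(rec["patient_id"],
                            {"claims": [], "clinical": []})["clinical"].append(rec)
    return patients

view = single_source_of_truth(claims, clinical)
print(view["P1"])
```

In practice the hard part is identity matching and normalizing codes across feeds, but the end state is the same: one aggregated record any authorized user can view.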
Fine-tuning Real-Time Visualization
The amount of data healthcare institutions aggregate is enormous: by 2012, it was estimated at a whopping 150 exabytes (150 billion gigabytes) and growing at a rate of 48 percent per year. As the volume grows, healthcare organizations need state-of-the-art, real-time analytical capabilities to improve care quality and effectiveness. Real-time analytics can turn the tables in more ways than one:
Monitoring end-to-end care delivery across a wide range of facilities.
Observing the progress of clinical decision support systems.
Identifying overhead cost drivers and detecting care or documentation gaps.
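The growth figures quoted above compound quickly; a simple sketch of the projection, using the cited 2012 baseline and 48 percent annual rate as assumptions:

```python
def projected_volume(year, base_eb=150, base_year=2012, growth=0.48):
    """Exabytes of healthcare data, assuming steady 48% annual growth."""
    return base_eb * (1 + growth) ** (year - base_year)

# By 2020 the same trend would imply roughly 3,450 EB -- over 20x the 2012 estimate
print(round(projected_volume(2020)))
```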
Since data visualization offers a great advantage in understanding the goings-on in an organization in real time, here are some key elements that count as best practices for data visualization:
Customized reports: Each set of users in healthcare requires different metrics and different orders. Offering customized reports with specific visualization provides actionable insights and can answer specific questions about risks, rewards, and success of the organization.
Visually adaptive: Data presented on dashboards has to be complete, with functional and visual features that improve cognition and speed interpretation. Data presented in a color-coded manner gives physicians functional cues and real-time alerts.
Create actionable insights: A dashboard or other visualization tool will present the data, but unless someone acts on it, critical findings can go unnoticed. Users should be taught how to review the dashboard, drill down to each underlying level, and initiate corrective actions.
The end user's ultimate need: It is paramount that end users can communicate their needs, and even more important that their requirements and performance indicators are incorporated well before the report is structured.
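As a minimal sketch of the color-coded, visually adaptive cues described above, a dashboard can map each quality metric to a color band. The thresholds here are invented for illustration; real cutoffs would come from the organization's own performance indicators.

```python
def color_code(value, warn=0.85, good=0.95):
    """Classify a quality metric (0-1 compliance rate) into a dashboard color."""
    if value >= good:
        return "green"     # on target
    if value >= warn:
        return "yellow"    # watch: trending toward the warning threshold
    return "red"           # below threshold: flag for corrective action

print(color_code(0.97), color_code(0.90), color_code(0.70))  # green yellow red
```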
Wrap-up with Healthcare IT
By leveraging healthcare IT, organizations can get their hands on simple but effective visualizations and surface additional, important information that might be difficult to notice in tabular format. Here are some ways healthcare IT can drive real-time data visualization to success:
Immediate access and sharing: Putting bidirectional interoperability to use, providers can access and share relevant data across the network, despite technological barriers.
Clear data visualization: Graphic, color-coded cues help physicians swiftly learn about the areas that need performance improvement or track the growth their organization is making.
Drilling down: To learn the reason behind a particular shortfall, physicians can always drill down and narrow their area of focus to pinpoint the anomaly and take quick remedial action.
Driving Value with Visualization
With healthcare IT now an integral part of the value-based care system, there is little doubt that convenient, real-time data visualization will be heavily used to achieve positive health outcomes. Combining real-time data with advanced analytics will completely reshape how healthcare IT can improve clinical and operational outcomes. Once physicians move away from long, incomprehensible data flows and find an alternative that helps them instinctively read, isolate, and act upon insights, we will be one step closer to data-driven, value-based care.
The Affordable Care Act (ACA) produced a wealth of data from its first two years in operation. Health actuaries voraciously consumed that data, using predictive modeling techniques to solve healthcare industry problems that have never been seen before. While we don’t yet know how the ACA may change, I know actuaries will find solutions, because we thrive in the realm of the uncertain.
Actuaries have always been in the business of data. Centuries ago the work involved scanning clerical ledgers to create the first mortality tables. Today, human activity, including healthcare, is far more complex. Every two days, we create more data than was created from the dawn of civilization through the year 2000.
A significant portion of my recent work has involved studying ACA data, particularly deconstructing a health plan’s performance using the prism of risk adjustment.
Risk adjustment used to be a niche on the spectrum of a healthcare actuary's work. However, since the ACA risk adjustment program is now a permanent fixture – for the time being – in the commercial individual and small group markets, it is the focus of many actuaries' everyday work. Risk adjustment involves adjusting a health plan's revenue based on a measure of the morbidity of the plan's average enrolled member. It aims to mitigate incentives to select low-risk populations and instead refocus competition on other factors such as quality, efficiency, and benefits delivered.
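In simplified form, the mechanic works like a budget-neutral transfer: plans whose enrollees are sicker than the market average receive funds from plans whose enrollees are healthier. A toy sketch follows; the actual HHS transfer formula has many more terms, and the risk scores and premium below are invented for illustration.

```python
def risk_transfer(plan_risk, market_avg_risk, avg_premium):
    """Per-member transfer: positive = plan receives funds, negative = plan pays."""
    return (plan_risk / market_avg_risk - 1) * avg_premium

# A plan enrolling a sicker-than-average population receives a transfer...
print(round(risk_transfer(1.20, 1.00, 400.0), 2))   # 80.0 per member
# ...funded by a plan enrolling a healthier-than-average population
print(round(risk_transfer(0.80, 1.00, 400.0), 2))   # -80.0
```

This is why, as the next paragraph notes, a sicker and costlier membership no longer automatically threatens a plan's viability: higher morbidity brings higher risk-adjusted revenue.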
The program presents a great opportunity for actuaries to apply predictive modeling concepts on large scale data to deliver actionable insights to clients and employers. From the predictive modeling work, actuaries have learned that risk adjustment renders seemingly intuitive notions of health plan performance and profitability rather meaningless. For example, sicker and costlier individuals may have threatened a health plan’s viability in the past. But that may not necessarily be the case going forward.
As the healthcare industry becomes simultaneously more patient-centered and more performance-oriented, healthcare organizations and biotech companies alike are taking a closer look at how they can improve clinical quality measures. Although the industry has been widely criticized for a lack of meaningful, uniform standards, there is no denying the link between understanding clinical effectiveness and improving overall patient outcomes. To truly assess quality, organizations need to make sense of the wealth of real-world evidence (RWE) data they already have at their fingertips.
RWE data enables a comprehensive understanding of physician utilization patterns, patient treatment options, comparative drug effectiveness, and more. However, the typical current approach to RWE – a vast array of siloed, services-dependent databases with access restricted to just two or three "power users" – has proven utterly ineffective. In fact, market estimates suggest big pharma spends $20 million on average annually on RWE, yet companies are still no closer to fully understanding the real-world impact of pharmacologic and non-pharmacologic treatment on patients.
The problem is not a lack of data, but rather an inability to access RWE data quickly by the very people who are best suited to make sense of the information. Current strategies and tools simply cannot access, analyze, and deliver insights quickly enough for the information to be of use to the organization. However, new approaches to data analytics are ready to eliminate these historical roadblocks and transform RWE data into meaningful insights that can help measure clinical quality effectiveness.
Leveraging cloud-based analytics is one such approach. These solutions are increasingly becoming a critical tool to uncover how quality care initiatives are progressing. Unlike tools of the past, cloud-based offerings can provide rapid access to the data and derived insights in the language that resonates most when measuring quality. For instance, delivery via the cloud enables the real-time scalability necessary for RWE data. As the variety, volume and velocity of RWE data continues to increase, on-premises solutions simply cannot scale quickly enough to contend with terabytes of data and the analytic demands of its users.
Guest post by Nora Lissy, RN, BSN, MBA, director of healthcare information, Dimensional Insight.
A recent report from Research and Markets predicts that the healthcare data analytics sector will grow to more than $34.27 billion by the end of 2022. This is indicative of how hospitals and health systems are realizing the intrinsic value of an analytics capability – which can be leveraged for everything from capturing information to interpreting the data – to make more informed care decisions. From a provider standpoint, however, many physicians still struggle to turn data insights into actionable care improvements. For example: looking at a data set of former pharmaceutical plans for patients with asthma and using that information to make a more informed prescribing decision for a current patient.
So what can healthcare organizations do to help bridge the divide between the clinical staff and the IT department to make it easier for doctors and clinicians to see how analytics can be applied in their day-to-day care routine? To start, they need to identify which members of their clinical staff have a “data mind” and can easily see how data can be turned into care improvement. For example, looking at an analysis of a hospital’s patient care transitions and adjusting patient handoffs to be more streamlined across departments. A person in this role can communicate to both the clinical and IT sides of the house why data needs to be presented in a certain way and where care adjustments and enhancements can be made.
So how can you find this diamond in the rough – the person who can turn the provider "Medical Minds" of today into the "Data Docs" of tomorrow? Here are three tips to help determine who at your organization would best fill these shoes:
Who is your organization’s “go-to?”
Who is the one person in your organization or department whom everyone goes to with questions? It can be anything from a question about a schedule change or a process to a new patient's medication history. In many cases, this doesn't even need to be someone from the clinical side or the IT side. It simply should be someone who has a global view of the organization, is familiar with the clinical side, and understands what needs to happen on the technical side.
Who has good business intuition?
Someone with a natural knack for business also typically has a data-oriented mindset. This is someone who is not afraid to question the reasoning behind certain recommendations and processes – not to be counterproductive, but to solve problems. Much like the organization's go-to contact, this person sees the full organizational picture rather than just the lens of their own department, and has the ability to translate business and operational needs into technical needs.
The health IT revolution is here and 2016 will be the year that actionable data brings it full circle.
Opportunities to achieve meaningful use with electronic health records (EHRs) are available and many healthcare organizations have already realized elevated care coordination with healthcare IT. However, improved care coordination is only a small piece of HIT’s full potential to produce a higher level synthesis of information that delivers actionable data to clinicians. As the healthcare industry transitions to a value-based model in which organizations are compensated not for services performed but for keeping patients and populations well, achieving a higher level of operational efficiency is what patient care requires and what executives expect to receive from their EHR investment.
This approach emphasizes outcomes and value rather than procedures and fees, incentivizing providers to improve efficiency by better managing their populations. Garnering actionable insights for frontline clinicians through an evolved EHR framework is the unified responsibility of EHR providers, IT professionals and care coordination managers – and a task that will monopolize HIT in 2016.
The data void in historical EHR concepts
Traditionally, care has been based on the “inside the four walls” EHR, which means insights are derived from limited data, and next steps are determined by what the patient’s problem is today or what they choose to communicate to their caregiver. If outside information is available from clinical and claims data, it is sparse and often inaccessible to the caregiver. This presents an unavoidable need to make clinical information actionable by readily transforming operational and care data that’s housed in care management tools into usable insights for care delivery and improvement. Likewise, when care management tools are armed with indicators of care gaps, they can do a better job at highlighting those patients during the care process, and feeding care activities to analytics appropriately tagged with metadata or other enhanced information to enrich further analysis.
Filling the gaps to achieve actionable data
To deliver actionable data in a clinical context, HIT platform advancements must integrate and analyze data from across the community—including medical, behavioral and social information—to provide the big picture of patient and population health. Further, the operational information about moving a patient through the care process (e.g., outreach, education, arranging a ride, etc.) is vital to tuning care delivery as a holistic system rather than just optimizing the points of care alone. This innovative approach consolidates diverse and fragmented data in a single comprehensive care plan, with meaningful insights that empower the full spectrum of care, from clinical providers (e.g., physicians, nurses, behavioral health professionals, staff) and non-clinical providers (e.g., care managers, case managers, social workers) to patients and their caregivers.
Guest post by Mohd Haque, vice president and global business head, healthcare, Wipro Technologies.
Population health management (PHM) isn't just the latest buzzword, or another initiative mandated by the Affordable Care Act. Implementing a successful PHM program requires a complete shift in mindset from volume-based healthcare to value- and outcome-based care. PHM can't be something that your healthcare facility "does"; it must become the cornerstone of everything related to how your facility practices medicine.
Although the shift in perspective is the first step, it is essential to arm yourself with population health management IT tools as well. According to the 26th Annual HIMSS Study, about half of respondents (51 percent) reported improving PHM through IT tools, while only 38 percent said their organization was using dedicated population health management tools.
By using big data analytics, EHR integration, IT infrastructure, and security as part of a PHM program, providers can ensure that patients who need high levels of care aren't overlooked and that lower-risk patients don't receive unnecessary care. This in turn increases quality of care while saving money on interventions for low-risk patients.
What are the Components of an Effective PHM Program?
Since PHM is such a large shift, it is important to know exactly how to go about creating an environment that focuses on outcomes instead of volume. The Population Health Alliance recommends the following four components of a PHM program:
Assessment – Evaluate each patient’s health and assign patients to a risk group (high to low)
Stratification – Apply the same baseline interventions to everyone in the same risk group
Person-Centered Intervention – Provide interventions based on each specific patient’s needs, including community health research
Impact Evaluation – Determine the impact of interventions for each risk group as well as each individual patient
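The assessment and stratification steps above can be sketched in a few lines: score each patient's health risk, then bucket patients into risk groups so interventions can be matched to need. The scoring cutoffs and patient scores below are invented for illustration.

```python
def risk_group(score, high=0.7, medium=0.4):
    """Map an assessed risk score (0-1) to a risk group."""
    if score >= high:
        return "high"
    if score >= medium:
        return "medium"
    return "low"

# Hypothetical assessed risk scores from the assessment step
patients = {"P1": 0.82, "P2": 0.55, "P3": 0.10}

# Stratification: group patients by risk level
stratified = {}
for pid, score in patients.items():
    stratified.setdefault(risk_group(score), []).append(pid)
print(stratified)
```

Impact evaluation would then compare outcomes per group (and per patient) against this baseline stratification over time.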
However, you can’t simply change the process without changing how each person on the team views healthcare and their patients. It must be a fundamental shift in your facility from the receptionist to the department chief.
GenoSpace has built a comprehensive platform for genomic and other biomedical data. It serves research, clinical development, pathology, and clinical care customers who work with high-dimensional genomic and other biomedical data.
At GenoSpace, we are digital architects of genomic medicine. GenoSpace has built a comprehensive platform for precision medicine to enable interpretation, analysis, reporting and collaboration on high-dimensional genomic and other biomedical data. With specific applications supporting research, development, pathology and clinical care, many of the most advanced precision medicine organizations are powered by GenoSpace.
GenoSpace has built a comprehensive software platform for genomic medicine. The company leverages a single storage and security platform to deliver a variety of applications and services including reporting, population analytics, clinical trial matching, interactive physician and patient portals, and patient communities. GenoSpace specializes in data integration, modeling, analysis, interpretation, visualization and collaboration. GenoSpace applications serve users at laboratories, health systems, and organizations engaged in research and clinical development, as well as patients.
GenoSpace was founded by John Quackenbush and Mick Correll in 2012 after they realized existing platforms for analyzing genomic information could be greatly improved. John is board chair and Mick is chief executive officer.
John received his PhD in theoretical physics from UCLA in 1990. Following a physics postdoc, he was awarded a 1992 NIH Special Emphasis Research Career Award to work on the Human Genome Project. After two years at the Salk Institute and two years at Stanford University, he moved to The Institute for Genomic Research, pioneering gene expression analysis. He joined the Dana-Farber Cancer Institute and Harvard School of Public Health in 2005 and works on reconstruction of gene networks that drive the development of disease. He received a 2013 White House Open Science Champion of Change award for his work on facilitating genomic data access.
Mick’s more than 15-year career has tracked the path of genomics from basic research to clinical care. Over this time he has held leadership positions in academia and industry, and has developed informatics solutions for pharma/biotech R&D, ag-bio, and academic, government and community healthcare providers. Prior to launching GenoSpace, he and partner John Quackenbush established the Center for Cancer Computational Biology at the Dana-Farber Cancer Institute. Mick began his career at Lion Bioscience Research Inc. and UK-based informatics provider InforSense. Mick earned a BS in Computer Science and BA in Molecular Biology from the University of Colorado at Boulder.
Market opportunity
The market for interpreting and analyzing molecular and other biomedical data is a multi-billion-dollar component of the broader molecular diagnostics and applicable life science research markets – each of which represents tens of billions of dollars per year. GenoSpace's target customers are innovators in lab medicine, hospitals and health systems, and research and clinical development organizations. While there are several competitors for individual offerings provided by GenoSpace, the company is unmatched in its cloud-based architecture, comprehensive suite of offerings, and experience delivering those offerings to real customers.