
The Health Data Life Cycle: 7 Key Stages To Success In Value-Based Care

By Richard A. Royer, chief executive officer, Primaris.


Back in the day – the late 1960s, when social norms and the face of America were rapidly changing – a familiar public service announcement began preceding the nightly newscast: “It’s 10 p.m. Do you know where your children are?”

Today, as the healthcare landscape changes rapidly with a seismic shift from the fee-for-service payment model to value-based care models, there’s a similar but new clarion call for quality healthcare: “It’s 2018. Do you know where your data is?”

Compliance with the increasingly complex alphabet soup of quality reporting and reimbursement rules – indeed, the fuel for the engine driving value-based care – is strongly dependent on data. The promising benefits of the age of digital health, from electronic health records (EHRs) to wearable technology and other bells and whistles, will be realized only through accurate, reliable, actionable data. Providers and healthcare systems that master the data and then use it to improve quality of care for better population health at lower cost will benefit from financial incentives. Those who do not connect their data to quality improvement will suffer the consequences.

As for the alphabet soup? For starters, we’re as familiar now with these acronyms as we are with our own birth dates: MACRA (the Medicare Access and CHIP Reauthorization Act of 2015), which created the QPP (Quality Payment Program), which birthed MIPS (Merit-based Incentive Payment System).

The colorful acronyms are deeply rooted in data. As a result, understanding the data life cycle of quality reporting for MACRA and MIPS, along with myriad registries, core measures, and others, is crucial for both compliance and optimal reimbursement. There is a lot at stake. The Hospital Readmissions Reduction Program (HRRP), for example, has changed how hospitals manage their patients: for fiscal year 2017, around half of the hospitals in the United States were dinged with readmission penalties, costing them an estimated $528 million.

The key to achieving new financial incentives (with red-ink consequences increasingly in play) is data that is reliable, accurate and actionable. Now, more than ever, it is crucial to understand the data life cycle and how it affects healthcare organizations. The list below varies slightly in order and emphasis compared with other data life cycle charts.

One additional stage, a combination of several, is to secure, manage, and maintain the data.

Find the data. Where is it located? Paper charts? Electronic health records (EHRs)? Claims systems? Revenue cycle systems? And how many different EHRs are used by providers — from radiology to labs to primary care or specialists’ offices to others providing care? This step is even more crucial now as providers locate the sources of data required for quality and other reporting.
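To make that inventory concrete, here is a minimal sketch of how an organization might catalog its data sources and flag which ones will need manual work; the source names and fields are illustrative assumptions, not a standard schema.

```python
# Hypothetical inventory of where quality-reporting data lives.
# Source names and fields below are illustrative assumptions only.
DATA_SOURCES = {
    "primary_care_ehr": {"format": "electronic", "contains": ["diagnoses", "vitals", "clinician notes"]},
    "radiology_ehr":    {"format": "electronic", "contains": ["imaging reports"]},
    "lab_system":       {"format": "electronic", "contains": ["test results"]},
    "claims_system":    {"format": "electronic", "contains": ["billing codes", "encounters"]},
    "paper_charts":     {"format": "paper",      "contains": ["legacy visit notes"]},
}

# Sources that cannot feed reporting electronically and will need manual abstraction.
manual_sources = [name for name, info in DATA_SOURCES.items() if info["format"] == "paper"]
print(manual_sources)  # ['paper_charts']
```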

Capture the data. Some data will be available electronically, some can be acquired electronically, but some will require manual abstraction. If a provider, health system or accountable care organization (ACO) outsources that important work, it is imperative that the abstraction partner understand how to get into each EHR or paper-recording system.

And there is structured and unstructured data. A structured item in the EHR like a check box or treatment/diagnosis code can be captured electronically, but a qualitative clinician note must be abstracted manually. A patient presenting with frequent headaches will have details noted on a chart that might be digitally extracted, but the clinician’s note, “Patient was tense because of job situation,” requires manual retrieval.
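As a rough illustration of that distinction, the sketch below captures the structured fields directly and routes the free-text note to a manual abstraction queue. The record layout and field names are assumptions made for the example, not a real EHR schema.

```python
# Illustrative only: a simplified patient record mixing structured and unstructured data.
record = {
    "patient_id": "12345",
    "diagnosis_code": "G43.909",        # structured: a coded diagnosis, captured electronically
    "headache_frequency_per_week": 4,   # structured: a discrete numeric field
    "clinician_note": "Patient was tense because of job situation.",  # unstructured free text
}

STRUCTURED_FIELDS = {"diagnosis_code", "headache_frequency_per_week"}

# Structured items can be pulled electronically as-is.
captured = {field: value for field, value in record.items() if field in STRUCTURED_FIELDS}

# Free-text notes go to a manual abstraction queue instead.
abstraction_queue = [(record["patient_id"], record["clinician_note"])]
```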

Normalize the data. Normalization ensures the data is more than a number or a note: it becomes meaningful information that can form the basis for action. One simple example of normalizing data is reconciling formats, such as reconciling a form that lists patients’ last names first with a chart that lists patients’ first names first. Are we abstracting data for “Doe, John O.” or “John O. Doe”? Different EHR and other systems will have different ways of recording that information.

Normalization ensures that information is used in the same way across systems and makes it unambiguous. The accuracy and reliability that result from normalization are of paramount importance.
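A minimal sketch of that kind of reconciliation, assuming one system records names as “Last, First M.” and another as “First M. Last”; real matching logic would need to handle far more variation.

```python
def normalize_name(raw: str) -> tuple[str, str]:
    """Return (last_name, given_names) from either "Doe, John O." or "John O. Doe"."""
    if "," in raw:
        # Already in "Last, First M." form.
        last, given = (part.strip() for part in raw.split(",", 1))
    else:
        # "First M. Last" form: treat the final token as the last name.
        *given_parts, last = raw.split()
        given = " ".join(given_parts)
    return last, given

# Both formats normalize to the same key, so records can be matched across systems.
assert normalize_name("Doe, John O.") == normalize_name("John O. Doe") == ("Doe", "John O.")
```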

Aggregate the data. This step is crucial for value-based care because it consolidates the data from individual patients into groups or pools of patients. For example, if there is a pool of 100,000 lives, we can list ages, diagnoses, tests, clinical protocols, and outcomes for each patient. Aggregating the data is necessary before healthcare providers can analyze the overall impact and performance of the whole pool.

If a healthcare organization has quality and cost responsibilities for a pool of patients, it must be able to identify the patients who most affect the pool’s risk. Aggregation and analysis provide that opportunity.
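As a minimal sketch of that rollup, the example below aggregates a handful of illustrative patient records into pool-level figures and flags the high-risk patients; the fields, values, and thresholds are assumptions for illustration, not a defined quality measure.

```python
from statistics import mean

# Illustrative patient-level records (a real pool would hold, say, 100,000 lives).
patients = [
    {"id": "A", "age": 67, "diagnosis": "heart failure", "readmitted_30d": True,  "cost": 18200},
    {"id": "B", "age": 54, "diagnosis": "diabetes",      "readmitted_30d": False, "cost": 6400},
    {"id": "C", "age": 71, "diagnosis": "heart failure", "readmitted_30d": False, "cost": 9100},
]

# Pool-level view that no single chart can show on its own.
pool_summary = {
    "patients": len(patients),
    "average_age": mean(p["age"] for p in patients),
    "readmission_rate": sum(p["readmitted_30d"] for p in patients) / len(patients),
    "average_cost": mean(p["cost"] for p in patients),
}

# Patients whose outcomes and costs most affect the pool's overall risk.
high_risk = [p["id"] for p in patients if p["readmitted_30d"] or p["cost"] > 15000]
```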
