Releasing the Power of Big Data through Proper De-Identification

Guest post by Lucy Doyle, Ph.D., vice president, data protection, information security and risk management, McKesson, and Karen Smith, J.D., CHC, senior director, privacy and data protection, McKesson.

Today there are opportunities and initiatives to use big data to improve patient care, reduce costs and optimize performance, but there are challenges that must be met. Providers still have disparate systems, non-standard data, interoperability issues and legacy data silos, as well as the implementation of newer technologies. High data quality is critical, especially since the information may be used to support healthcare operations and patient care. The integration of privacy and security controls to support safe data handling practices is paramount.

Meeting these challenges will require continued implementation of data standards, processes, and policies across the industry. Data protection and accurate applications of de-identification methods are needed.

Empowering Data Through Proper De-Identification

Healthcare privacy and security professionals field requests to use patient data for a variety of use cases, including research, marketing, outcomes analysis and analytics for industry stakeholders. The HIPAA Privacy Rule established standards to protect individuals’ individually identifiable health information, requiring safeguards to shield the information and setting limits and conditions on the uses and disclosures that may be made. It also defined two methods to de-identify data, offering a means to free valuable de-identified, patient-level information for a variety of important uses.

Depending on the methodology used and how it is applied, de-identification can yield high-quality, highly usable data, making it a valuable asset to the organization. One of the HIPAA-approved methods is the Safe Harbor method, which requires removal of 18 specified identifiers of protected health information relating to the individual or to the individual’s relatives, employers or household members. The 18th element requires removal of any other unique characteristic or code that could identify the individual who is the subject of the information. While determining that the Safe Harbor criteria have been met appears fairly straightforward, doing it properly requires a thorough understanding of how to address certain components, which can be quite complex.
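
To make the mechanics concrete, here is a minimal sketch in Python of the kind of transformation Safe Harbor requires. The field names, the "_date" suffix convention and the helper itself are illustrative assumptions, not a schema from any real system and not a complete or compliant implementation:

```python
# Toy sketch of Safe Harbor-style redaction; field names are assumptions.
DIRECT_IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "mrn", "health_plan_id", "account_number", "license_number",
    "vehicle_id", "device_id", "url", "ip_address",
}

def safe_harbor_redact(record: dict) -> dict:
    """Return a copy of `record` with Safe Harbor-style identifiers removed."""
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIER_FIELDS:
            continue                      # drop direct identifiers entirely
        if field.endswith("_date"):
            out[field] = str(value)[:4]   # keep year only: "2015-03-14" -> "2015"
        elif field == "zip":
            # first three digits only; sparsely populated ZIP3 areas
            # (under 20,000 people) must instead become "000"
            out[field] = str(value)[:3]
        elif field == "age":
            # ages 90 and over must be aggregated into one category
            out[field] = "90+" if int(value) >= 90 else value
        else:
            out[field] = value
    return out

print(safe_harbor_redact({
    "name": "Jane Doe", "zip": "02139", "admit_date": "2015-03-14", "age": 93,
    "diagnosis": "J45.40",
}))
# -> {'zip': '021', 'admit_date': '2015', 'age': '90+', 'diagnosis': 'J45.40'}
```

Even this toy version hints at the complexity: dates, ZIP codes and ages each carry their own rules, and the catch-all 18th element cannot be handled by field rules at all, since any free-text note might contain a uniquely identifying detail.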

The second de-identification method is the expert determination method, in which a highly skilled specialist applies statistical and scientific principles to determine that the risk of re-identifying an individual from the information is very small.
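
Experts draw on a range of statistical techniques; one commonly used measure is k-anonymity, sketched below in Python. The quasi-identifier column names are assumptions for illustration, and this shows only one input to an expert’s risk analysis, not the expert determination process itself:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers=("birth_year", "zip3", "sex")):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# If the rarest combination of quasi-identifiers is shared by k people, an
# attacker matching on those fields narrows any one of them to a 1-in-k guess.
records = [
    {"birth_year": 1950, "zip3": "021", "sex": "F"},
    {"birth_year": 1950, "zip3": "021", "sex": "F"},
    {"birth_year": 1962, "zip3": "021", "sex": "M"},
]
k = k_anonymity(records)
print(f"k = {k}, worst-case re-identification risk ~ {1 / k:.0%}")
# k = 1 here: the 1962/021/M record is unique and would need generalization
```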

We need to encourage and support educational initiatives within our industry so more individuals become proficient in these complex techniques. At McKesson, we are educating our business units so employees can better understand and embrace de-identification and the value it can provide. This training gives them a basic understanding of how to identify and manage risks as well as how to ensure they are getting quality content.

Embracing Social Media and New and Improved Technologies

One of the challenges we face today in de-identifying data is adapting our mindset and methodologies to emerging technologies and the widespread adoption of social media. It is crucial to understand how released data could be exposed by being combined with other publicly available data. New standards are needed.

Closing Thoughts

While de-identifying data can be challenging and complex, the task is made easier when we remember and adhere to our core directive to safeguard data. With this in mind, incorporating new technologies becomes part of an ongoing process of review.

When done properly, de-identification enables high quality, usable data, particularly when the expert method is used. De-identification should not be viewed as an obstacle to data usage, but rather as a powerful enabler that opens the door to a wealth of valuable information.

The Comprehensive ESRD Care Model: The First Disease-Specific ACO Program

Guest post by Ken Perez, vice president of healthcare policy, Omnicell.

Under the authority of Section 3021 of the Affordable Care Act (ACA), the Centers for Medicare and Medicaid Services (CMS) has launched a variety of accountable care organization (ACO) initiatives, including the Pioneer ACO Model, the Medicare Shared Savings Program (MSSP), the Advance Payment ACO Model, and the Next Generation ACO Model. ACOs continue to be the most aggressive of the healthcare delivery reforms mandated by the ACA.

Notably, none of the aforementioned ACO models has a disease-specific focus. During the past few years, DaVita Inc., the nation’s second-largest dialysis provider, lobbied CMS diligently for a renal-specific ACO or at least the creation of a framework that would allow for a disease-specific approach. DaVita formed the Accountable Kidney Care Collaborative to prepare the nephrology community to participate broadly in general ACOs and/or in disease-specific renal ACOs.

An ACO Program Focused on Renal Disease

On Oct. 7, 2015, the Center for Medicare and Medicaid Innovation (the Innovation Center) made a groundbreaking announcement, launching the Comprehensive ESRD Care (CEC) Model, with its sole focus on end-stage renal disease (ESRD), also known as kidney failure. This disease afflicts more than 600,000 Americans. These individuals require life-sustaining dialysis treatments several times each week. In 2012, ESRD beneficiaries comprised 1.1 percent of the Medicare population and accounted for $26 billion or 5.6 percent of total Medicare spending.

The CEC Model’s first three-year agreement period began on Oct. 1, 2015, with 13 ESCOs in 11 states: Arizona, California, Florida, Illinois, New Jersey, New York, North Carolina, Pennsylvania, South Carolina, Tennessee, and Texas. All except one of the 13 ESCOs are owned by a large dialysis organization (LDO), defined as an organization that owns 200 or more dialysis facilities. Dialysis Clinic, Inc. (DCI), the nation’s largest non-profit dialysis provider, owns three of the ESCOs, as does DaVita. Fresenius, the largest dialysis provider, owns six of the ESCOs. The lone non-LDO is the Rogosin Institute in New York City.

As with all Medicare ACO programs, the CEC Model has both quality measures and expenditure-reduction targets, which together shape the model’s payment arrangements.

Quality Measures

The CEC Model features 26 quality measures—14 outcome and 12 process—for both LDOs and non-LDOs. The quality measures span five domains: patient safety, person- and caregiver-centered experience and outcomes, communication and care coordination, clinical quality of care, and population health.

Transparency, Collaboration, Innovation Key to Achieving Nationwide Interoperability

Guest post by Jitin Asnaani, executive director, CommonWell Health Alliance.

For decades, the use of paper medical records was the “norm” and the sharing of those records with another provider typically involved a photocopier and a briefcase for the patient to carry them to the next doctor. Today, electronic medical records are becoming the standard, but the exchange of health data between disparate networks and software systems has remained elusive.

While some data exchange is taking place in health care today, it’s only occurring in isolated pockets, typically within one region or health system, making it largely ineffective. Solving this challenge will require transparency, collaboration and innovation, attributes that CommonWell Health Alliance embodies.

Transparency across the Industry

Competition in almost every sector thrives on keeping information separate and technologies proprietary. However, in many industries, such as banking, telecom and the internet, working across competitor lines to exchange data has enriched and expanded their reach. Health care needs to take a lesson from these industries.

Working in data silos will not improve the exchange of health data; rather, it will create friction in the industry. Patients expect their doctors to have the information they need to provide them with the best treatment. Doctors struggle to access this important data outside their four walls. The industry has an opportunity to step up and make it possible for providers to access a three-dimensional view of the patient’s health history, and in turn, create a new wave of opportunities for the health IT industry.

Collaboration among Health IT Industry Players

Collaboration throughout the health IT industry is essential to creating a ubiquitous, nationwide, interoperable health IT infrastructure. This focus on infrastructure will drive standards adoption and open the gates to nationwide record sharing. Electronic health record (EHR) vendors offering services across the healthcare continuum are a key piece of this puzzle, which is why CommonWell was formed: to join forces with all health IT stakeholders dedicated to the vision that patient data should be accessible no matter where care occurs.

Collaboration with the public sector is also crucial. The government plays a strong role in narrowing the technical standards used in health IT, but the bar must be raised on leveraging real-world data exchange. Additional ONC activities complement the existing Federal Advisory Committees (FACAs), as noted below:

ResearchKit: A Valuable Tool for Researchers, but with Limitations

Guest post by Kalisha Narine, technical architect, Medullan.

In March 2015, Apple announced the next big thing for the scientific community: ResearchKit. According to Apple, the new framework would help researchers gather more data, more frequently, and more accurately than ever before, by turning the more than 94 million iPhones in use in the U.S. today into a recruitment channel.

In a nutshell, ResearchKit makes it easier for researchers to create iOS apps for their own research, focusing on three key things: consent, surveys and active tasks. ResearchKit provides communication and instruction for the study, in addition to pre-built survey templates that can be used to collect patient-reported outcomes. It can also collect sensor data (objective, patient-activated outcomes) on fitness, voice, steps and more, and it works seamlessly with Apple’s HealthKit API, which many users already have on their devices. This allows researchers to access relevant health and fitness data (passive patient outcomes).

ResearchKit-powered apps like MyHeart Counts, Share the Journey, Asthma Health, GlucoSuccess and mPower have shown us that people want to do their part in advancing medical research by sharing their data with researchers committed to making life-changing discoveries that benefit us all.

Five months after its launch, I’d say, in no exaggerated terms, that ResearchKit has proven to be game-changing for researchers, leapfrogging patient-reported outcome studies into a “mobile first” world. However, the current framework certainly doesn’t cover the full gamut of what is needed to build a patient-centered, engaging, scalable digital outcomes solution. If you’re planning to pilot a solution around ResearchKit, here’s what you need to know:

ResearchKit offers important benefits for medical researchers, especially when it comes to recruitment capability and the speed at which researchers can acquire insightful data to accelerate medical progress.

The MyHeart Counts app has arguably been the most successful use of ResearchKit to date, and a great demonstration of the recruitment capability it provides. In just 24 hours, the researchers behind MyHeart Counts were able to enroll more than 10,000 patients in the study, and they went on to clock an unprecedented 41,000 consented participants in less than six months, even before entering the UK and Hong Kong markets. As most researchers know, recruitment can be one of the biggest challenges in building a study. But with ResearchKit, scientists are able to grow their number of participants into the thousands very quickly; it would have taken the MyHeart Counts researchers a year and 50 medical centers around the country to reach 10,000 participants by traditional means.

ResearchKit also increases the speed at which researchers are able to find the insights they’re looking for. This is mostly because people use their mobile devices constantly (most Americans clock more than two hours per day), which means that mass amounts of subjective (surveys), objective (sensors/active tasks) and passive (background) data accumulate quickly. The Asthma Health app is a great example: it combines data from a phone’s GPS with information about a city’s air quality and a patient’s outcomes data to help patients adhere to their treatment plans and avoid asthma triggers. Study participants told researchers that the app was also helping them better understand and manage their condition, and it is assisting providers in making personalized asthma-care recommendations.
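
As a purely hypothetical illustration of that kind of data combination (not the Asthma Health app’s actual code), an app might pair the phone’s location with an external air-quality feed and warn the patient when a threshold is crossed. The fetch_air_quality_index function and the threshold below are assumptions:

```python
from typing import Optional

# roughly where the US AQI scale turns "unhealthy for sensitive groups"
AQI_ALERT_THRESHOLD = 100

def fetch_air_quality_index(lat: float, lon: float) -> int:
    """Hypothetical client for a city air-quality feed; not a real API."""
    raise NotImplementedError("wire this up to a real air-quality service")

def check_asthma_trigger(lat: float, lon: float) -> Optional[str]:
    """Combine the phone's location with the local AQI and warn if needed."""
    aqi = fetch_air_quality_index(lat, lon)
    if aqi >= AQI_ALERT_THRESHOLD:
        return f"Air quality index is {aqi}; consider limiting outdoor activity."
    return None
```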

The Shift to Patient-Centric Care and the Implications of Value-Based Medicine

Guest post by Brian Irwin, vice president of strategy, SHYFT Analytics.

The days of life science companies focusing on physicians and medical professionals as their main “customers” are numbered. Larger healthcare market trends, better access to real world data and increasing costs are driving the shift to a new customer – the patient.

As total spending on medicines globally reaches the $1 trillion level annually, payers are, not surprisingly, placing greater emphasis on ensuring they are receiving value for their investments. Payers are bringing closer scrutiny to all parts of the healthcare system they are funding. Patients, too, are spending more in today’s system and in turn are changing their expectations about value and outcomes. As a result, we are seeing a new focus on real world evidence as payers and patients seek proof that the medicines are contributing to improved patient outcomes, reduction in hospital admissions or re-admissions, and more efficient use of resources.

In response, life science companies are seeking new and more effective ways to leverage data and analytics across the clinical-commercial continuum and to adapt their go-to-market strategies to reflect their focus on patient outcomes. Capturing results-oriented data and making it usable is critical for this new model to work and for the patient to receive the most appropriate care.

With this shift to outcomes-based results and real world evidence, many questions arise around the data and analytics. Who defines “value” and what does success look like? Is it long-term value or short-term results? Additionally, how should this information be distributed to and evaluated by the various stakeholders? How do we achieve a level of consistency when data sources are not yet fully interoperable?

As the pharma industry starts to work through the answers to these questions and begins to redefine go-to-market strategies and commercial models, effective utilization of data and analytics will prove to be one of the greatest competitive advantages.

Defining Value in the New Healthcare Era

This focus on patient outcomes has broader implications for the industry, with data sources now aligned with the patient to enable decisions across the enterprise, including drug development, market access and commercial performance.

Practice Models Perspective: Concierge, Private Pay, Traditional Fee for Service

The latest report from the Physicians Foundation suggests that the number of physicians working in independent practices continues to decline slightly. Across this and other surveys, about half of physicians are employed and half are owners or co-owners of private practices. Of those who are owners, one survey found that more than 70 percent would prefer not to sell their practice.

Despite this ongoing trend toward employment, many physicians believe it is going to turn around. And if the industry is going to manage costs and improve outcomes, more private practices may be needed. According to an article in the New York Times, providing care in the hospital setting is considerably more expensive than providing it in the smaller practice setting.

One of the ways physicians are finding to stay independent is through the use of private-pay, or membership, models. Recently, the American Academy of Private Physicians and Kareo conducted the largest industry survey to date on physician perspectives on practice models. It showed that about 24 percent of providers have already adopted, fully or in part, a concierge, direct-pay or membership model in their practice, and another 46 percent are considering a similar change in the coming three years.

This infographic highlights some of the other key discoveries made in this industry-first survey.

[Infographic: Practice Model Perspectives]

How Organizations Meet Compliance Demands with Smart Technology

Guest post by Chris Strammiello, vice president of global alliances and strategic marketing, Nuance.

The growing use of smart devices at the point of care exacerbates the dual, yet contradictory, challenges confronting hospital IT directors and compliance officers: Making patients’ health information easier to access and share, while at the same time keeping it more secure.

A major problem is that there are simply too many touch points that can create risk when sharing protected health information (PHI) inside and outside of the hospital. Beyond securing communications on cell phones, tablets and laptops, hospitals must also secure the smart multi-function printers (MFPs) to which these devices send output, because MFPs not only print but also let walk-up users copy, scan, fax and email documents. This functionality is why the Office of the National Coordinator for Health Information Technology now defines MFPs as workstations where PHI must be protected. These protections need to include administrative, physical and technical safeguards that authenticate users, control access to workflows, encrypt data handled on the device and maintain an audit trail of all activity.
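
As a rough sketch of those safeguard categories, here is what authenticating a walk-up user and maintaining a tamper-evident audit trail might look like in Python. This is illustrative only, not any vendor’s product, and it omits the encryption layer entirely:

```python
import hashlib
import json
import time

# Illustrative per-user workflow policy; real systems would tie into the
# hospital's directory and access-control infrastructure.
AUTHORIZED_WORKFLOWS = {"dr_lee": {"print", "scan"}, "front_desk": {"print"}}
audit_log = []  # in practice this lives in a write-once store off the device

def authorize_and_log(user: str, workflow: str, document_id: str) -> bool:
    """Check the walk-up workflow against policy and record the attempt."""
    allowed = workflow in AUTHORIZED_WORKFLOWS.get(user, set())
    entry = {
        "ts": time.time(),
        "user": user,
        "workflow": workflow,           # "print", "copy", "scan", "fax", "email"
        "document_id": document_id,
        "allowed": allowed,
        # chain each entry to a hash of the previous one so tampering shows
        "prev_hash": hashlib.sha256(
            json.dumps(audit_log[-1], sort_keys=True).encode()
        ).hexdigest() if audit_log else None,
    }
    audit_log.append(entry)
    return allowed

print(authorize_and_log("front_desk", "scan", "chart-123"))  # False: denied, but logged
print(authorize_and_log("dr_lee", "scan", "chart-123"))      # True: allowed and logged
```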

Accurate, Effective and Secure Use of Patient Information at Point of Care

Hospitals need to adopt an approach that automatically provides security and control at the smart MFP from which patient information is shared and distributed. This approach must also support the use of mobile computing technologies, which are helping to bring access to patient information and electronic health records (EHRs) to the point of care. Advanced, secure information-technology and output-management solutions can help hospitals achieve HIPAA-compliant use of PHI by adding a layer of automated security and control to both electronic and paper-based processes. These solutions can minimize the manual work and decisions that invite human error, mitigate the risk of non-compliance, and help hospitals avoid the fines, reputation damage and other costs of HIPAA violations and privacy breaches.

With this approach, vulnerabilities in capturing and sharing PHI are reduced through a process that ensures:

How to Avoid an ICD-10 Claims Disaster

Guest post by Michele Hibbert-Iacobacci, CMCO, CCS-P, vice president of information management support, Mitchell International.

Recent ICD-10 end-to-end testing conducted by CMS and the American Medical Association yielded an 87 percent claim acceptance rate. This means that of the 29,286 test claims received, only 25,646 were accepted, leaving some 3,640 rejected. Imagine thousands of claims denied because providers submitted improper codes, stalling the bill-review process and creating pain for everyone involved.

The one-percentage-point drop in claims accepted from the month prior is a poor indication for the months ahead, since 100 percent of the test participants claimed to be fully ready for the October 1 ICD-10 implementation date. It’s only logical that many carriers and providers wonder what will happen to those that aren’t ready.

The end-to-end testing results also raise complicated questions: Which ICD-10 codes will be seen most frequently post-implementation date? And will these codes match what providers will put on the bills they send?

It’s no secret that ICD-9 has far fewer codes than ICD-10, and a situation is simply less complicated with less contextual data to worry about. As a result of the influx of new codes introduced by ICD-10, we can expect to see providers assigning far more codes than necessary in a pin-the-tail-on-the-donkey attempt to choose the right one. For example, there are ten codes for a fracture of the tibia in ICD-10, as opposed to one in ICD-9. So many options may lead a provider to place all ten on a bill to ensure payment is received.

It will be challenging for untrained providers to submit the correct ICD-10 codes, and as a result, productivity will decrease while reimbursement challenges and potential claim denials increase. Carriers, on the other side, will be forced to conduct extensive reviews of each bill to determine the actual cause of injury and the appropriate code.

To handle the huge influx of ICD-10 codes, providers can design a system in which office coders are given quick references to the most prevalent codes used in the practice. Over time, the overall billing experience will improve as coders become more skilled at identifying proper codes and carriers become more precise in reviewing bills. At first, carriers will be tolerant of multiple or vague codes, but with each passing day after ICD-10 implementation they will become increasingly strict. Providers will be required to submit correct coding, provided the added specificity of the classification system accurately describes patient conditions.
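
As a minimal sketch of such a quick reference, assuming the practice can export (code, description) pairs from its billing history, a few lines of Python can rank the codes its coders actually use; the claim layout here is illustrative, not any particular billing system’s format:

```python
from collections import Counter

def build_quick_reference(past_claims, top_n=25):
    """Rank the ICD-10 codes this practice actually uses, most frequent first."""
    past_claims = list(past_claims)        # allow any iterable of pairs
    counts = Counter(code for code, _ in past_claims)
    descriptions = dict(past_claims)       # last description seen wins per code
    return [(code, descriptions[code], n) for code, n in counts.most_common(top_n)]

# Toy example with two real ICD-10 codes:
claims = [
    ("S82.201A", "Unspecified fracture of shaft of right tibia, initial encounter"),
    ("S82.201A", "Unspecified fracture of shaft of right tibia, initial encounter"),
    ("J45.40", "Moderate persistent asthma, uncomplicated"),
]
for code, desc, n in build_quick_reference(claims):
    print(f"{n:>3}x  {code:<10} {desc}")
```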
