Releasing the Power of Big Data through Proper De-Identification

Guest post by Lucy Doyle, Ph.D., vice president, data protection, information security and risk management, McKesson, and Karen Smith, J.D., CHC, senior director, privacy and data protection, McKesson.

Today there are opportunities and initiatives to use big data to improve patient care, reduce costs and optimize performance, but there are challenges that must be met. Providers still contend with disparate systems, non-standard data, interoperability issues and legacy data silos, as well as the challenges of implementing newer technologies. High data quality is critical, especially since the information may be used to support healthcare operations and patient care. The integration of privacy and security controls to support safe data handling practices is paramount.

Meeting these challenges will require continued implementation of data standards, processes and policies across the industry. Data protection and the accurate application of de-identification methods are needed.

Empowering Data Through Proper De-Identification

Healthcare privacy and security professionals field requests to use patient data for a variety of use cases, including research, marketing, outcomes analysis and analytics for industry stakeholders. The HIPAA Privacy Rule established standards to protect individually identifiable health information by requiring safeguards to shield the information and by setting limits and conditions on the uses and disclosures that may be made. It also defined two methods to de-identify data, providing a means to free valuable de-identified, patient-level information for a variety of important uses.

Depending on the methodology used and how it is applied, de-identification yields quality data that is highly usable, making it a valuable asset to the organization. One of the HIPAA-approved methods to de-identify data is the Safe Harbor method. This method requires removal of 18 specified identifiers (protected health information) related to the individual or the individual's relatives, employers or household members. The 18th element requires removal of any other unique characteristic or code that could lead to identifying the individual who is the subject of the information. While determining that the Safe Harbor criteria have been met appears fairly straightforward, doing it properly requires a thorough understanding of how to address certain components, which can be quite complex.
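
To make the mechanics more concrete, here is a minimal, illustrative Python sketch of the kind of field-level scrubbing Safe Harbor implies. The record structure and field names are hypothetical, and a real implementation would have to cover all 18 identifier categories, including identifiers buried in free text.

```python
# Illustrative sketch only: removes a few of the 18 Safe Harbor identifier
# categories from a hypothetical patient record. Field names are assumptions;
# real Safe Harbor compliance must address all 18 categories, including
# identifiers hidden in free-text notes.
import re
from copy import deepcopy

DIRECT_IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "license_number", "device_id", "url", "ip_address",
}

def safe_harbor_scrub(record: dict) -> dict:
    """Return a copy of the record with direct identifiers dropped and
    dates, ages and geography generalized along Safe Harbor lines."""
    out = deepcopy(record)

    # Drop directly identifying fields outright.
    for field in DIRECT_IDENTIFIER_FIELDS:
        out.pop(field, None)

    # Dates: keep only the year (all date elements except year must be removed).
    if "date_of_service" in out:
        out["date_of_service"] = str(out["date_of_service"])[:4]  # "YYYY-MM-DD" -> "YYYY"

    # Ages over 89 must be aggregated into a single "90 or older" category.
    if isinstance(out.get("age"), int) and out["age"] > 89:
        out["age"] = "90+"

    # Geography: truncate ZIP codes to the first three digits (simplified;
    # the rule adds further restrictions for sparsely populated ZIP3 areas).
    if "zip" in out:
        out["zip"] = re.sub(r"^(\d{3})\d{2}$", r"\g<1>**", str(out["zip"]))

    return out

# Example usage with a toy record:
record = {"name": "Jane Doe", "zip": "94110", "age": 93,
          "date_of_service": "2012-06-01", "diagnosis": "I10"}
print(safe_harbor_scrub(record))
# {'zip': '941**', 'age': '90+', 'date_of_service': '2012', 'diagnosis': 'I10'}
```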

The second de-identification method is the expert determination method. It relies on a highly skilled specialist who applies statistical and scientific principles and methods to determine that the risk of re-identification is very small, rendering the information not individually identifiable.
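
As a rough illustration of one kind of analysis an expert might run (a sketch under assumed column names, not the method any particular expert uses), the snippet below measures k-anonymity over a set of hypothetical quasi-identifiers with pandas; records that share their quasi-identifier combination with too few others carry a higher re-identification risk.

```python
# Illustrative sketch of one re-identification risk measure (k-anonymity).
# The quasi-identifier columns and the threshold k are assumptions; a real
# expert determination weighs much more, e.g. what outside data is available.
import pandas as pd

QUASI_IDENTIFIERS = ["zip3", "birth_year", "gender"]  # hypothetical columns

def k_anonymity_report(df: pd.DataFrame, k: int = 5) -> pd.DataFrame:
    """Flag records whose quasi-identifier combination appears fewer than k times."""
    sizes = df.groupby(QUASI_IDENTIFIERS)[QUASI_IDENTIFIERS[0]].transform("size")
    return df.assign(equivalence_class_size=sizes, high_risk=sizes < k)

# Example usage with toy data: the third record is unique on its
# quasi-identifiers and is therefore flagged as high risk for k=2.
toy = pd.DataFrame({
    "zip3": ["941", "941", "100"],
    "birth_year": [1950, 1950, 1987],
    "gender": ["F", "F", "M"],
    "diagnosis": ["I10", "E11", "J45"],
})
print(k_anonymity_report(toy, k=2))
```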

We need to encourage and support educational initiatives within our industry so more individuals become proficient in these complex techniques. At McKesson, we are educating our business units so employees can better understand and embrace de-identification and the value it can provide. This training gives them a basic understanding of how to identify and manage risks as well as how to ensure they are getting quality content.

Embracing Social Media and New and Improved Technologies

One of the challenges we face today in de-identifying data is adapting our mindset and methodologies to account for emerging technologies and the adoption of social media. It is crucial to understand how released data could be exposed by being combined with other available data. New standards are needed.

Closing Thoughts

While de-identifying data can be challenging and complex, the task is made easier when we remember and adhere to our core directive to safeguard data. With this in mind, incorporating new technologies becomes part of an ongoing process of review.

When done properly, de-identification enables high quality, usable data, particularly when the expert method is used. De-identification should not be viewed as an obstacle to data usage, but rather as a powerful enabler that opens the door to a wealth of valuable information.

Electronic Health Records Market to Grow to $17 billion by 2017

The market for electronic health records (EHRs) is set to experience rapid growth over the coming years, with the EMR peer group's value estimated to climb from approximately $10.6 billion in 2012 to $17 billion by 2017, at a compound annual growth rate (CAGR) of 9.8 percent, according to research and consulting firm GlobalData.
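
As a quick back-of-the-envelope check on those figures (my own arithmetic, not part of the GlobalData report), compounding $10.6 billion at roughly 9.8 percent a year over the five years from 2012 to 2017 does land near the projected $17 billion:

```python
# Back-of-the-envelope check of the reported growth figures (not from the report).
start_value, years, cagr = 10.6, 5, 0.098       # $10.6B in 2012, five years to 2017
projected_2017 = start_value * (1 + cagr) ** years
print(round(projected_2017, 1))                 # ~16.9, i.e. roughly $17 billion
```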

The company’s new report estimates that McKesson had the largest healthcare information technology software and services revenue in 2012, with $3,300 million, placing it as the EHR market leader. McKesson is followed by Cerner and Allscripts, which achieved revenues of $2,666 million and $1,477 million, respectively.

According to GlobalData, this rapid EHR market growth is driven by incentives offered under the American Recovery and Reinvestment Act of 2009, which deliver opportunities for providers to transform unstructured, paper-based data into electronic, digitized information that can be shared across the entire care industry.

CommonWell Opens Up Interoperability, or Does It?

If you love drama, there may be no better time than now to be in health IT. Specifically, the drama centers on the CommonWell Health Alliance, a movement spearheaded by vendor giants Allscripts, Athenahealth, Cerner, Greenway and McKesson to promote health information exchange.

However, as we all know, the one giant in the room not to be invited to the dance, Epic, is crying foul.

Why Don’t Vendors Partner to Build Interoperable Systems Before Mandates Force Them To?

In a recent conversation with Steve Ferguson, vice president of Hello Health, he described how the company is identifying new revenue sources for practices while working to engage patients. Even though the company’s business model sets it apart and helps it rival other free EHRs, like Practice Fusion, I left the conversation wondering why more vendors weren’t trying the same thing as Hello Health: trying something no one else in the market is trying, to see if, by chance, a little innovation helps pump some life into the HIT market.

Along the same lines, thousands of others in HIT and I have wondered why systems are not interoperable and, for the most part, operate in silos that are unable to communicate with competing systems.

Certainly, there’s a case to be made for vendors protecting their footprints, and for growing them. In doing so, they like to keep their secrets close; it’s a business environment, after all, and despite the number of conversations taking place by their PR folks, improving patient health outcomes comes in second (or third) to making money.

However, let’s move closer to my point. Given the recent rumors that Cerner and McKesson are working on a joint agreement to enable cross-vendor, national health information exchange, I’m wondering: why don’t other vendors partner now and begin to build interoperable systems?

According to the rumors, the deal, if completed, could shift the entire interoperable landscape for hospitals, physicians and patients. It would position Cerner, which has more EHR users, and McKesson, which has a strong HIE product in RelayHealth with a loyal user base, to take on Epic Systems, a leading EHR vendor.

An announcement is expected at HIMSS13.

Here’s why this is important news: interoperability mandates are coming. Like most things, it’s really just a matter of time. Systems will be forced to communicate with other, competing systems. They should already. Given the levels of reporting required of caregivers, the push for access to information through initiatives like Blue Button and patients’ access to information through mobile technology, it’s actually a bit shocking that there’s not more openness in the market.

The Cerner/McKesson news is incredibly refreshing and worth a look. Two major competitors may be realizing that by partnering they’ll be better able to take on each company’s biggest competitor: Epic.

Imagine connected systems exchanging data. The thought alone would be marketable across several sectors of the healthcare landscape, and the move would be worthy of reams of coverage, which would lead to great brand awareness for each company and the chance to do what all EHR companies aim for: to create thought leaders, to stand out, to set the market on its heels.

If nothing else, the partner vendors would stand ahead of the pack when future interoperability mandates are enacted and would be seen as experts in the exchange game. Tongue in cheek aside, the idea really is a good one and, with no one currently doing it, it’s a great opportunity for a couple of HIT companies to actually move change forward and create an environment where information can be easily exchanged across practices, across specialties and across borders.

Then, perhaps, we’ll see a real commitment to improved patient health outcomes rather than companies simply trying to improve their bottom lines.