Guest post by Lucy Doyle, Ph.D., vice president, data protection, information security and risk management, McKesson, and Karen Smith, J.D., CHC, senior director, privacy and data protection, McKesson.
Today there are opportunities and initiatives to use big data to improve patient care, reduce costs and optimize performance, but there are challenges that must be met. Providers still contend with disparate systems, non-standard data, interoperability issues and legacy data silos, as well as the implementation of newer technologies. High data quality is critical, especially since the information may be used to support healthcare operations and patient care. The integration of privacy and security controls to support safe data handling practices is paramount.
Meeting these challenges will require continued implementation of data standards, processes and policies across the industry. Data protection and the accurate application of de-identification methods are also needed.
Empowering Data Through Proper De-Identification
Healthcare privacy and security professionals field requests to use patient data for a variety of use cases, including research, marketing, outcomes analysis and analytics for industry stakeholders. The HIPAA Privacy Rule established standards to protect individuals’ individually identifiable health information by requiring safeguards to shield the information and by setting limits and conditions on the uses and disclosures that may be made. It also provided two methods to de-identify data, freeing valuable de-identified, patient-level information for a variety of important uses.
Depending on the methodology used and how it is applied, de-identification yields quality data that is highly usable, making it a valuable asset to the organization. One of the HIPAA-approved methods is the Safe Harbor method. It requires removal of 18 specified identifiers of protected health information relating to the individual or the individual’s relatives, employers or household members. The 18th element requires removal of any other unique characteristic or code that could lead to identifying the individual who is the subject of the information. While determining that the Safe Harbor criteria have been met appears fairly straightforward, doing it properly requires a thorough understanding of how to address certain components, which can be quite complex.
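To make the idea concrete, here is a minimal sketch of Safe Harbor-style suppression in Python. The field names are hypothetical, and a real pipeline would have to cover all 18 identifier categories, including identifiers buried in free text; this only illustrates the drop-and-generalize pattern.

```python
# Sketch of Safe Harbor-style suppression: drop fields that map to HIPAA's
# enumerated identifiers and generalize dates and ZIP codes.
# Field names are hypothetical; a real pipeline must cover all 18 categories,
# including free-text fields that may embed identifiers.

SAFE_HARBOR_DROP = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "license_number", "vehicle_id", "device_serial", "url",
    "ip_address", "biometric_id", "photo_id",
}

def deidentify(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        if key in SAFE_HARBOR_DROP:
            continue  # suppress direct identifiers outright
        if key == "zip":
            # Safe Harbor allows only the first three digits of a ZIP code,
            # and only when that three-digit area has > 20,000 residents.
            out[key] = value[:3] + "00"
        elif key == "birth_date":
            # Dates must be reduced to the year; ages 90+ must be grouped.
            out["birth_year"] = value[:4]
        else:
            out[key] = value
    return out

record = {"name": "Jane Doe", "zip": "30301", "birth_date": "1952-07-14",
          "diagnosis": "type 2 diabetes"}
print(deidentify(record))
# {'zip': '30300', 'birth_year': '1952', 'diagnosis': 'type 2 diabetes'}
```

Even this toy version hints at the complexity the authors describe: the 18th category (any other unique identifying characteristic) cannot be handled by a fixed field list at all.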
The second de-identification method is the expert determination method. A highly skilled specialist applies statistical and scientific principles and methods to determine the risk of re-identification and to render the information not individually identifiable.
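One proxy that such experts commonly reason about is k-anonymity: how many records share the same combination of quasi-identifiers (ZIP area, birth year, sex, and so on). The sketch below is an illustration of that concept, not the authors’ or HHS’s prescribed method.

```python
# Illustrative re-identification risk proxy: k-anonymity.
# A dataset is k-anonymous if every combination of quasi-identifier values
# is shared by at least k records; a small k means higher re-identification
# risk. This is one concept an expert might weigh, not a complete method.
from collections import Counter

def k_anonymity(records: list, quasi_identifiers: list) -> int:
    """Return the smallest equivalence-class size over the given columns."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"zip3": "303", "birth_year": "1952", "sex": "F"},
    {"zip3": "303", "birth_year": "1952", "sex": "F"},
    {"zip3": "303", "birth_year": "1948", "sex": "M"},
    {"zip3": "303", "birth_year": "1948", "sex": "M"},
]
print(k_anonymity(records, ["zip3", "birth_year", "sex"]))  # 2
```

Here every quasi-identifier combination appears twice, so the dataset is 2-anonymous; an expert determination would consider many more factors, such as what external data an attacker could link against.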
We need to encourage and support educational initiatives within our industry so more individuals become proficient in these complex techniques. At McKesson, we are educating our business units so employees can better understand and embrace de-identification and the value it can provide. This training gives them a basic understanding of how to identify and manage risks as well as how to ensure they are getting quality content.
Embracing Social Media and New and Improved Technologies
One of the challenges we face today in de-identifying data is adapting our mindset and methodologies to incorporate new emerging technologies and the adoption of social media. It is crucial to understand how the released data could potentially be exposed by being combined with other available data. New standards are needed.
While de-identifying data can be challenging and complex, the task is made easier when we remember and adhere to our core directive to safeguard data. With this in mind incorporating new technologies is part of an ongoing process of review.
When done properly, de-identification enables high quality, usable data, particularly when the expert method is used. De-identification should not be viewed as an obstacle to data usage, but rather as a powerful enabler that opens the door to a wealth of valuable information.
For decades, the use of paper medical records was the “norm” and the sharing of those records with another provider typically involved a photocopier and a briefcase for the patient to carry them to the next doctor. Today, electronic medical records are becoming the standard, but the exchange of health data between disparate networks and software systems has remained elusive.
While some data exchange is taking place in health care today, it’s only occurring in isolated pockets, typically within one region or health system, making it largely ineffective. Solving this challenge requires transparency, collaboration and innovation, attributes CommonWell Health Alliance embodies.
Transparency across the Industry
Competition in almost every sector thrives on keeping information separate and technologies proprietary. However, in many industries, such as banking, telecom and the internet, working across competitor lines to exchange data has enriched and expanded their reach. Health care needs to take a lesson from these industries.
Working in data silos will not improve the exchange of health data; rather, it will create friction in the industry. Patients expect their doctors to have the information they need to provide them with the best treatment. Doctors struggle to access this important data outside their four walls. The industry has an opportunity to step up and make it possible for providers to access a three-dimensional view of the patient’s health history, and in turn, create a new wave of opportunities for the health IT industry.
Collaboration among Health IT Industry Players
Collaboration throughout the IT industry is essential to creating a ubiquitous nationwide interoperable Health IT Infrastructure. This focus on infrastructure will drive standard adoption and open up the gates to national record sharing. Electronic health record (EHR) vendors offering services across the healthcare continuum are a key piece of this puzzle, which is why CommonWell formed to join forces with all health IT stakeholders dedicated to the vision that patient data should be accessible no matter where care occurs.
Collaboration with the public sector is also crucial. The government plays a strong role in narrowing the technical standards in health IT, but the bar must be raised on leveraging real-world data exchange. Additional ONC activities are complementary to the existing Federal Advisory Committees (FACAs) as noted below:
Bipartisan legislation known as the 21st Century Cures Act—which passed in the U.S. House of Representatives this past July—includes mandates that systems be interoperable by the end of 2017 or face reimbursement penalties.
In early October, the Office of the National Coordinator for Health IT (ONC) released its final draft of “Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap.” The three overarching themes of the roadmap include:
Giving consumers the ability to access and share their health data
Ceasing all intentional or inadvertent information blocking
Adopting federally-recognized national interoperability standards.
The ONC has also formed a Health IT Policy Committee and a Health IT Standards Committee, each of which includes smaller workgroups that meet to ensure ongoing collaboration between the private and public sectors. In fact, many members of CommonWell and other private-sector interoperability initiatives participate in and/or lead these committees and workgroups.
The days of life science companies focusing on physicians and medical professionals as their main “customers” are numbered. Larger healthcare market trends, better access to real world data and increasing costs are driving the shift to a new customer – the patient.
As total spending on medicines globally reaches the $1 trillion level annually, payers are, not surprisingly, placing greater emphasis on ensuring they are receiving value for their investments. Payers are bringing closer scrutiny to all parts of the healthcare system they are funding. Patients, too, are spending more in today’s system and in turn are changing their expectations about value and outcomes. As a result, we are seeing a new focus on real world evidence as payers and patients seek proof that the medicines are contributing to improved patient outcomes, reduction in hospital admissions or re-admissions, and more efficient use of resources.
In response, life science companies are seeking new and more effective ways to leverage data and analytics across the clinical-commercial continuum and to adapt their go-to-market strategies to reflect their focus on patient outcomes. Capturing results-oriented data and making it usable is critical for this new model to work and for the patient to receive the most appropriate care.
With this shift to outcomes-based results and real world evidence, many questions arise around the data and analytics. Who defines “value” and what does success look like? Is it long-term value or short-term results? Additionally, how should this information be distributed to and evaluated by the various stakeholders? How do we achieve a level of consistency when data sources are not yet fully interoperable?
As the pharma industry starts to work through the answers to these questions and begins to redefine go-to-market strategies and commercial models, effective utilization of data and analytics will prove to be one of the greatest competitive advantages.
Defining Value in the New Healthcare Era
This focus on patient outcomes has broader implications for the industry with data sources now being aligned with the patient to enable decisions across the enterprise including drug development, market access, and commercial performance.
This article is part of the “Think Further” series sponsored by Fred Alger Management. For more “Think Further” content, please visit www.thinkfurtheralger.com.
There is almost nothing I’m certain of, except that life is an uncertain thing and that it seems to change a lot. Even in the most predictable of settings, the minutest change in detail can have a lasting and overwhelming effect on nearly everything around it. In healthcare, a space seemingly immune to the status quo, things get a whole lot more complicated. The same can be said of life and death, health and well-being. On their own, they are not so difficult to understand and, in most cases, are predictable; until the final days, of course, when things begin to get a little more complicated. When we’re fine, we’re fine. Life is good and most of our concerns seem trivial.
Then health gets involved, and the minutest change in detail can send our lives into a spiral, so much so that we barely recognize our place in them, let alone who we are and where we belong. When that happens, we begin to rely on beeps and buttons, software and technology in ways never before imagined.
Clearly, the health IT landscape will be completely different five years from now. From where we stand today, we’ll likely look back on this moment and wonder how we survived such archaic times. Just a couple of years into the age of the electronic health record, the technology already seems dated and antiquated, no longer as monolithic and domineering to the space as it seemed in 2010.
Our future selves might stand on the threshold of 2020 and say that we were being single-minded. EHRs were supposed to save healthcare; now they are merely foundational. The technology was supposed to streamline information collection, allow information to be shared quickly system-wide and around the world, and put all of a patient’s information at the tips of the proverbial fingers.
As the promise of those solutions faded (yes, their stars have faded) and consumer demand pushed our attention toward new technologies, we began to see developments that create touch points reaching patients “where they live.” Those touch points have become the new force behind healthcare technology.
Consumers will drive healthcare’s future. That is probably not a secret at this point, but it is a hard point for the old guard. Consumers have had enough of being left out of the ownership of their own health. They’re tired of being locked out of their own records and kept from their own information; that data would not exist without the people who help produce it. New consumer technologies have leveled the field and will level it further, continuing to spur innovation at light speed. Data will flow between healthcare parties and consumers; HIPAA protections will be waived, and open access for the social good will become the norm. Within a generation or so, the standard, traditional approaches to dealing with patients will be completely different and far less segmented than they are now.
The Centers for Medicare & Medicaid Services (CMS) and Office of the National Coordinator for Health Information Technology (ONC) today released final rules that simplify requirements and add new flexibilities for providers to make electronic health information available when and where it matters most and for health care providers and consumers to be able to readily, safely, and securely exchange that information. The final rule for 2015 Edition Health IT Certification Criteria (2015 Edition) and final rule with comment period for the Medicare and Medicaid Electronic Health Records (EHRs) Incentive Programs will help continue to move the health care industry away from a paper-based system, where a doctor’s handwriting needed to be interpreted and patient files could be misplaced.
“We have a shared goal of electronic health records helping physicians, clinicians, and hospitals to deliver better care, smarter spending, and healthier people. We eliminated unnecessary requirements, simplified and increased flexibility for those that remain, and focused on interoperability, information exchange, and patient engagement. By 2018, these rules move us beyond the staged approach of ‘meaningful use’ and focus on broader delivery system reform,” said Patrick Conway, M.D., M.Sc., CMS deputy administrator for innovation and quality and chief medical officer. “Most importantly, we are seeking additional public comments and plan for active engagement of stakeholders so we take time to get broad input on how to improve these programs over time.”
HHS heard from physicians and other providers about the challenges they face making this technology work well for their individual practices and for their patients. In recognition of these concerns, the regulations announced today make significant changes in current requirements. They will ease the reporting burden for providers, support interoperability, and improve patient outcomes. Providers can choose the measures of progress that are most meaningful to their practice and have more time to implement changes to program requirements. Providers are encouraged to apply for hardship exceptions if they need to switch or have other technology difficulties with their EHR vendor. Additionally, the new rules give developers more time to create user-friendly technologies that give individuals easier access to their information so they can be engaged and empowered in their care.
As part of today’s regulations, CMS announced a 60-day public comment period to gather additional feedback about the EHR Incentive Programs going forward, in particular with the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), which established the Merit-based Incentive Payment System and consolidates certain aspects of a number of quality measurement and federal incentive programs into one more efficient framework. We will use this feedback to inform future policy developments for the EHR Incentive Programs, as well as consider it during rulemaking to implement MACRA, which we expect to release in the spring of 2016.
Health IT’s most pressing issues may be so prevalent that they can’t be contained in a single post, hence this third installment in the series detailing some of the sector’s biggest IT issues. Opinions differ as to which issues are most important, but the sector faces many clear and overwhelming problems. Data, security, interoperability and compliance are some of the more obvious, according to the following experts, but they are not the only ones, as you likely know and as we’ll continue to see.
Here, we continue to offer the perspectives of some of healthcare’s insiders, who share their opinions on health IT’s greatest problems and where we should be spending a good deal, if not most, of our focus. If you’d like to catch up, the earlier installments are Health IT’s Most Pressing Issues and Health IT’s Most Pressing Issues (Part 2). Also, feel free to let us know whether you agree with the following, or add what you think are some of the sector’s biggest boondoggles.
The healthcare industry has undoubtedly become a bigger target for security threats and data breaches in recent years, and in my opinion that can be attributed in large part to the industry’s movement to virtualization and the cloud. Adopting these agile, cost-effective modern technologies also widens the network’s attack surface and, in turn, raises the potential for security threats.
We recently conducted research that addresses evolving security challenges, including those facing the healthcare industry with the introduction of cloud infrastructures. The issue is highlighted by the fact that the growing popularity of cloud adoption was identified as one of the key reasons IT and security professionals (57 percent) find securing their networks more difficult today than two years ago.
Paul Brient, CEO, PatientKeeper, Inc.
No industry on Earth has computerized its operations with a goal to reduce productivity and efficiency. That would be absurd. Yet we see countless articles and complaints by physicians about the fact that computerization of their workflows has made them less productive, less efficient and potentially less effective. An EHR is supposed to “automate and streamline the clinician’s workflow.” But does it really? Unfortunately, no. At least not yet. Impediments to using hospital EHRs demand attention because physicians are by far the most expensive and limited resource in the healthcare system. Hopefully, the next few years will bring about the innovation and new approaches necessary to make EHRs truly work for physicians. Otherwise, the $36 billion and the countless hours hospitals across the country have spent implementing electronic systems will have been squandered.
Email security is one of healthcare’s top IT issues, thanks, in part, to budget constraints. Many healthcare organizations have already allocated the majority of IT dollars to improving systems that manage electronic patient records in order to meet HIPAA compliance. As such, data security may fall by the wayside, leaving sensitive customer information vulnerable to sophisticated cyber-attacks that combine social engineering and spear-phishing to penetrate organizations’ networks and steal critical data. Most of the major data breaches that have occurred over the past year have been initiated by this type of email-based threat. The only defense against this level of attack is a layered approach to security, one that has evolved beyond traditional email security solutions that may have been adequate a few years ago but are no longer a match for highly targeted spear-phishing attacks.
Dr. Rae Hayward, HCISPP, director of education and training at (ISC)²
According to the 2015 (ISC)² Global Information Security Workforce Study, global healthcare industry professionals identified the following top security threats as the most concerning: malware (77 percent), application vulnerabilities (74 percent), configuration mistakes/oversights (70 percent), mobile devices (69 percent) and faulty network/system configuration (65 percent). Also, customer privacy violations, damage to the organization’s reputation and breach of laws and regulations were ranked equally as top priorities for healthcare IT security professionals.
So what do these professionals believe will help to resolve these issues? Healthcare respondents believe that network monitoring and intelligence (76 percent), along with improved intrusion detection and prevention technologies (73 percent) are security technologies that will provide significant improvements to the security posture of their organizations. Other research shows that having a business continuity management plan involved in remediation efforts will help to reduce the costs associated with a breach. Having a formal incident response plan in place prior to any incident decreases the average cost of the data breach. A strong security posture decreases not only incidents, but also the loss of data when a breach occurs.
Guest post by Steve Tolle, chief strategy officer and president of iConnect Network Services, Merge Healthcare.
Sooner than later, payers will demand meaningful interoperability to determine the true cost of quality healthcare outcomes. While they may not have a preference for which electronic health record (EHR) platform a doctor or health system uses, they will understand that a platform’s ability to communicate with other EHR platforms will affect the cost and quality of the care provided.
Payers are already implementing bundled payments for some types of costly care, such as full hip replacements. Conventional assumptions aside, physician fees and facility charges are not the leading drivers of joint replacement cost variability. Instead, wide cost disparities frequently seen between Joint Replacement Procedure A and Joint Replacement Procedure B are the product of unpredictable charges for supplies, anesthesia, and medical imaging. When payers start bundling reimbursements for common procedures, risk will shift to providers who will be challenged to closely manage cost fluctuations. In preparation for this transition, healthcare organizations must proactively assess their imaging strategies to keep their business running smoothly, continue providing quality patient care, and ensure they maximize revenue for the services they deliver.
What Providers Must Evaluate
Medical imaging is a $100 billion industry that drives $300 billion in healthcare spending. It accounts for nearly eight percent of U.S. healthcare spending, according to the Journal of the American College of Radiology — a costly component of care that must be effectively addressed as the industry readies itself for the shift from volume to value-based reimbursement.
The U.S. Department of Health and Human Services recently set an ambitious goal that by 2016, 85 percent of healthcare payments will be tied to quality and value of care. Successful healthcare organizations will need to manage two key factors closely — appropriateness and efficiency.
CMS and private payers will increase their vigilance around quality measures such as readmission rates and unnecessary diagnostic imaging. Medically unnecessary or redundant imaging is already on Medicare’s radar, showing up in legislation that mandated decision support for imaging and extended the deadline for ICD-10 conversion. If providers begin to correct course now, downstream risk of lost revenue and decreased patient satisfaction can be mitigated, if not avoided.
Take Stock of Current Assets
To stay ahead of the curve, providers should evaluate all aspects of their image management programs. Many are looking for new solutions that simplify and digitize outdated, paper-based procedures for patient orders, automate insurance payment authorization, and move images from point A to point B in real time, regardless of file format.
It has only been about two generations since traveling medicine shows were common forums for medical information. Phony research and medical claims were used to back up the sale of all kinds of dubious medicines. Potential patients had no real method to determine what was true or false, let alone know what their real medical issues were.
Healthcare has come a long way since those times, but much as patients once had no idea what was in those medical concoctions, or what actually ailed them, today’s digital-age patients still don’t know what is in their medical records. They need transparency, not secret hospital-vendor contracts and data blocking, like the practices being questioned by the New York Times. One patient, Regina Holliday, resorts to using art to bring awareness to patients’ lack of access to their own medical records.
There are many reasons patients want access: second opinions, convenience, instant access in a medical emergency and a sense of ownership (“I paid for them, I own them”). Patients also need to view their records to check accuracy and validity. Inaccurate record keeping has even led the ECRI Institute to cite incorrect or missing data in EHRs and other health IT systems as the second-highest safety concern in its annual survey, the Top Ten Safety Concerns for Healthcare Organizations in 2015.
Healthcare system executives, from CIOs to CEOs, are well aware of the increasing requests from patients for their records and the various state and federal laws that come into play. However, they are also aware that by making it too easy for patients to access records they risk liability and HIPAA issues. They also don’t want to provide documents that could easily enable cost comparisons or raise questions about charges.
Riding the wave of interest in accessing personal medical records are organizations like GetMyHealthData.org. Founded in June 2015 and coordinated by the National Partnership for Women & Families, a non-profit consumer organization, it is a collaborative effort among leading consumer organizations, healthcare experts, former policy makers and technology organizations that believe consumer access to digital health information is an essential cornerstone of better health and better care. On July 4 it launched #DataIndependenceDay to raise awareness of the HIPAA requirement that patients be granted access to their health information, with very few exceptions. An update to those rules, finalized in 2013, extends these rights to electronic health records.
Despite the introduction of personal health records (PHRs), Blue Button technology and product introductions from blue-chip technology leaders such as Microsoft and Google, no significant, unifying technology has ignited consumers’ pent-up demand for their medical records. This lackluster uptake and ongoing interoperability issues might become the unifying force that drives many consumers to consider personal health information exchanges (PHIEs) as an alternative to EHRs and health information exchanges (HIEs) that unnecessarily duplicate data and risk HIPAA violations.
Will PHIEs Ignite the Patient Record Access Movement?
Frost & Sullivan, in its research report, “Moving beyond the Limitations of Fragmented Solutions Empowering Patients with Integrated, Mobile On-Demand Access to the Health Information Continuum,” identifies personal health information exchanges (PHIEs). They are described as providing individual patients, physicians and the full spectrum of ancillary providers with immediate, real-time access to medical records, regardless of where they are stored, by using an open API.
The PHIE can provide access to the entirety of an individual patient record, regardless of the number of sources or EHR systems in which the patient data resides. This technology is made possible through fully interoperable integration servers that can access any EHR system with available APIs and portray the integrated data in a viewable, secure and encrypted format on a mobile device.
By leveraging the powerful simplicity of open APIs, PHIE technology can also access medical records in a way that is much more comprehensive than the closed EMR portals commonly used by doctors’ offices. Despite their pervasive use, these portals are cumbersome and expensive for patients to use, and they suffer from the same lack of interoperability that plagues hospital EHR systems.
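The aggregation pattern behind a PHIE can be sketched in a few lines. Everything here is hypothetical: the endpoints, the JSON shape and the token handling are illustrative stand-ins, not any vendor’s actual API; a real implementation would use standardized interfaces with proper authorization.

```python
# Hypothetical sketch of the PHIE pattern: pull a patient's records from
# several EHR APIs and merge them into one chronological view. Endpoints
# and record shape are illustrative, not a real product's API.
import json
from urllib.request import Request, urlopen

EHR_ENDPOINTS = [
    "https://hospital-a.example/api/records",  # hypothetical endpoint
    "https://clinic-b.example/api/records",    # hypothetical endpoint
]

def fetch_records(patient_id: str, token: str) -> list:
    """Collect record lists from every configured EHR endpoint."""
    sources = []
    for base in EHR_ENDPOINTS:
        req = Request(f"{base}/{patient_id}",
                      headers={"Authorization": f"Bearer {token}"})
        with urlopen(req) as resp:
            sources.append(json.load(resp))
    return merge_records(*sources)

def merge_records(*sources):
    """One chronological view of the patient, regardless of source system."""
    merged = [record for source in sources for record in source]
    return sorted(merged, key=lambda r: r["date"])

# Records from two different systems, merged by date
a = [{"date": "2015-03-01", "source": "hospital-a", "note": "MRI"}]
b = [{"date": "2014-11-20", "source": "clinic-b", "note": "lab panel"}]
print([r["date"] for r in merge_records(a, b)])
# ['2014-11-20', '2015-03-01']
```

The hard part, of course, is not the merge but the access: every source system must expose an API the integration layer can call, which is exactly the interoperability gap the article describes.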
Information collaboration is not new, but there is an increasingly critical need for effective collaboration to create an efficient healthcare ecosystem. How can healthcare organizations design collaborative frameworks that allow them to successfully manage vast amounts of data and create actionable information from that data?
One of the first, and perhaps most important steps, is to understand why it’s imperative to foster an environment that encourages and promotes information collaboration. The amount of data companies use to track performance can be overwhelming, and many companies are inclined to abandon their quest to connect results across the organization, particularly when redundant data sources conspire across the enterprise to prevent a single source of truth. While it’s easy to understand why it happens, this approach can inhibit an organization’s ability to address the management of its data.
The implementation of an effective, collaborative framework is imperative. It has short-term impact, such as enforcing consistent data quality and use, and it also improves business results. Active collaboration can lower management costs and enhance an organization’s ability to analyze and interpret information over the long haul. It also:
Improves the user experience for management, physicians, nurses, and patients
Reduces data inaccuracies and redundancies
Offers impact analysis across the organization for change requests
Enhances the organization’s ability to drive insight analysis and action creation based on a balanced information asset environment
Furthermore, if a healthcare organization chooses not to enforce a collaborative information framework, there may be financial consequences. Inconsistency in reporting can lead to noncompliance with provisions of the Affordable Care Act, which require uniform reporting and analytics.
What’s really at stake? One example of the potential damage insufficient collaboration can create is evidenced in North Highland’s recent work with a healthcare organization that wanted to improve its management environment. During the onboarding process, new requirements and analytics were captured and documented. The requirements were then coded, tested and deployed prior to completion of the data loads. The missing component was an understanding of the impact those changes had on legacy analytics in use across the client’s enterprise. Because no metadata environment had been established, changes made to the analytical applications affected numerous downstream analytics and reports, resulting in massive rework. North Highland worked to remediate data continuity throughout the affected systems and establish a scalable metadata framework to grow analytical capabilities with the future needs of the organization.
This example underscores why it is imperative to implement a model that addresses and improves interoperability, collaboration and information knowledge, eliminating a significant amount of risk and creating a highly effective information collaboration governance program.
One successful model that can be followed to address this issue emphasizes a business-first approach and requires an overall culture of collaboration, both internally and externally. It is based on three fundamental disciplines, supported by a metadata foundation that connects and maintains the relationships among them. The model’s pillars include:
The Business Information Discipline (BID), which defines the data and information necessary to support the operational goals of each department, business unit or division within the organization. Roles and responsibilities within the BID vary based on the size of the organization, but certain roles are central to the discipline.
The Systems and Network Discipline (SND), which defines the system roles and network analyst roles within the information environment. This is the discipline with the largest concentration of technical resources, each having a degree of stewardship responsibility.
The Information Asset Discipline (IAD), which is responsible for creating, maintaining and delivering information asset management for the organization. IAD outlines governance implementation, stewardship roles and general management of information, enabling impact analysis across the organization.
Organizations must respond to healthcare laws and adapt business models to comply with necessary requirements, all the while continuing to monitor reform-related legislative changes and regulatory guidance. By following the outlined framework, healthcare organizations can create an unbreakable foundation that ensures consistency in data and enhances enterprise agility.
Interoperability standardization is a critical element of e-health innovation. Varied and complementary standards will be needed in the decades ahead to accelerate development of life-saving and life-enhancing capabilities and to ensure that the greatest potential benefits of e-health are realized around the world.
The IEEE Standards Association is at the tip of this spear, working to advance medical device interoperability. To illustrate the benefit of standards interoperability in the monitoring and assessment of patients, and to raise awareness of current medical device standards, the following infographic gives a short description of six medical device standards that strive to improve medical treatment for both patients and providers.
Using smartphones, people can track and send their data to doctors from a variety of personal health devices, such as sleep monitors, insulin pumps, sleep apnea equipment, health and fitness monitors, weighing scales and glucose meters. The standards specify the use of specific term codes, formats and behaviors in telehealth environments, promoting interoperability of multi-vendor products and creating more seamless tracking of health overall.
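The value of shared term codes is easy to illustrate: when every vendor maps its readings to the same numeric nomenclature, a receiving system can interpret data from any device without vendor-specific translation. The sketch below uses placeholder codes, not actual values from the IEEE 11073 nomenclature.

```python
# Illustrative sketch of shared term codes in a multi-vendor telehealth
# setting. The numeric codes below are placeholders, not actual IEEE 11073
# MDC nomenclature values.

TERM_CODES = {
    1001: ("body weight", "kg"),       # hypothetical code
    1002: ("blood glucose", "mg/dL"),  # hypothetical code
}

def decode(measurement: tuple) -> str:
    """Turn a (code, value) pair from any device into a readable reading."""
    code, value = measurement
    name, unit = TERM_CODES[code]
    return f"{name}: {value} {unit}"

# Devices from two different vendors emit the same code for the same
# quantity, so the receiving system needs only one decoding table.
print(decode((1002, 98)))
# blood glucose: 98 mg/dL
```

Without an agreed nomenclature, each receiving system would need a bespoke mapping per vendor, which is exactly the multi-vendor friction these standards aim to remove.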