Most of us know someone who has been diagnosed with cancer and understand firsthand the tidal wave of emotions and questions that comes immediately after diagnosis. One question that stands out to providers is the seemingly simple: “How many patients have you seen who are just like me?” And perhaps even more important: “Why are you confident that I can reach the magical five-year survival rate?”
Unfortunately, with systems of record like the electronic health record, neither of these questions is easy for the provider to answer. The challenge is that today’s oncology practice combines clinical confidence grounded in peer-reviewed data with the art of judging what could work based on perceived comparable patients. Oncologists do incredible work to save lives; however, more can be done to support the people making the most important decisions at the most critical inflection points.
Meaningful data to improve cancer care
Prior to the creation of EHRs, physicians stressed that they did not have enough access to data. While data is now being stockpiled within the depths of EHRs, physicians still do not have access to everything the data has to offer. The available data in EHRs is often fragmented, disorganized, and sometimes simply incomplete, making it difficult to glean any real value from this information after it is collected.
Essentially, the EHR can be compared to a messy bureau in your bedroom. While bureaus are intended to organize your clothes — socks in one drawer, t-shirts in another, etc. — sometimes socks wind up in the pants drawer. All of the valuable information and data is in the EHR, but it is sometimes lost in the wrong “drawers,” making it hard for clinicians to find the important information and make sense of it to impact patient care. While physicians are doing the best they can by adding information into EHRs, technology has not caught up to allow physicians to extract insights and put that data to use.
Fortunately, with the use of outside technology, we can pull real-world data (RWD) and real-world evidence (RWE) from the EHRs. This can unlock the insights hidden within the available data and uncap the potential for improving and personalizing cancer care, while reducing overall costs.
Unlocking hidden insights
The technology available today knows how data should be arranged. It knows when something is misplaced and how to make sense of it. Through advanced algorithms and clinical input, technology can sort and gather RWD from EHRs and then group similar patients by their biology, disease states, and other phenotypic factors, allowing for insight into treatment plans and potential outcomes.
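To make the grouping idea concrete, here is a minimal, hypothetical sketch of clustering patients on simple phenotypic features. The features, values, and the choice of k-means are illustrative assumptions only, not a description of any specific vendor’s algorithm.

```python
# Minimal sketch (hypothetical data and field names): grouping similar
# patients from structured EHR-derived features. Illustration only.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Each row is one patient's phenotypic profile pulled from the EHR
# (age, tumor stage, biomarker level) -- fabricated example values.
patients = np.array([
    [62, 2, 1.4],
    [58, 2, 1.1],
    [71, 3, 2.8],
    [45, 1, 0.6],
    [69, 3, 2.5],
])

# Normalize features so no single measurement dominates the distance metric.
features = StandardScaler().fit_transform(patients)

# Group patients into cohorts of "similar" cases; the cohort label can then
# be used to compare treatment plans and outcomes across comparable patients.
cohorts = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(cohorts)  # cluster assignment per patient (label IDs are arbitrary)
```

A real system would draw on far richer clinical variables and clinician review, but the pattern is the same: normalize comparable features, then group patients whose profiles sit close together.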
Inanovate, Inc., a life science company specializing in the development of blood tests for cancer and autoimmune diseases, has secured an initial closing of $3.1 million on a Series C financing round.
The investment, led by South Dakota Equity Partners, Mr. T. Denny Sanford, and Sanford Frontiers, a corporate affiliate of Sanford Health, will help speed the development of Inanovate’s breast cancer blood test, which aims to identify false positives from screening mammograms and reduce costly, stressful, and unnecessary follow-up imaging.
The test is part of a larger plan from Inanovate, which also includes a second test that aims to monitor the progress of breast cancer patients through therapy and beyond, and identify a recurrence event in its earliest stage, when it may be more effectively treated and cured.
“We are excited to have secured investment that will allow our company to implement our development plan through the next 18 months,” Inanovate CEO David Ure said. “We’re pleased to have partnered with investors who share our vision for improving cancer diagnosis and treatment through technology innovation. Our partners bring both expertise and passion to our investment team as we align to the needs and goals of one of the leading hospital networks in the country.”
The most recent investment builds on a strong year for Inanovate, which included a $2 million Phase 2 SBIR grant from the National Cancer Institute, along with a licensing and collaboration agreement with Sanford Health that provides access to intellectual property relating to a set of breast cancer biomarkers, in addition to patient recruitment and sample access for Inanovate’s trials.
“Improving breast cancer care is an important goal of ours,” said Kim Patrick, chief business development officer for Sanford Health. “This protein-screening technology aims to improve the diagnosis of breast cancer and its recurrence.”
The Inanovate blood tests work by detecting antibodies in a patient’s blood that have been associated with breast cancer. Because the antibodies circulate in the blood, a simple blood draw can be evaluated to discover if the disease might be present. To analyze this blood draw, Inanovate uses its patented biomarker analysis platform, the BioID-800. The machine is compact, fully automated, fits on a benchtop and uses disposable test cartridges.
“This is a highly sensitive but low-cost instrument that can recognize the presence of multiple different biomarkers from a small sample of blood in one low-cost, easy-to-use test,” Ure said.
Hardly a day goes by without some new revelation of a US IT mess that seems like an endless round of the old radio show joke contest “Can You Top This,” except increasingly the joke is on us. The examples range from nuclear weapons systems still updated with floppy disks to needless medical deaths, many of which are still caused by preventable interoperability and communication errors, as has been the case for decades.
According to a report released to Congress, the Government Accountability Office (GAO) found that the US government last year spent 75 percent of its technology budget maintaining aging computers where floppy disks are still used, including one system for US nuclear forces that is more than 50 years old. The news in a previous GAO report is equally alarming, as it impacts the healthcare of millions of Americans and could be the smoking gun behind a British Medical Journal study citing medical errors as the third leading cause of death in the United States, after heart disease and cancer.
The GAO interoperability report, requested by Congressional leaders, reported on the status of efforts to develop infrastructure that could lead to nationwide interoperability of health information. The report described a variety of efforts being undertaken to facilitate interoperability, but most of the efforts remain “works in progress.” Moreover, in its report, the GAO identified five barriers to interoperability:
• Insufficiencies in health data standards
• Variation in state privacy rules
• Difficulty in accurately matching all the right records to the right patient
• The costs involved in achieving the goals
• The need for governance and trust among entities to facilitate sharing health information
CMS Pushing for “Plug and Play” Interoperability Tools that Already Exist
Meanwhile, in a meeting with the Massachusetts Medical Society, Andy Slavitt, acting administrator of the Centers for Medicare & Medicaid Services (CMS), acknowledged that in the CMS interoperability effort “we are not sending a man to the moon.”
“We are actually expecting (healthcare) technology to do the things that it already does for us every day. So there must be other reasons why technology and information aren’t flowing in ways that match patient care,” Slavitt stated. “Partly, I believe some of the reasons are actually due to bad business practices. But, I think some of the technology will improve through the better use of standards and compliance. And I think we’ll make significant progress through the implementation of APIs in the next version of electronic health records (EHRs), which will spur innovation by allowing for plug-and-play capability. The private sector has to essentially change or evolve their business practices so that they don’t subvert this intent. If you are a customer of a piece of technology that doesn’t do what you want, it’s time to raise your voice.”
He claims that CMS has “very few higher priorities” other than interoperability. It is also interesting that two different government entities point their fingers at interoperability, yet “plug and play” API solutions, the same ones successfully used in the retail, banking and hospitality industries, have been available through middleware integration for years. As a sign of growing healthcare middleware popularity, Black Book Research recently named the top ten middleware providers as Zoeticx, HealthMark, Arcadia Healthcare Solutions, Extension Healthcare, Solace Systems, Oracle, Catavolt, Microsoft, SAP and Kidozen.
Medical Errors Third Leading Cause of Death in US
The British Medical Journal recently reported that medical error is the third leading cause of death in the United States, after heart disease and cancer. As such, medical errors should be a top priority for research and resources, say authors Martin Makary, MD, MPH, professor of surgery, and research fellow Michael Daniel, from Johns Hopkins University School of Medicine. However, accurate, transparent information about errors is not captured on death certificates, the documents the Centers for Disease Control and Prevention (CDC) uses for ranking causes of death and setting health priorities. Death certificates depend on International Classification of Diseases (ICD) codes for cause of death, but causes such as human and EHR errors are not recorded on them.
According to the World Health Organization (WHO), 117 countries code their mortality statistics using the ICD system. The authors call for better reporting to help capture the scale of the problem and create strategies for reducing it. “Top-ranked causes of death as reported by the CDC form our country’s research funding and public health priorities,” says Makary in a press release. “Right now, cancer and heart disease get a ton of attention, but since medical errors don’t appear on the list, the problem doesn’t get the funding and attention it deserves. It boils down to people dying from the care that they receive rather than the disease for which they are seeking care.”
The Root Cause of Many Patient Errors
Better coding and reporting is a no-brainer and should be required to get to the bottom of these errors so they can be identified and resolved. However, in addition to the unreported causes of death, other roadblocks contribute to this frighteningly sad statistic, such as the lack of EHR interoperability. Unfortunately, the vast majority of medical devices, EHRs and other healthcare IT components lack interoperability, meaning they lack a built-in or integrated platform that can exchange information across vendors, settings, and device types.
Various systems and equipment are typically purchased from different manufacturers, each with its own proprietary interface technology, reminiscent of the days before client and server ever met. Hospitals often must invest in separate systems to pull these disparate pieces of technology together and feed data from bedside devices to EHR systems, data warehouses, and other applications that aid clinical decision making, research and analytics. Many bedside devices, especially older ones, don’t connect at all and require manual reading and data entry.
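The middleware pattern referenced earlier can be pictured as a thin adapter layer that translates each vendor’s proprietary message into one common record before forwarding it to the EHR or a data warehouse. The vendor formats and field names in this sketch are invented for illustration; they do not describe any actual device protocol.

```python
# Hypothetical sketch of a middleware adapter layer: map each vendor's
# proprietary device message into one normalized record. Illustration only.
from dataclasses import dataclass

@dataclass
class VitalSignRecord:
    patient_id: str
    metric: str
    value: float
    unit: str

def from_vendor_a(msg: dict) -> VitalSignRecord:
    # "Vendor A" (made up) reports heart rate under its own field names.
    return VitalSignRecord(msg["pt"], "heart_rate", float(msg["hr"]), "bpm")

def from_vendor_b(msg: dict) -> VitalSignRecord:
    # "Vendor B" (made up) uses a different, incompatible layout for the same reading.
    return VitalSignRecord(msg["patientId"], "heart_rate", float(msg["pulse"]["value"]), "bpm")

# The integration engine picks the right adapter per device, so downstream
# systems (EHR, warehouse, analytics) only ever see the normalized form.
readings = [
    from_vendor_a({"pt": "12345", "hr": 72}),
    from_vendor_b({"patientId": "12345", "pulse": {"value": 70}}),
]
print(readings)
```

The design point is that downstream consumers never need to know which device or vendor produced a reading; they see only the common record.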
Healthcare providers are sometimes forced to hold various pieces of information in their heads in order to draw conclusions, which is time consuming and error-prone. This cognitive load, especially in high-stress situations, increases the risk of errors such as accessing information on the wrong patient, performing the wrong action or placing the wrong order. Because information can be entered into various areas of the EHR, duplication or omission is a constant possibility. Physicians are often presented with a list of documentation spread across different folders that runs many computer screens long, and information can be missed.
The nation’s largest health systems employ thousands of people dedicated to dealing with “non-interoperability.” The abundance of proprietary protocols and interfaces that restrict healthcare data exchange takes a huge toll on productivity. In addition to the technical shortcomings of EHRs, tactics such as data blocking and hospital IT contracts that prevent data sharing are also used by EHR vendors to block interoperability. Healthcare overall has experienced negative productivity in this area over the past decade.
Guest post by Jeff Robbins, president and CEO of LiveData, Inc.
It is no secret that many of today’s best hospitals are still enmeshed in implementing and fine-tuning new, enterprise-wide electronic health record (EHR) systems. With purchase prices in the tens or even hundreds of millions of dollars, the EHR is a focal point of bringing technology to bear on the various challenges of delivering consistent, high quality care to an increasing number of patients.
Yet many hospital administrators and caregivers are finding that the level of effort (and expenditure) isn’t moving the needle as much as was expected. It turns out that this isn’t because of any specific failing on the part of the EHR vendors. Rather, it is because of a missing layer in today’s EHR technology stack.
This missing layer, workflow management systems, is software designed to coordinate specific action, create consistency, and deliver visibility by automatically connecting caregivers with relevant tasks and information. The EHR, by necessity, is focused on creating a heads-down log of all encounters. Workflow technology adds the missing heads-up displays, alerts and analytics that help drive use of the EHR during patient encounters.
One of the more complex interventions in healthcare is surgery. The choreography involving patients, caregivers, equipment, supplies and operating rooms at a busy hospital is demanding, and the added manual data-entry burden of new EHR implementations paradoxically adds to the risk of variability.
Perioperative workflow
Workflow technology can mean many things. At the planning level, one common device is using whiteboards and Post-it notes to create a basic map of tasks. This data-gathering approach is an excellent team activity, and it allows many stakeholders to collaborate and share knowledge about interdependencies.
The challenge posed by this complexity can be summarized as, “Where do we go from here?” It is tempting to picture using a computer-based workflow diagramming tool to capture and enact this diagram. While the “state diagram” is a useful technology tool, the complexity of even this small example should help highlight why the myriad states, conditions, and rules that must be brought to bear to deliver workflow technology’s benefits to complex interventions are, in a word, complex.
Where do we go from here?
Workflow technology, delivered via a workflow management system, is intended to implement the workflow processes built on the activities and preferences of stakeholders. By making aspects of these human processes “executable,” via executable process models, healthcare workflow technology can provide a form of power-assist to caregivers.
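As a rough sketch of what an “executable process model” might look like in code, consider a tiny state machine for a single perioperative case. The states, transitions, and notification hook below are hypothetical simplifications, not LiveData’s implementation.

```python
# Hypothetical executable process model: a tiny state machine for one
# perioperative case. States and transitions are invented for illustration.
ALLOWED_TRANSITIONS = {
    "scheduled":        {"patient_in_preop"},
    "patient_in_preop": {"in_or"},
    "in_or":            {"incision", "case_cancelled"},
    "incision":         {"closing"},
    "closing":          {"patient_in_pacu"},
    "patient_in_pacu":  {"discharged_from_pacu"},
}

class PerioperativeCase:
    def __init__(self, case_id: str):
        self.case_id = case_id
        self.state = "scheduled"

    def advance(self, new_state: str) -> None:
        # Enforce the modeled workflow: only transitions the process model
        # allows are accepted, which is where consistency comes from.
        if new_state not in ALLOWED_TRANSITIONS.get(self.state, set()):
            raise ValueError(
                f"{self.case_id}: cannot move from {self.state} to {new_state}"
            )
        self.state = new_state
        self.notify(new_state)

    def notify(self, state: str) -> None:
        # Stand-in for the "heads-up" layer: alerts, displays, analytics.
        print(f"Case {self.case_id} is now: {state}")

# Usage: the workflow engine advances the case as caregivers document events.
case = PerioperativeCase("OR-101")
case.advance("patient_in_preop")
case.advance("in_or")
case.advance("incision")
```

Even this toy model shows where the value comes from: invalid transitions are rejected, and every legitimate step can trigger the heads-up alerts and analytics described above.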
But, as we have seen, creating a computerized process model of a complex process will, of necessity, have to mirror some of that complexity, or risk oversimplification and the potential for harm to patients. This model is executed or consulted, in conjunction with caregivers, when they deliver care. These executable process models are at the heart of what distinguishes healthcare workflow technology from today’s EHR. Healthcare workflow technology drives workflow to achieve the consistency and quality required by our society’s burgeoning healthcare spend.
What must be done before you walk out of the office for the last time before the stroke of midnight on Jan. 1, 2015? It’s a simple question with many possible responses. Each healthcare organization, based on its needs and priorities, likely has a fix on what it needs to do, though those things are not necessarily what it wants to do. The final couple of weeks of the year are different for everyone, and practices are no different.
So, if you’re making a list and checking it twice, here are a few suggestions that you might want to add to it to be well prepared for the new year, based on your practice’s business needs, of course.
Review the ONC Federal Health IT Strategic Plan
At Health Data Consortium, we have three must-do items before we close the door to 2014. First, we urge the health IT community to review the recently released ONC Federal Health IT Strategic Plan 2015-2020. Public comments are open until February 6, but don’t let your response get lost in the start-of-the-year flurry. Second, we are preparing for the arrival of the 114th Congress and the opportunity to share Health Data Consortium’s public policy platform for 2015. Our platform will have an emphasis on the key issues that affect data accessibility, data sharing and patient privacy – all critical to improving health outcomes and our healthcare system overall. Finally, on January 1 we’ll be only 150 days from Health Datapalooza 2015. We are kicking off the new year and the countdown to Health Datapalooza with keynote speakers and sessions confirmed on a daily basis. We’re already making the necessary preparations to gather the innovators who are igniting the open health data revolution. As 2014 comes to a close, we look forward to hitting the ground running in 2015.
Ideally, turn off not only your lights, but everything — I mean every piece of digital technology and every way digital technology can connect to your organization. That is the only way to assure there are no accidents, glitches, failures or breaches. Here are some other things you can do:
• Fill every open position you can. Have positions and people identified and include backups. The only thing worse than not having a position to fill is having one to fill and leaving it open.
• Address mobility, medical devices and patient engagement, and not just from a security perspective — this means including everyone who provides access or information, or who uses these devices or systems.
• Address the culture and have a plan to include every individual in the organization, if the technology touches them, from BYOD to analytics to privacy to cloud storage.
IT, regardless of the industry, is ultimately about people. In healthcare, it is also about the data itself, which represents your patients. It has to be there, it has to work, it has to be secure.
— David Finn, CISA, CISM, CRISC, is a member of ISACA’s Professional Influence and Advocacy Committee, and the Health Information Technology Officer for Symantec
Despite the enormous amount of knowledge imparted to you during your education, one of the most important elements of maintaining a successful practice was most likely barely touched upon. The concept of quality health care is not complete without a rigorous discussion of patient satisfaction. A good physician/patient relationship is a crucial element of a successful practice. The fact that patients do not complain does not necessarily mean they are satisfied with the care they are receiving.
The Necessity for a Patient Satisfaction Survey
Let’s face it: In the big picture, seemingly no matter the profession, the majority of complaints to licensing boards do not revolve around specific “practice-based” issues. Instead, those complaints tend to be based on “client-expectation” issues. From this, we can make the claim that happy clients do not tend to complain. If your practice can meet your patients’ expectations, then your patients will more than likely react favorably by continuing their relationship with your practice, and perhaps even recommending your practice to a friend. The best way to gauge your patients’ opinion of their experience is to ask them, and by far the most cost-effective way to do so is a properly constructed and thoroughly analyzed patient satisfaction survey.
The Objections to a Patient Satisfaction Survey
Certainly, there are what some see as “legitimate” objections to the patient satisfaction survey, and high on that list is cost. If your practice is a member of a medical malpractice insurance organization, the administration and analysis of a patient satisfaction survey may be offered as a member service, available to the practice at no additional cost. If not, there are independent consulting firms that can work with your practice to design and analyze a survey. Additional costs would include the staff time needed to distribute and collect the survey.
Guest post by Ken Perez, Director of Healthcare Policy and Senior Vice President of Marketing, MedeAnalytics, Inc.
Recently, Mitch Seavey, 53, became the oldest winner of the Iditarod, the most famous dog sledding race in the world. At a distance of 1,600 kilometers, the Iditarod constitutes a race of supreme endurance. In dog sledding, the dogs that are chosen to lead the sled are usually the smartest, as well as the fastest, and they are appropriately called lead dogs.
The lead dogs in the realm of Medicare ACOs are the 32 Pioneer ACOs, whose selection was announced in December 2011 with great fanfare and optimism. Given the greater risks (and rewards) of the Pioneer ACO Model, the Pioneers were widely considered the best and the brightest, the organizations most likely to succeed as ACOs.