Tag: interoperability

Industry Professionals Unify Effort To Streamline Addiction Treatment

As the addiction epidemic continues to plague even the smallest of communities throughout the country, substance abuse treatment specialists from coast to coast have embarked on a unified effort to improve the continuum of care for individuals struggling to break the cycle.

The national interoperability committee has been making strides over the last year to ensure that, regardless of a patient’s unique treatment history, care providers can gain seamless access to the complete medical records they need to deliver lasting and effective care.

The committee effort is spearheaded by ZenCharts co-founder Dan Callahan, a 36-year veteran of the behavioral health care industry.

“It’s not uncommon for a patient to go through rehab five or more times — I’ve seen some with over 20 — and communicating information from each of those episodes can be where things start to fall apart,” Callahan said. “Clinicians need the tools to help make the right decisions. If they have all the data, they can see what the patient went through — what was the length of detox? What things were tried, how were they tried and were they successful?”

With EHR systems fragmented across the country, and with privacy laws layered on top, sharing medical records can be a significant hang-up. When that happens, the burden falls on the patient to bring a new provider up to speed.

“We’re making changes in the industry as a whole, and helping push the boundaries for how we can help these people,” Callahan said. “We need to help clinicians meet and work with patients where they currently are, and know more accurately where they’ve been.”

Continue Reading

Intelligent and Connected Healthcare Begins With Paperless Fax

By Amy Perry, director of product marketing, OpenText.

Amy Perry

The pace of digital transformation is increasing rapidly, with more industries jumping on the bandwagon and adopting new technologies that recast workflows. New solutions powered by artificial intelligence and machine learning are enabling machines to handle processes that were once cumbersome for employees.

In fact, the rate of this shift is so pronounced that according to Deloitte, the average digital transformation budget has increased by 25 percent over the past year, from $11 million to $13.6 million. More than half of mid-sized and large companies are spending more than $10 million on these efforts.

While this trend touches almost every industry, it presents unique challenges to the healthcare sector. One of the most important challenges digital transformation poses for healthcare professionals is interoperability. As the sheer amount of health-related data, along with the ways to transmit and store it, continues to grow, the ability of healthcare organizations to manage the free flow of information between patients and their care teams becomes ever more vital. At the same time, healthcare providers must ensure the highest levels of patient data privacy.

Unsurprisingly, most healthcare providers are preparing for this challenge. According to a new survey of healthcare IT professionals conducted by OpenText in conjunction with IDG Research, 85 to 94 percent of healthcare organizations are either actively investing or are planning to quickly invest in interoperability infrastructure to provide more intelligent and connected healthcare. While this intent is a great starting point, the journey can still be challenging for organizations of every size.

Ensuring a freer flow of information between providers to enhance the patient experience, while simultaneously adhering to HIPAA’s privacy mandates, may initially seem impossible to many teams. Wider adoption of paperless fax solutions across the industry could provide a data-centric approach that allows organizations to further their interoperability goals while ensuring that patient privacy remains paramount.

Paperless fax gains momentum

Healthcare’s continued reliance on fax stems from HIPAA guidelines mandating that all patient information be securely stored and communicated. Tools such as email lack essential regulatory compliance and must be shelved in favor of other forms of communication, such as secure fax. While paper-based fax has become almost obsolete in other industries, it is still heavily used in healthcare despite creating roadblocks to efficient communication. Paper-based fax requires a labor-intensive process that results in limited access to patient information at the point of care and slower care coordination between providers. Though these shortcomings are widely recognized among healthcare professionals, nearly half of patient information is still transmitted by paper-based fax.

Findings from the same survey confirm momentum in paperless fax technologies. According to survey respondents, 50 percent of all medical communications continues to be done via some form of fax, but paperless faxing surpasses paper-based faxing in terms of medical communications volume. Among this, a significant majority of the survey respondents showed favorability to paperless faxing because of its digital integration capabilities.

Seventy-six percent of respondents either agreed or strongly agreed with the statement that they are happy with their current paperless faxing method because it’s integrated with their electronic medical record (EMR), back-end system, or other applications. By integrating digital faxing with EMR, document management systems, and clinical applications, a paperless fax solution becomes the most connected device in an organization, optimizing patient information exchange, reducing costs, and increasing productivity.
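
To make that integration concrete, here is a minimal sketch of how an EMR-exported document might be handed to a cloud fax service over a REST call. The endpoint, request fields, and credential handling are assumptions for illustration only, not any specific vendor’s API.

```python
# Minimal sketch of routing an EMR-exported document to a hypothetical cloud
# fax REST API. Endpoint names and fields are illustrative, not a specific
# vendor's interface.
import base64
import requests

FAX_API_URL = "https://fax.example.com/v1/outbound"  # hypothetical endpoint
API_TOKEN = "replace-with-credential"                # stored securely in practice


def send_record_by_paperless_fax(pdf_bytes: bytes, recipient_fax: str, patient_id: str) -> str:
    """Submit an EMR-exported PDF to a cloud fax service and return a job ID."""
    payload = {
        "to": recipient_fax,
        # The document travels as base64 so the whole exchange stays inside a TLS session.
        "document": base64.b64encode(pdf_bytes).decode("ascii"),
        "metadata": {"patient_id": patient_id, "source": "EMR export"},
    }
    resp = requests.post(
        FAX_API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]
```

In practice the returned job ID would be tracked so that delivery status can be written back to the patient’s chart, closing the loop between the fax workflow and the EMR.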

The catalyst for future patient information exchange

Another favorable attribute of paperless faxing is that it provides a much more secure form of patient information exchange and exceeds the requirements of HIPAA’s protected health information privacy rule. As many healthcare organizations consider new interoperability tools based on standards for the secure transmission of patient records, providers can leverage their existing paperless fax solution to transition to modern, secure, and interoperable exchanges of patient documentation, integrated across systems and applications.

Ultimately, the study’s findings show that technology has silenced the death knell many thought had sounded for the fax industry. Far from being a siloed or time-consuming way to share information, new paperless fax technologies are eliminating these inefficiencies by shortening the time it takes to get patient information to the right provider and speeding access to critical information at the point of care. Implementing a cloud-based delivery system is an attractive step as organizations move toward digital transformation. Healthcare providers must modernize legacy systems and embrace these new technologies to stay at the forefront of the industry and meet patients’ growing expectations.

Interoperability Barriers: Achieving It In Today’s Healthcare Data Landscape

By Drew Ivan, EVP of product and strategy of Rhapsody.

Drew Ivan

It was generally recognized by 2009 that the health care industry was long overdue when it came to adopting electronic systems for storing patient data. At the time, hospital adoption of electronic health record (EHR) systems was at about 10 percent while electronic record keeping was commonplace in most other industries. EHR technology was widely available, yet doctors and hospitals were still using paper charts.

The HITECH Act of 2009 was part of a broader stimulus package that financially nudged hospitals and eligible professionals to adopt and use EHRs. The meaningful use incentive program began a national, decade-long project to adopt, implement, and optimize EHR software. The program was a huge success, judged by the most obvious metric, EHR adoption. Today, nearly 100 percent of hospitals are using electronic health records. This means that records are safe from physical damage, far easier to analyze and report on, and – in theory at least – easier to transfer from one provider to another.

However, when viewed through the lens of return on investment, the success is less impressive. The federal government has spent $36 billion to encourage providers to adopt EHR systems but the industry has spent far more than that to procure, implement and optimize the software. Yet, hospitals are seeing reduced productivity, doctors face a huge documentation burden, and interoperability remains an unsolved problem. The first two problems are the consequence of workflow changes brought on by the EHR systems, but interoperability roadblocks ought to have been eliminated by implementing EHR systems, so why is it still so difficult to transfer records from one provider to another, or from a provider to the patient?

Health IT experts generally consider three categories of obstacles to interoperability:

  1. Business disincentives: Allowing medical records to move to a different provider makes it easier for patients themselves to move to another provider, and helping customers switch health care providers is contraindicated by usual business practices (even though HIPAA states that patients are entitled to receive copies of their medical records and may direct copies of their records to be sent elsewhere).
  2. Technical challenges: Meaningful use set a fairly low bar for cross-organizational data exchange requirements, and it did little to ensure that EHR systems could understand data sent from another system. Although these problems are largely resolved today, there is still the impression that “interoperability is a hard technical problem.”
  3. Network effects: Point-to-point connections between providers are impractical, but the network approach also has its drawbacks. The assortment of HIEs and national interoperability initiatives is huge and confusing, and it’s not obvious which network(s) an organization should join.

There may have been an assumption that when medical records moved from paper to electronic format they would immediately become more interoperable, but by 2016, the level of interoperability was far below what patients and regulators expected. As a result, the 21st Century Cures Act of 2016 was passed by Congress and signed into law by the outgoing Obama administration. The law’s scope included a number of health care priorities, including a patch for the interoperability gap left by Meaningful Use. Cures explicitly forbids providers, technology vendors, and other organizations from engaging in “information blocking” practices.

Earlier in 2019, the Office of the National Coordinator for Health IT (ONC) issued a notice of proposed rulemaking (NPRM) that defined exactly what is (and what is not) meant by “information blocking.” Once adopted, the expectation will be that a patient’s medical records will move according to the patient’s preferences. Patients will be able to direct their data to other providers and easily obtain copies of their data in electronic format.

Continue Reading

The Future of Healthcare Is Innovation, Not Excel Sheets and Check Boxes

By Abhinav Shashank, co-founder and CEO, Innovaccer

Abhinav Shashank

U.S. healthcare is nowhere near what technology made us dream of a decade ago. Healthcare technology was meant to reduce costs, eliminate burnout, and make care delivery patient-centric. Cut to today, where a broken leg can cost a patient as much as $7,500, seven out of 10 physicians do not recommend their profession to anyone, and we rank poorly among other developed countries in the number of preventable deaths.

Why did technology fail?

While disruptive technology solutions did flood healthcare in the last couple of decades, many of them required physicians to go the extra mile to comprehend those sophisticated systems. Today, physicians are still crunching large data files day in and day out, nurses are doubling up as technical executives, and patients are perplexed by the fact that their providers hardly have time for them.

It’s time for technology to care

If a technology solution is not assisting organizations in improving care quality, reducing costs, and optimizing utilization levels, then its very relevance is questionable. Healthcare organizations need technologies that can help them activate their data, realize their strategic goals, and bring patients closer to their providers.

Health IT solutions should make the lives of providers easier. Any health IT solution that puts an additional burden on providers is unjustified and unacceptable. Providers are not data analysts, and expecting them to train tirelessly to understand an IT system and spend a couple of hours each day navigating complex interfaces can drastically reduce patient-provider time and pave the way for physician burnout.

In with ultimate integration. We need to bring together EHRs, PHMs, payer claims and HIEs and put it all in the palm of the providers’ hands. Whether it’s quality management or data management, it should be simple.

In with relevant insights right at the point of care. Providers are tired of wading through complicated EHRs and Excel sheets. What we need now is to seize the nanosecond and realize truly automated care delivery that helps boost clinical outcomes.

In with 100 percent transparency and bi-directional interoperability. Healthcare providers are often forced to access bits and pieces of electronic healthcare analytics and referrals on disparate applications. Physicians need to capture real-time care gaps, coding opportunities, patient education opportunities, and more; the only problem is that they don’t know how exactly to accomplish this. Providers should be able to capture the gaps in patient care right when they need to and enhance the patient experience of care.

In with true patient-centric care. Healthcare is not just about providing episodic care to patients; it is about building relationships with them. In a world where the quality of care directly influences the financial success of an organization, providers should focus on aligning their treatment procedures with the needs of their patients.

Healthcare of the 2020s needs reliable data activation platforms

“If you can’t explain it simply, you don’t understand it well enough.” — Albert Einstein

Buzzwords like innovation, intelligence, and analytics make sense in today’s climate; however, unless the user experience is seamless, the charisma of back-end development does little good for healthcare professionals.

We’re moving into an age of intelligence, and in this age successful organizations do one thing right: they know the worth of their data. Healthcare needs to do the same. Organizations have to move from a makeshift approach to engaging patients to a concrete, data-driven strategy that works to their advantage.

Continue Reading

The Importance of a Nursing Data Framework To Achieve Consistent Quality In Health Information Exchange

By Dr. Luann Whittenburg, PhD, RN, FAAN.

Dr. Luann Whittenburg

With more than 4 million practitioners, nurses are the largest segment of the U.S. healthcare workforce and have indisputably demonstrated an ability to improve healthcare outcomes. We are just beginning to use healthcare IT data and AI to improve patient outcomes. One of the key benefits of AI will be the ability to leverage data from nursing care plans and nursing diagnoses to perform workload balancing for nursing staff, a key part of managing the future nursing shortage.
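
As a rough illustration of that idea, the sketch below balances nursing assignments using acuity scores that could be derived from care plan data. The scoring and the greedy strategy are assumptions made for the example, not a method described in this article.

```python
# Illustrative sketch of acuity-based workload balancing: each patient's care
# plan contributes an acuity score, and patients are assigned greedily to the
# nurse with the lightest current load. Scores and names are hypothetical.
import heapq
from typing import Dict, List, Tuple


def balance_assignments(patient_acuity: Dict[str, int], nurses: List[str]) -> Dict[str, List[str]]:
    """Return a nurse -> patients mapping that keeps total acuity roughly even."""
    # Min-heap of (current_total_acuity, nurse) so the lightest load is popped first.
    loads: List[Tuple[int, str]] = [(0, nurse) for nurse in nurses]
    heapq.heapify(loads)
    assignments: Dict[str, List[str]] = {nurse: [] for nurse in nurses}

    # Place the highest-acuity patients first for a more even spread.
    for patient, acuity in sorted(patient_acuity.items(), key=lambda kv: -kv[1]):
        load, nurse = heapq.heappop(loads)
        assignments[nurse].append(patient)
        heapq.heappush(loads, (load + acuity, nurse))
    return assignments


# Example: four patients whose care plans imply different acuity levels.
print(balance_assignments({"pt-01": 5, "pt-02": 3, "pt-03": 4, "pt-04": 2}, ["RN-A", "RN-B"]))
```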

Another problem that needs attention is the potential disconnect that can result from nurse-to-nurse handoffs involving virtual nurses who remotely monitor patients but enter data into their own EHR system, not the one in use at the hospital where the patient is located. We will discuss here the nature of the data, the technologies and frameworks, the nursing information model, and the structure of the data elements needed to implement solutions for staffing, interoperability and workflow improvements.

The National Academy of Medicine’s committee background report on the Future of Nursing 2020-2030, Activating Nursing to Address Unmet Needs in the 21st Century, found that the worsening health profile in the United States requires “more than a traditional medical response.” Because nurses are professionals in the care team, nursing documentation requires a standardized framework to achieve consistent data quality in healthcare communications about the work of nurses. The standardized framework recognized for professional nursing documentation is the American Nurses Association (ANA) Nursing Process. This ANA framework is essential to nurses for managing and improving healthcare outcomes, safety and reimbursement as proposed by the Institute for Healthcare Improvement (IHI).

In most electronic health record systems, the standard nursing data implemented (sometimes called the system terminology, data dictionary, or nomenclature) is proprietary, with a pre-existing data structure/framework. The proprietary framework acts as a barrier to nursing documentation by constraining the available concepts for nursing documentation and the nursing care plan fields.

Without interoperable electronic data concepts available for documentation, nursing care notes become unstructured free text and are not included in coded health information exchanges. Due to the highly structured design of EHR systems, nursing practice is determined by the system’s terminology and ontology framework configuration. If nurses do not select the ANA framework, nursing care data takes on the sedentary shape of local proprietary data structures rather than nesting in a flexible, portable and universal tool that enables nurses and other episodic care providers to improve future nursing interventions, practice and care outcomes.

The American Nurses Association (ANA) describes the common framework for documenting professional nursing practice as the Nursing Process. The Nursing Process is the foundation for the documentation of nursing care. Yet in the EHR, nursing documentation is reused over and over during the patient’s stay, generated from the nursing assessment as if the documentation were a template. The Nursing Process is the framework and essential core of practice for the registered nurse to deliver holistic, patient-focused care.

Producing effective EHR systems for nursing requires a deep understanding of how nurses create and conduct cognitive documentation as well as task-oriented documentation. Most EHR systems dictate rather than adapt to nursing workflows, and nursing information is not organized to fit the ANA model of care. EHRs often assume a nursing care delivery model represented as algorithmic sequences of choices, yet nursing care is iterative: patient goals are reformed, interventions and actions are revised, and care sequences are updated for individual patients based on encountered condition changes and constraints. In the dictated workflow of EHRs, nursing data is collected as care assessments with nursing diagnoses, interventions and actions in formats used to create single patient encounters.
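
To show what a structured alternative could look like, here is a small sketch of a care plan record that preserves the iterative Nursing Process cycle with coded concepts rather than free text. The classes and placeholder codes are illustrative assumptions, not part of any standard or of the ANA framework itself.

```python
# Sketch of a coded, iterative nursing care plan record following the Nursing
# Process phases. The codes shown are placeholders standing in for a
# standardized nursing terminology, not real code values.
from dataclasses import dataclass, field
from typing import List


@dataclass
class CodedConcept:
    code: str          # identifier from a standardized nursing terminology
    display: str       # human-readable label


@dataclass
class CarePlanCycle:
    """One pass through assessment -> diagnosis -> intervention -> evaluation."""
    assessment: CodedConcept
    diagnosis: CodedConcept
    interventions: List[CodedConcept]
    evaluation: CodedConcept


@dataclass
class NursingCarePlan:
    patient_id: str
    cycles: List[CarePlanCycle] = field(default_factory=list)

    def revise(self, cycle: CarePlanCycle) -> None:
        # Each revision is appended, preserving the iterative history rather
        # than overwriting a single templated assessment.
        self.cycles.append(cycle)
```

Because each revision is appended rather than overwritten, the record keeps the history of goals and interventions that templated, copied-forward documentation loses.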

Continue Reading

Two-Thirds of U.S. Healthcare Providers Are Behind the Curve In Digital Health Transformation

Nearly two-thirds of healthcare providers rate themselves as being behind the curve on their digital health initiatives, citing clinician resistance and interoperability of legacy systems with digital/mobile technologies as the top barriers, according to new research from Unisys Corporation.

On behalf of Unisys, HIMSS surveyed 220 IT decision makers/influencers at U.S. hospitals and health systems and asked them to rank their organization based on how they are leveraging digital and mobile technologies to improve the patient experience, lower the cost of care delivery and improve clinician/staff efficiencies. They were then rated as being ahead of the curve (early adopters/early majority) or behind the curve (late majority/laggards).

Of those surveyed, 64 percent rated themselves as being behind the curve, including 20 percent who were rated as laggards. Notably, only 11 percent of organizations were rated as early adopters when it came to adoption and implementation of digital technologies.

When asked about the barriers to advancing digital health initiatives, “behind the curve” respondents cited challenges starting with clinician resistance to adopting new solutions (51 percent) and difficulties integrating legacy systems with new digital/mobile technologies (50 percent). Availability of skilled IT staff (48 percent) and the identification/remediation of cybersecurity threats (45 percent) were also highly cited as challenges.

Jeff Livingstone

“These survey findings cannot be taken lightly, as we believe that being on the high end of the digital health continuum is positively correlated with reduced costs, improved efficiencies and most importantly, improved patient outcomes,” said Jeff R. Livingstone, PhD, vice president and global head, life sciences and healthcare, Unisys. “The survey also demonstrates that healthcare information technology needs to adopt modern technology platforms that have interoperability, transparency and efficiency at their core. Legacy healthcare systems do not easily meet these objectives and are costly to implement and operate.”

The survey also looked at the key initiatives that digital health technologies support. Notably, only 16 percent of laggards had a comprehensive data governance plan, and only nine percent of laggards said their organization was able to successfully apply data to determine the best course of action, compared to 83 percent and 78 percent of early adopters, respectively. Additionally, only 13 percent of laggards said that their medical devices could securely communicate with electronic health records.

“Access to real-time data is critical to healthcare providers and patients today. In many cases, providers can work with trusted vendors capable of offering both healthcare information technology and security strategies to help them better utilize and protect their patients’ data,” said Livingstone.

For more information on the study results, please click here.

HIMSS Provides Insight Into ONC/CMS Proposed Rules, and Shares Possible Responses – Kind Of

By Scott E. Rupp, publisher, Electronic Health Reporter.

On March 21, HIMSS vice president of government affairs Tom Leary and senior director of federal and state affairs Jeff Coughlin hosted a roundtable with members of the media to peel back a few layers of the onion on the newly proposed ONC and CMS rules and explain some of the potential ramifications of the regulations should they be approved.

The CMS proposed regulation attempts to advance interoperability from the patient perspective by putting patients at the center of their health care and ensuring that they can access their health information electronically without special effort.

ONC’s proposed regulation calls on the healthcare community to adopt standardized application programming interfaces (APIs) and presents seven reasonable and necessary conditions that do not constitute information blocking.
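
The proposed rule points toward FHIR-based APIs as the standardized interface. As a hedged illustration only, the sketch below shows what a read against a generic HL7 FHIR R4 endpoint could look like; the base URL and the specific resources queried are assumptions made for the example.

```python
# Minimal sketch of a standardized-API (HL7 FHIR R4) read: look up a patient
# by identifier and pull recent lab observations. The base URL is a
# placeholder for whatever endpoint a certified system would expose.
import requests

FHIR_BASE = "https://fhir.example.org/r4"  # placeholder endpoint


def fetch_patient_and_labs(mrn: str) -> dict:
    """Return the first matching Patient resource plus a bundle of lab Observations."""
    patient_bundle = requests.get(
        f"{FHIR_BASE}/Patient", params={"identifier": mrn}, timeout=15
    ).json()
    # For simplicity this sketch assumes at least one match was returned.
    patient = patient_bundle["entry"][0]["resource"]

    labs = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient["id"], "category": "laboratory", "_count": 20},
        timeout=15,
    ).json()
    return {"patient": patient, "observations": labs}
```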

According to HIMSS’s assessment of both proposals, there is room for interpretation in each, but the organization has not yet fully formed its response as of this writing.

Tom Leary

However, Leary said: “It’s important to emphasize that all sectors of the healthcare ecosystem are included here. The CMS rule focuses on the payer world. The ONC rule touches on vendors and providers. All sectors really are touched on by these rules.”

With both rules, ONC and CMS are trying to use every lever available to them to push interoperability forward, and both are placing patients at the center, Coughlin said. The healthcare sector got a taste of how CMS plans to empower patients through its recent MyHealthEData initiative, but the current proposal puts more specifics around the agency’s intentions. Likewise, the ONC rule attempts to define the value of the taxpayers’ investment in the EHR incentives of the meaningful use program.

Key points of the rules

Some key points to consider from the rules: APIs have a role to play in future development of the sector and are seen as a real leveler of the playing field while providing patients more control of their information, Coughlin said.

HHS is focusing on transparency, particularly pricing transparency. For example, there is movement toward possibly collecting chargemaster data from hospitals and, perhaps, publishing negotiated rates between hospitals and payers, both of which HHS is looking into.

Jeff Coughlin

What happens now that the rules are out? According to HIMSS, educating members is the first step toward understanding the rules and responding to the federal bodies. “What we’ve done is focus on educating HIMSS members in briefings,” Coughlin said. “Trying to get early feedback and early impressions from members, convening weekly conference calls to address parts of the rule. Once we have critical mass, then we work with executive leadership to make sure what we are hearing from membership is reflected across the membership.”

Looking into the future?

For health systems, the broad exchange of data likely remains a concern. Data exchange within the ONC rule impacts providers and health systems in a number of ways, especially in regard to the costs of compliance to meet all of the proposed requirements.

HIMSS representatives are not currently casting a look into a crystal ball, or if they are (they are), they’re not yet ready to tip their hand regarding what the organization intends to pursue through its messaging on behalf of its members.

“We’re not in a place to see where we are going to land,” Coughlin said. “We are hearing from our members about the complexities of rules and what’s included. It’s hard to overestimate how complex this is. ONC and CMS in designing broader exchange of information is something that speaks very well of them, but (this is) complex in interpretation and implementation.”

With the information blocking exceptions, the default is broader sharing of information across the spectrum. More information has to be shared, and expectations need to be defined, they said. From HIMSS’ perspective, compliance is the primary issue for its members. The question that needs answering is what kind of burden is being placed on health systems and providers. Leary is confident HIMSS will spend a good bit of ink in its response citing potential concerns over information blocking and what that might mean.

“It will be helpful for the community to have examples and use cases for what’s included especially for exceptions for information blocking,” Coughlin said. “We need examples to clearly define the difference between health information exchange and health information network.”

Continue Reading

A Single Source of Truth: Data Management In Healthcare

By Bill Kotraba, vice president, healthcare solutions and strategy, Information Builders.

Bill Kotraba

Data has long been a popular topic in healthcare and is even more so after this year’s HIMSS. The industry is buzzing about the joint CMS and ONC announcement, which proposes a framework to improve interoperability and support seamless, secure access to health information. The pressure is on for healthcare organizations to tackle their data as the two agencies strive to give patients the ability to leverage personal information in various applications. And this pressure will only increase as we look to the future, making it even more imperative that payers and providers address the issue now.

Beyond interoperability

Look more closely, and you will see that with their recent announcement CMS and the ONC are focusing on healthcare organizations’ ability to manage data across the enterprise. Historically, healthcare has worked from siloed applications and data sources with light integration using interface engines. Recently, healthcare organizations have pinned their hopes on leveraging data effectively through huge investments in new EHR platforms. The reality, pointed out by government officials at HIMSS in Orlando, is that this still results in significant challenges for healthcare organizations to manage information across the data value chain.

Although not part of their proposed framework, CMS and the ONC point out the need for better patient mastering across data sources. Organizations hoped their investment in a centralized EHR platform would solve this, but that has proven not to be the case. In addition to patient data, healthcare organizations face challenges in mastering physician data, which can have wide impact, including on value-based care initiatives. The joint proposal also highlights that the ability to push accurate, cleansed data back to source systems is critical.
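
Patient mastering itself is a non-trivial matching problem. The toy sketch below groups records from different source systems on a normalized demographic key; real master patient index tooling relies on probabilistic matching and data stewardship, so treat this purely as an illustration of the concept.

```python
# Toy sketch of patient mastering: group records from several source systems
# by a normalized (last name, first initial, date of birth) key. Production
# MPI tools use probabilistic matching; this only illustrates the idea.
from collections import defaultdict
from typing import Dict, List


def normalize_key(record: Dict[str, str]) -> tuple:
    last = record["last_name"].strip().upper()
    first_initial = record["first_name"].strip().upper()[:1]
    dob = record["dob"]  # assumed already ISO formatted (YYYY-MM-DD)
    return (last, first_initial, dob)


def master_patients(records: List[Dict[str, str]]) -> Dict[tuple, List[Dict[str, str]]]:
    """Cluster source records that appear to describe the same patient."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[normalize_key(rec)].append(rec)
    return dict(clusters)


# Example: the same patient appearing in the EHR and a claims feed.
records = [
    {"source": "ehr", "first_name": "Maria", "last_name": "Lopez", "dob": "1984-02-11"},
    {"source": "claims", "first_name": "M.", "last_name": "LOPEZ", "dob": "1984-02-11"},
]
print(master_patients(records))
```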

Healthcare needs a unified approach

Using FHIR to stop data blocking and push the industry towards a standards-based approach will help, but it’s not sufficient for the data challenges facing healthcare organizations. In addition to tackling the issues pointed out at HIMSS, healthcare organizations must:

Continue Reading