Tag: interoperability

Gazing Into the Crystal Ball: What 2016 Will Look Like for Health IT Consultants

Guest post by Ben Weber, managing director, Greythorn.


This is the time of year when people are looking into their crystal balls and telling all of us what they see happening in the next 12 months. Some of these predictions will be wild (aliens will cure cancer!) and some will be obvious (more health apps in 2016!). But how many will be helpful?

As I gaze into my own crystal ball, I have to admit I’m also peeking at my email (I like to multi-task). I can’t really say if it’s inspired by the swirling lights of the magic orb on my desk, or if it’s because of the inquiries from clients, messages from my management team and RFPs from various hospital systems … but I also have a prediction for the New Year: 2016 will be the year of migration for Epic and Cerner consultants.

The United States healthcare industry has made great progress in EHR implementation—to the point where implementation is no longer the primary conversation we’re having. Now we’re discussing interoperability, whether we’re using ICD-10 codes correctly, how (and whether) we should integrate the data collected from wearable fitness technology, and more. Those discussions—and the decisions made as a result—will continue to require human intelligence and power, but in 2016 there will be decreased demand for consultants on these projects. Healthcare IT professionals who have grown accustomed to this kind of work will either have to settle into full-time employment—or turn their nomadic hearts north to Canada.

Our neighbors on the other side of the 49th parallel are ramping up their EHR implementations, which is good news for consultants interested in using their passports. Implementations in the US are slowing down, and while there is still work available, it is not as constant and may not command the same hourly rates as in years past. Meanwhile, several leading Canadian healthcare IT organizations have already warned of a looming talent shortage in their country, the effects of which are beginning to be felt.

Epic and Cerner specialists are particularly in demand, as there is a dearth of experienced talent. Among Canadian healthcare IT professionals who have worked with an EMR, 28 percent report familiarity with MEDITECH, 13 percent with Cerner and 7 percent with McKesson. Only 4 percent have worked with Epic, according to the 2015 Canadian Healthcare HI & IT Market Report.

Continue Reading

Interoperability for Real; It’s Finally Here

Guest post by Sanjeev Agrawal, president, LeanTaaS Healthcare.  


Interoperability will be healthcare IT’s biggest trend in 2016 as the industry finally sees momentous forward movement.

In fact, interoperability is not a new trend. It has been an important mission (and a challenge) for healthcare administrators for decades, but the past couple of years have been game-changing:

Continue Reading

2016: The Shakeup Year for Health IT

Guest post by Robert Williams, MBA/PMP, CEO, goPMO, Inc.


I continue to view 2016 as a shakeup year in healthcare IT. We’ve spent the last five-plus years coming to grips with the new normal of meaningful use, HIPAA and EMR adoption, coupled with the desire to transform the healthcare business model from volume to value. After the billions of dollars spent on electronic health records and hospital/provider acquisitions, we see our customers looking around and asking how they have really benefited and what is still left to accomplish.

All politics is local

Our healthcare providers are realizing their clinical applications, specifically EMR vendors, are not going to resolve interoperability by themselves. When the interoperability group CommonWell formed in 2013, much of the market believed the combination of such significant players (Cerner, Allscripts, McKesson, Athenahealth and others) would use their strength to accelerate interoperability across systems. Almost three years later, CommonWell has only a dozen pilot sites in operation.

Evolving HL7 standards and a whole generation of software applications are allowing individual hospitals to take the task of interoperability away from traditional clinical applications and create the connectivity themselves.
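To make that concrete, here is a minimal sketch (in Python) of the kind of glue code hospitals now write for themselves: pulling patient demographics out of a simplified, hand-made HL7 v2 segment. Real integrations rely on interface engines and full HL7 libraries rather than string splitting, so treat this as illustration only.

```python
# Hypothetical, simplified HL7 v2 PID segment (field values invented).
msg = "PID|1||12345^^^HOSP^MR||DOE^JANE||19620320|F"

fields = msg.split("|")                  # HL7 v2 separates fields with '|'
patient_id = fields[3].split("^")[0]     # components are separated by '^'
last, first = fields[5].split("^")[:2]   # name is stored as LAST^FIRST
dob, sex = fields[7], fields[8]

print(f"Patient {patient_id}: {first} {last}, born {dob}, sex {sex}")
```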

Help wanted

Black Book’s survey, published last month, found that three out of every four hospitals with more than 300 beds are outsourcing IT solutions. Hospitals have traditionally been understaffed for the onslaught of federal requirements. Can they also evolve into product deployment organizations, with all the expertise that requires across the organization? Most are saying no and seeking out specialty services organizations to supplement their existing expertise and staff.

Are you going to eat that?

Patient engagement is on fire right now at the federal level (thank you, meaningful use Stage 3), in investment dollars and within the provider community. But to truly manage hospital readmissions and select chronic diseases (diabetes, obesity and congestive heart failure, for example), providers need data and trend analysis on daily consumer behavior. The rise of wearable technology and the ability to capture and analyze the data from these devices will be a major focus going forward. These technologies will likely help make us healthier, but with a bit of a Big Brother side effect.

Continue Reading

5 Big Health IT Trends in 2016

Guest post by Mohan Balachandran, co-founder and president, Catalyze.


As we look back upon 2015, we can reflect, review and, based on that and other factors, make some predictions about what next year will bring. John Halamka had an interesting post reflecting on the bigger challenges, such as ICD-10, the Affordable Care Act and its implications for data analytics, the HIPAA omnibus rule and its impact on cybersecurity and audits, and the emergence of the cloud as a viable option in healthcare. We can expect to see some of these trends continue and grow in 2016. So, based on these key learnings from 2015, here are a few predictions for 2016.

Cybersecurity will become even more important

In 2015, insurers and medical device manufacturers got a serious wake-up call about the importance and cost of cybersecurity lapses. Healthcare data will increasingly be viewed as strategic data: a stolen credit card can be replaced, but a diagnosis cannot be changed, so the possibilities for misuse are significant. Just as the financial industry has settled on PCI as its standard, expect the healthcare industry to come together to define and promote a standard and an associated certification. HITRUST appears to be the leader, and recent announcements are likely to further cement it as the healthcare security standard. Given all that, one can safely expect spending on cybersecurity to increase.

IoT will get a dose of reality

The so-called Internet of Things has been undergoing a boom of late. However, the value from it, especially as applied to quantifiable improvement in patient outcomes or improved care, has been lacking. Detractors point out that the quantified-self movement, while valuable, self-selects the healthiest population and does little to address the needs of older populations suffering from multiple chronic diseases. Expect to see more targeted IoT solutions, such as those from Propeller Health, that focus on specific conditions, have clear value propositions and savings, and offer more than just a device. Expect some moves from Fitbit and others who have raised lots of cash recently, in terms of new product announcements and possible acquisitions.

Continue Reading

Solving and Resetting Healthcare’s Largest Hurdle for 2016: Interoperability

Guest post by Steve Yaskin, CEO, Health Gorilla.


Electronic health records (EHRs) were supposed to transform the healthcare industry in the same way that digital technology has transformed the rest of our lives – organize and simplify. EHRs held the promise of easier access to patient health history, greater patient engagement, and improved clinical decision making and outcomes. And yet, despite the potential, electronic health records thus far have proven to be just another industry headache. Doctors contend with complicated and incompatible systems that stifle collaboration and enhanced patient care. Patients lack adequate access to their own records and methods to conveniently communicate with their care team.

While patients and doctors struggle, EHR system vendors benefit from the stagnant and uncompetitive market, charging exorbitant installation and maintenance fees, with no real incentive to innovate. It is a broken system, but it can be fixed, with the tech industry’s penchant for disruptive innovation. There is great opportunity for tech companies to develop fixes that will benefit customers and reignite development in digital healthcare.

Electronic medical records are currently locked away in walled gardens that inhibit vital information exchange between care team members and patients. These walls need to be broken down to allow for the collaboration that patients expect between their care team members. EHRs based on Software-as-a-Service (SaaS) platforms would allow vendors and medical providers to cut installation and maintenance costs, while offering genuine compatibility and simplicity. SaaS platforms are also cost-efficient, with transaction-based business models that only require subscription and access fees. A SaaS health record system would be cost-effective, compatible, and ultimately serve the doctors and their patients.

Currently, one patient can have several associated identifiers from different physicians, hospitals and EHR vendors. Data is often duplicated and workflow becomes complicated for providers. An industry-wide standard could work, but there is no guarantee that a solution can be selected and implemented nationwide in a timely manner. An outside approach would offer much-needed perspective and an injection of fresh ideas into the conversation. Silicon Valley could assist by developing simpler, tech-based solutions, with industry stakeholders providing input. For instance, a master patient index, successfully driven by heuristic real-time matching algorithms, would offer similar functionality to the universal account log-ins offered by Facebook and Google and further simplify access to electronic health records.
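As a purely hypothetical sketch of what heuristic real-time matching can look like, the Python snippet below scores two demographic records and links them when a weighted similarity clears a threshold. The field names, weights and threshold are illustrative assumptions, not any vendor’s actual algorithm.

```python
# Sketch of heuristic patient matching for a master patient index (MPI).
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted score across demographic fields; higher means the two
    records are more likely to describe the same patient."""
    score = 0.0
    score += 0.35 * similarity(rec_a["last_name"], rec_b["last_name"])
    score += 0.25 * similarity(rec_a["first_name"], rec_b["first_name"])
    score += 0.25 * (1.0 if rec_a["dob"] == rec_b["dob"] else 0.0)
    score += 0.15 * (1.0 if rec_a["zip"] == rec_b["zip"] else 0.0)
    return score

hospital_record = {"first_name": "Jon", "last_name": "Smith",
                   "dob": "1950-03-02", "zip": "46601"}
clinic_record = {"first_name": "John", "last_name": "Smith",
                 "dob": "1950-03-02", "zip": "46601"}

# Scores above a tuned threshold are treated as the same patient.
if match_score(hospital_record, clinic_record) > 0.85:
    print("Likely match: link both records under one MPI identifier")
```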

EHRs should behave more like part of a “clinical network” that combines simplified workflows with stronger communications. Lab tests, referrals, pre-authorizations and results can be delivered instantly, retooling today’s overcomplicated systems into a more effective transactional ecosystem. The network simplifies physicians’ day-to-day activities and aggregates the collected data into an electronic health record. Tapping into the success of social and business platforms such as Facebook Messenger and Slack, secure communication between patients and their complete care team, built around these universal health records, adds a layer of proactive care management that was previously unattainable.

Continue Reading

The Island of Misfit EHRs

Guest post by Dr. Tom Giannulli, CMIO, Kareo.


As any holiday TV-loving baby boomer can attest, the island of misfit toys is not a happy place. In the 1964 stop-motion animated television show, the “unwanted” toys were destined to live out their lives without the joy of playtime with the children they were built to please. Unfortunately, some EHR products share certain misfit qualities that can make them more difficult for a busy provider to use.

So how do you know if you are using a misfit EHR? Here are a few signs:

You have committed to an outdated and under-supported EHR system for your practice: you are land-locked by an older system that is not cloud-based and does not leverage the many cloud resources for communication and interoperability. Sound familiar?

Your technology is old, the code base has been put on the shelf by the EHR vendor, and no updates are coming, despite the rapid changes surrounding your practice and the healthcare industry in general.

You feel isolated, and when you call for support you get little to no relief, as the vendor has moved on to bigger and better customers. In the TV show, Santa promised to come back to save the misfits, just as your EHR vendor promised customized support, ongoing upgrades and improved efficiency. But the costs are prohibitive and your confidence in the vendor is low.

Maybe it is time to get off the island and hitch a ride with a new vendor. If a new EHR is on your holiday list, here are some criteria you should consider:

Cloud-Based Platform

Leverage the power of the cloud to connect to labs, e-prescribing networks, HIEs and other data hubs such as the CommonWell Health Alliance. With a cloud-based EHR system these connections are built into the application, and any new features or connections to other entities become available to all users, with no upgrades or updates to your infrastructure required.

There is no need to buy expensive hardware and servers, or to hire IT support staff to manage them. All you need to run a cloud-based EHR is a desktop web browser or a mobile device.

Continue Reading

Don’t Let the Transition to Value-Based Care Throw Your Practice Off Course

Guest post by Alok Prasad, CEO, RevenueXL.

Though many Medicare and private payer reimbursement programs that require practices to begin moving to value-based compensation already have set sail, most small practices are still treading water near the shore when it comes to this new wave of payment models.

While admirable in their care goals, these quality-based reimbursement programs can pose significant challenges for small providers. In fact, for some practices they require a whole new way of providing care, as well as new documentation, integrated data analysis, care coordination with other providers, payer reporting applications and, often, new technologies that can support these provisions.

What’s more, all this change also can be quite expensive for small practices and wreak havoc on current business practices.

Set the course
No doubt about it, though, the move to value-based care is on. According to the 2015 Physician Compensation Survey, conducted by Physicians Practice magazine, 63 percent of physician compensation is currently tied to productivity, 37 percent to value metrics and 29 percent to patient satisfaction scores.

The Centers for Medicare and Medicaid Services (CMS), however, has expressed its goal of having more providers participate in value-based plans each year, reaching 50 percent by 2018. It has further incentivized physician participation by specifying increasing payment reductions for non-participation, beginning in 2013.

So unless they want to start leaving money on the table, practices have no choice but to take the plunge into such new compensation programs.

Lift the Anchor
Before diving in and potentially draining money and resources to participate in such programs, physicians need to look around and assess their current situation to determine how the new reimbursement model might work in their practice. For example, they need to evaluate current technology, vendors, resources and physician support to determine what changes they need to make, as well as what internal infrastructure they can use.

Continue Reading

Good Samaritan’s EHR System Integrated with HIE Creates Healthcare Interoperability

Guest post by Thanh Tran, CEO, Zoeticx, Inc.


The long-awaited road to true healthcare IT system interoperability is being paved at Good Samaritan in Indiana, enabling the 232-bed community healthcare facility to better deliver on its commitment to exceptional patient care. The system will also enable the hospital to substantially increase its revenue while containing healthcare system integration costs.

“We strive to be the first choice for healthcare in the communities that we serve and to be the regional center of excellence for health and wellness,” said Rob McLin, president and CEO of Good Samaritan. “We are proud to be the first hospital in the country to implement this great integrated health record system that will allow us to provide a much higher level of continuity of care for our patients, as they are our top priority.”

The integration is being made possible with Zoeticx’s Patient-Clarity interoperability platform, which will integrate WellTrackONE’s Annual Wellness Visit (AWV) patient reports with the Indiana Health Information Exchange (IHIE) and the hospital’s Allscripts EHR. IHIE is the largest HIE in the US, serving 30,000 physicians at 90 hospitals and six million patients across 17 states.

Revenue Generator for the Hospital

WellTrackONE and Zoeticx enable patients’ AWV data to flow from the application into the Allscripts EHR and the IHIE system. With Zoeticx’s Patient-Clarity platform and WellTrackONE’s software, the integration brings increased revenue from the Centers for Medicare & Medicaid Services (CMS) and decreased IT costs for medical facilities.
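For illustration only, the sketch below shows one generic way report data can flow between systems: wrapping a report in a standard FHIR DocumentReference and POSTing it to an exchange endpoint. The endpoint URL and patient ID are placeholders, and this is not Zoeticx’s, WellTrackONE’s or Allscripts’ actual API.

```python
# Hypothetical push of an AWV report to a FHIR endpoint (placeholders only).
import base64
import json
import urllib.request

awv_report_text = "Annual Wellness Visit findings ..."

document = {
    "resourceType": "DocumentReference",
    "status": "current",
    "subject": {"reference": "Patient/example-patient-id"},  # placeholder
    "content": [{
        "attachment": {
            "contentType": "text/plain",
            "data": base64.b64encode(awv_report_text.encode()).decode(),
        }
    }],
}

req = urllib.request.Request(
    "https://fhir.example-hie.org/DocumentReference",  # placeholder endpoint
    data=json.dumps(document).encode(),
    headers={"Content-Type": "application/fhir+json"},
    method="POST",
)
# urllib.request.urlopen(req) would submit the report; it is left out here
# because the endpoint above is a placeholder.
```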

Medicare pays medical facilities $164.84 for each initial patient visit under the AWV program and $116.16 for each additional yearly visit. With the AWV integration in place, the hospital is now able to meet CMS’s stringent requirements for patient reimbursements.

It is estimated that Good Samaritan will be able to generate $500 to $1,200 per AWV patient from follow-up appointments, additional testing and referrals for approximately 80 percent of the Medicare patients flagged by the AWV for testing, imaging and specialty referrals within the hospital.
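A quick back-of-the-envelope model shows how the payment rates and the 80 percent flag rate quoted above translate into revenue; the patient volume is an assumption for illustration.

```python
# Rough AWV revenue model using the figures quoted in this article.
INITIAL_VISIT_RATE = 164.84    # Medicare payment per initial AWV
FOLLOWUP_VISIT_RATE = 116.16   # Medicare payment per additional yearly AWV

medicare_patients = 1000                      # assumed annual AWV volume
flag_rate = 0.80                              # ~80% flagged for follow-up
downstream_low, downstream_high = 500, 1200   # per flagged patient

visit_revenue = medicare_patients * INITIAL_VISIT_RATE
renewal_revenue = medicare_patients * FOLLOWUP_VISIT_RATE  # later years
flagged = medicare_patients * flag_rate

print(f"Initial AWV revenue:  ${visit_revenue:,.2f}")
print(f"Yearly renewal AWVs:  ${renewal_revenue:,.2f}")
print(f"Downstream referrals: ${flagged * downstream_low:,.0f} "
      f"to ${flagged * downstream_high:,.0f}")
```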

The Medicare population is expected to trend upward into 2050, and as the population ages it will create billions in new healthcare revenue throughout the US. The hospital is not charged any costs for the system until it is reimbursed by CMS.

Overcoming Healthcare System Limitations

The hospital began offering Medicare’s AWVs a few years ago, but had to develop its own tracking protocols, which strained its budget and staff resources. The system it created also performed poorly, giving hospital staff access to only about 10 percent to 15 percent of patient data.

Good Samaritan medical teams were also constrained by the lack of interoperability, having to enter new illness findings and other medical information manually and fax PDFs to other facilities, where the data would have to be entered again into a different system. The hospital had all of the data contained in WellTrackONE and Allscripts’ system, but no way to integrate the two, let alone achieve that integration with IHIE. Providers were also spending valuable patient face time trying to find specific patient data buried in the EHR system.

“Our systems were working fine, independently of each other,” said Traci French, director of business development and revenue integrity. “But we could not achieve true interoperability between the two systems. The best we could do was basically reshuffling PDF documents. The next challenge was to integrate that data with the exchange. We needed to get data to providers where they needed it, when they needed it.”

Continue Reading

Reducing the Negative Side of Prior Authorization

Guest post by Robert S. Oscar, R.Ph., CEO and president, RxEOB.


Prior authorization exists to reduce drug costs, manage appropriate brand medication prescribing and curb medication abuse. Despite its good intentions, this extra step to determine whether a drug is appropriate for a patient’s symptoms has gained a reputation for inconvenience among both physicians and consumers.

A 2013 study by SUNY Upstate Medical University revealed that U.S. primary care physicians and their office staff spend significantly more time on prior authorization and its associated requirements than they once did. For consumers, hours can be wasted waiting to find out whether they are allowed a particular prescription under the conditions of their health plan.

Reducing this negative aspect of prior authorization is paramount for lowering overall health costs and improving medication adherence. By streamlining the time between medical record lookup and prescription delivery, healthcare organizations and consumers can begin to experience more efficient prior authorization. If advances in big data, mobile health (mHealth) and health IT are prioritized, doctors can confirm drug eligibility faster and help their patients recover sooner.
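As a hypothetical sketch of that faster eligibility check, the snippet below pre-checks prescriptions against a toy formulary table before the patient ever reaches the pharmacy. The drugs, rules and field names are invented for illustration.

```python
# Toy e-PA pre-check: consult formulary rules at prescribing time.
FORMULARY = {
    "metformin":      {"requires_pa": False},
    "brandstatin-x":  {"requires_pa": True, "generic": "atorvastatin"},
}

def precheck(drug: str) -> str:
    rule = FORMULARY.get(drug)
    if rule is None:
        return f"{drug}: not on formulary; submit an electronic PA request"
    if not rule["requires_pa"]:
        return f"{drug}: approved, no prior authorization needed"
    generic = rule.get("generic")
    if generic:
        return f"{drug}: requires PA; consider generic '{generic}' instead"
    return f"{drug}: requires PA; submit an electronic PA request"

for drug in ("metformin", "brandstatin-x"):
    print(precheck(drug))
```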

Below are five reductions that can come from implementing electronic prior authorization (e-PA):

Reduced Labor Costs: When a doctor pulls up a patient’s medical records, he must sift through numerous data points to determine which drugs are approved and which will require prior authorization. The hours spent processing this data are costly for healthcare staffing, but that lost time can be reduced by moving the process online and implementing electronic methods, allowing physician offices and PBMs to review, submit and determine authorization almost immediately.

Reduced Consumer Delays: A consumer will typically experience the unattractive side of prior authorization at the pharmacy. If a doctor issues a prescription without knowing the patient’s medication history, or pushes a popular name-brand drug without suggesting a generic, the consumer will likely get sidelined by prior authorization processing at the point of sale. An e-PA process that can review and determine which drugs a patient is already approved for before they head to the pharmacy can reduce customer wait times and greatly increase consumer satisfaction.

Continue Reading

Releasing the Power of Big Data through Proper De-Identification

Guest post by Lucy Doyle, Ph.D., vice president, data protection, information security and risk management, McKesson, and Karen Smith, J.D., CHC, senior director, privacy and data protection, McKesson.

Today there are opportunities and initiatives to use big data to improve patient care, reduce costs and optimize performance, but there are challenges that must be met. Providers still have disparate systems, non-standard data, interoperability issues and legacy data silos, as well as the implementation of newer technologies. High data quality is critical, especially since the information may be used to support healthcare operations and patient care. The integration of privacy and security controls to support safe data handling practices is paramount.

Meeting these challenges will require continued implementation of data standards, processes and policies across the industry, along with strong data protection and the accurate application of de-identification methods.

Empowering Data Through Proper De-Identification

Healthcare privacy and security professionals field requests to use patient data for a variety of use cases, including research, marketing, outcomes analysis and analytics for industry stakeholders. The HIPAA Privacy Rule established standards to protect individuals’ individually identifiable health information by requiring safeguards to shield the information and by setting limits and conditions on the uses and disclosures that may be made. It also provided two methods to de-identify data, offering a means to free valuable de-identified patient-level information for a variety of important uses.

Depending on the methodology used and how it is applied, de-identification yields quality data that is highly usable, making it a valuable asset to the organization. One of the HIPAA-approved methods is the Safe Harbor method, which requires removal of 18 specified identifiers (protected health information) relating to the individual or to the individual’s relatives, employers or household members. The 18th element requires removal of any other unique characteristic or code that could lead to identifying the individual who is the subject of the information. While meeting the Safe Harbor criteria appears fairly straightforward, doing it properly requires a thorough understanding of how to address certain components, which can be quite complex.
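For a feel of what Safe Harbor-style processing involves, here is a deliberately simplified Python sketch that removes direct identifiers and generalizes quasi-identifiers. The field names are assumptions, and a real implementation must address all 18 HIPAA identifier categories, not the handful shown here.

```python
# Simplified Safe Harbor-style de-identification sketch.
import copy

DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "mrn"}

def deidentify(record: dict) -> dict:
    out = copy.deepcopy(record)
    for field in DIRECT_IDENTIFIERS:
        out.pop(field, None)                  # drop direct identifiers
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"    # generalize ZIP to 3 digits
    if "age" in out and out["age"] > 89:
        out["age"] = "90+"                    # ages over 89 are aggregated
    if "dob" in out:
        out["dob"] = out["dob"][:4]           # keep year of birth only
    return out

patient = {"name": "Jane Doe", "ssn": "000-00-0000", "mrn": "12345",
           "zip": "46601", "age": 92, "dob": "1931-07-04",
           "diagnosis": "type 2 diabetes"}
print(deidentify(patient))
```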

The second de-identification method is the expert method, in which a highly skilled specialist uses statistical and scientific principles and methods to determine the risk of re-identification and render the information not individually identifiable.

We need to encourage and support educational initiatives within our industry so more individuals become proficient in these complex techniques. At McKesson, we are educating our business units so employees can better understand and embrace de-identification and the value it can provide. This training gives them a basic understanding of how to identify and manage risks as well as how to ensure they are getting quality content.

Embracing Social Media and New and Improved Technologies

One of the challenges we face today in de-identifying data is adapting our mindset and methodologies to incorporate emerging technologies and the adoption of social media. It is crucial to understand how released data could potentially be exposed by being combined with other available data. New standards are needed.

Closing Thoughts

While de-identifying data can be challenging and complex, the task is made easier when we remember and adhere to our core directive to safeguard data. With this in mind, incorporating new technologies is part of an ongoing process of review.

When done properly, de-identification enables high quality, usable data, particularly when the expert method is used. De-identification should not be viewed as an obstacle to data usage, but rather as a powerful enabler that opens the door to a wealth of valuable information.