Category: Editorial

Why Chip and PIN Is Obsolete … And So Are Fingerprints

Guest post by Simon Crouch, assistant brand manager, Hitachi Europe Ltd.

Simon Crouch

We live in a world where data and deception go hand in hand. So many everyday activities – from online shopping and banking to emailing and paying bills – are governed by passwords, profiles and personal details.

And as people’s phones, cars, and homes get smarter and more connected, the number of ways criminals can try to access and abuse your personal information is only going to rise.

Most people rely heavily on passwords to protect their information. But as quickly as organizations and financial institutions build safer systems, hackers find smarter ways to commit cybercrime, and cases of identity theft keep multiplying.

The payments landscape

For debit and credit card purchases and online banking, suppliers are shifting from chip and PIN to contactless and app-based payment technologies. But these still have one thing in common: a thief who steals your card or phone may still be able to access your cash or personal information.

Finger vein recognition

Biometric technology has become the focus of innovative new ways to authenticate people’s identities. Biometrics includes fingerprints, iris scanning and facial recognition, but it’s finger vein recognition that looks set to shake up the way we secure our data.

Leading scientists at Hitachi, which patented the technology in 2005, have been developing new ways to incorporate VeinID into the everyday payments and personal data landscape.

How does it work?

The Hitachi sensor works by transmitting near-infrared light through the finger. This is partially absorbed by haemoglobin in your veins, which enables the device to capture your unique finger vein pattern profile. This is then matched with your account’s pre-registered profile to confirm your identity.
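The matching step can be pictured as comparing a freshly captured pattern against the pre-registered template and accepting the user only when the two are close enough. The sketch below is a minimal illustration of that idea, assuming a simple cosine-similarity comparison; the feature representation, similarity measure and threshold are illustrative choices, not Hitachi's actual VeinID algorithm.

```python
import numpy as np

# Illustrative acceptance threshold -- not Hitachi's actual value
MATCH_THRESHOLD = 0.95

def normalize(pattern):
    """Flatten a captured vein-pattern image and scale it to unit length."""
    flat = np.asarray(pattern, dtype=float).ravel()
    norm = np.linalg.norm(flat)
    return flat / norm if norm else flat

def similarity(live_capture, registered_template):
    """Cosine similarity between the live capture and the stored template."""
    return float(np.dot(normalize(live_capture), normalize(registered_template)))

def authenticate(live_capture, registered_template):
    """Accept the user only if the live capture closely matches the
    pre-registered profile."""
    return similarity(live_capture, registered_template) >= MATCH_THRESHOLD

# Example: a stored template and a slightly noisy live scan of the same finger
rng = np.random.default_rng(0)
template = rng.random((64, 64))
live_scan = template + rng.normal(0.0, 0.02, template.shape)
print(authenticate(live_scan, template))   # True -- the patterns closely match
```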

But what makes VeinID safer and more secure than passwords and other security options?

Your veins are unique

No two people, not even identical twins, have the same finger vein pattern. And while most people’s fingerprints are unique, you leave fingerprints on the objects you touch, making it possible for criminals to lift and replicate them for their own use. Because your veins are inside your finger, there is no way for anyone else to know what the pattern looks like, let alone copy it.

Relying heavily on fingerprints has caused public concern in the past. When Apple launched Touch ID a few years ago, people worried about criminals cutting off fingers to gain access to phones and personal data.

While those claims proved outlandish, finger vein recognition users can rest easy knowing that VeinID sensors only work with living tissue. If your finger has been cut (or severed from your hand!), the veins collapse, meaning your unique pattern is lost. Obviously, this doesn’t prevent a determined criminal from cutting off your finger, but if they do, at least they won’t be able to access your personal information.

Continue Reading

Risk Adjustment 101: Ignoring these Could Cost You Millions

Guest post by Abhinav Shashank, CEO and co-founder, Innovaccer.

Abhinav Shashank

For a long time, health insurance companies tended to overlook people who were likely to be high-cost. As the landscape changed with new regulations, insurers began offering policies in the individual market without excluding applicants for pre-existing conditions or inquiring about their health status. Even so, many loopholes remain, and every administration has aimed, and continues to aim, at minimizing these gaps. The one good answer thus far: risk adjustment.

What is risk adjustment?

Over the years, risk adjustment has become a key mechanism used in healthcare to predict costs and ensure appropriate payments for Medicare Advantage plans, Part D plans and commercial health plans. Historically it was used only in Medicare and Medicaid, but lately it has become an actuarial tool to ensure that health insurance plans have adequate funding and face no financial hindrance in providing care to high-risk, high-need patients. It also allows insurance companies and their plans to be compared on the quality and services they offer, providing a strong foundation for value-based purchasing.

Why is risk adjustment so important?

Risk adjustment promotes fair payments to health insurance plans by judging them on their efficiency and encouraging the provision of high-quality care. Beyond that, here’s why risk adjustment is important:

How is risk adjustment used in healthcare?

Healthcare risk adjustment methodologies can be used to account for changes in severity and case mix among patients over time. Risk adjustment has been critical in reducing “cherry picking” among health plans. The dimensions of risk in care broadly fall into three categories:

It’s important to ensure that, in providing incentives to enroll high-cost individuals, the resources needed to treat the relatively healthy population efficiently and effectively remain available, without overcompensating plans. The methodology used to risk-adjust premiums varies based on the following:

On the macro level, unless a state combines its individual and small group markets, separate risk adjustment systems operate in each market. The Department of Health and Human Services (HHS) developed a risk adjustment methodology in which an individual risk score is assigned to each enrollee. Diagnoses are grouped into Hierarchical Condition Categories (HCCs), each carrying a numerical value; enrollee scores are then averaged to calculate the plan’s average risk score. Payments and charges are calculated by comparing each plan’s average risk score to a baseline premium.
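To make those mechanics concrete, the sketch below walks through the calculation with made-up numbers: each enrollee's recorded HCCs map to coefficients, the individual scores are averaged into a plan-level risk score, and the gap between that score and the market average drives a payment or a charge. The coefficient values, demographic factor and baseline premium are illustrative assumptions, not the actual HHS model.

```python
# Illustrative HCC coefficients -- the real HHS model coefficients differ
HCC_COEFFICIENTS = {
    "HCC18_diabetes_with_complications": 0.318,
    "HCC85_congestive_heart_failure": 0.331,
    "HCC111_copd": 0.335,
}
DEMOGRAPHIC_BASE = 0.9  # assumed demographic factor applied to every enrollee

def enrollee_risk_score(hcc_codes):
    """Sum the demographic factor and the coefficient of each recorded HCC."""
    return DEMOGRAPHIC_BASE + sum(HCC_COEFFICIENTS.get(c, 0.0) for c in hcc_codes)

def plan_average_risk_score(enrollees):
    """Average the individual risk scores across all plan enrollees."""
    scores = [enrollee_risk_score(codes) for codes in enrollees]
    return sum(scores) / len(scores)

def transfer_amount(plan_score, market_average_score, baseline_premium):
    """Payment (positive) or charge (negative) relative to the market baseline."""
    return (plan_score - market_average_score) * baseline_premium

# Example: a small plan with three enrollees and their recorded HCCs
enrollees = [
    ["HCC18_diabetes_with_complications"],
    [],
    ["HCC85_congestive_heart_failure", "HCC111_copd"],
]
avg = plan_average_risk_score(enrollees)
print(round(avg, 3), round(transfer_amount(avg, 1.0, 4500.0), 2))
```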

Continue Reading

The Importance of Keeping Up with Your Medical Records

Guest post by Saqib Ayaz, co-founder, Workflow Optimization.

Saqib Ayaz

Thanks to technological innovation, more and more healthcare facilities are now adopting the use of electronic health records (EHRs). Patients now have more opportunities to consult with their physicians about their medical records. Increased access to EHRs also means that providers will now be able to easily share patient information with other providers. The goal of increasing access to medical records is to improve the continuity of care, as well as enhance patient safety.

As more patients are able to access their records, they can improve the accuracy of what those records contain by asking questions about their medical information, identifying inaccuracies and supplying additional details that help correct the data. Incorporating feedback from patients themselves shows that patients play a crucial role in improving the quality of information in their medical records.

The rewards of keeping up with your medical records are quite obvious.

First, it is the best way to ensure that your physician understands what you communicate to them, and that you understand what they communicate to you. Even though the benefits are clear, many people are reluctant to request their medical records. Worse still, countless individuals do not know that they can. Every individual is entitled to complete access to their chart from any medical facility that has ever dispensed care to them.

Not only are you obligated to share information with your doctors, but the information you give makes a difference in how you respond to the treatment prescribed. Accurate information improves your chances of successfully complying with the prescribed therapies, which in turn allows you to recover and heal in the shortest time possible.

What is contained in your medical records?

There is a difference between your official medical records and the scribbled notes that are typically handed to you after a consultation. Most scribbled notes simply contain a generic outline of your symptoms and a short prescription, often written in a code that many individuals cannot decipher. These are not your medical records.

Your official medical records contain all the juicy details of your medical journey: your lab results, physician’s notes, past and present allergies and reactions to medicines, blood pressure readings and basically anything that concretely makes up your entire health profile.

Continue Reading

Does Health Tech Innovation Reward Early Adopters?

Guest post by Edgar Wilson.

Edgar Wilson

T-Mobile recently became the first cell phone carrier to offer free in-flight Wi-Fi (in support of Wi-Fi texting, as cellular signals are still not allowed) to all its customers. Admittedly, this was technically on the strength of partnering with a third-party platform, Go-Go, but the carrier gets the glory of being first among its big four peers to take even this step.

In-flight Wi-Fi, Wi-Fi calling, and similar services aren’t necessarily new technology, but having support for limited internet browsing and texting, all delivered through one of the top carriers in the nation, makes for a reasonably good elevator pitch—especially if you happen to be a T-Mobile customer. But the importance of the development isn’t just the novelty of the technology or the value of the service on offer; it is planting a shining pink flag in the market and staking that claim of being “first.”

Early Adoption, Arrested Development

Being first hasn’t lost its luster yet, even in a time when consumer expectations are sometimes a generation or two ahead of current technology. Hospitals and their leadership recognize this, and so, despite uncertainty on everything from insurance market regulations to the future of EHR integration, many are taking strides to do as T-Mobile has done — and find a way to get there first on a variety of issues important to consumers. And like T-Mobile, being first doesn’t have to mean getting into the weeds of proprietary innovation and product development—although plenty of larger chains and clinics do take that route; for many hospitals, being first can be accomplished through strategic partnerships with tech-centric companies.

If there is one lesson out of Silicon Valley that has entered the American zeitgeist, it is that being the first out with something can give a company, product or even team of creatives a lot of leeway in terms of going on to iterate, improve, and generally tinker. But on the healthcare front, we see how the drive to be first—or even keep pace with the rest of the industry—can create a “hurry up and wait” situation where meaningful progress sometimes lags fanfare or technology.

That is why the top tech trends in healthcare don’t change much year to year; end users, hospital administrators, and tech developers are all still trying to figure out what works, what works best, and how to integrate new tools into the clinical workflow, the patient experience, and the regulatory environment governing it all.

That is the story of EHRs in a nutshell: a good idea, a rush to adoption (both willing and coerced), and then a lengthy period of iteration as all stakeholders struggle to recreate or holistically reconsider the context in which this new system can, and should, operate. But the rush to adopt first and configure later isn’t limited to high technology in the healthcare sector; it pretty well describes the legal environment surrounding health insurance.

Industry Leadership: Being First or Being Best?

From how it affects patients to what it is still trying to influence in the provider space, the conversation about care and coverage is still shepherded primarily by fear, secondarily by outrage, and in most other respects by confusion. So it looks like we’ll be shopping the exchanges for a while longer, even under President Trump’s watch.

After the country gazed into the abyss that was Trumpcare, the still-evolving status quo that is Obamacare is more popular than ever. Here again, the power of being first seems to give residual sticking power to a law frequently and publicly dragged through the mud by people and organizations with at least as much visibility and influence as a company like T-Mobile.

Continue Reading

The Healthcare IT Data Revolution: Maintaining Independence With Innovative HCIT Systems

Guest post by Scott Ciccarelli, CEO, SRS Health.

Scott Ciccarelli

At the beginning of their existence, electronic health records (EHRs) were used primarily as document management systems. Now they have realigned their objectives, and their value to the physicians and practices they serve, to focus on data intelligence. If specialty practices want to stay independent, they need to continue to evolve, prioritize value-based care and stay profitable. Moreover, they need the right partners to help enhance operational efficiency, increase patient engagement and achieve better clinical outcomes. As such, the scope of the EHR’s responsibility for the practice’s health, growth and sustainability has increased exponentially.

How will specialty practices ensure their future? By leveraging the power of the clinical and operational data in their EHR and supplemental business applications, working together within the healthcare IT (HCIT) ecosystem. Businesses across all industries analyze data to measure performance. Metrics are the foundation of any successful business, and physician groups are no exception; metrics should be the driving force behind every major decision that aims to boost productivity. Physicians are not data scientists, however. By utilizing next-generation HCIT systems, they can employ technology that streamlines the decision-making process.

Challenges turn into opportunities

According to the Centers for Medicare and Medicaid Services (CMS), 171,000 physicians who did not collect and use data to comply with government regulations are looking at a three percent Meaningful Use penalty in 2017. Coupled with a new focus on value-based care requirements, which play a critical role in care and outcomes, upgrading data platforms and capabilities should be the number-one priority for complying with new industry standards. Data-driven HCIT solution providers can prepare specialty practices for these coming changes by helping them collect and analyze data to ensure effective treatment plans at lower costs.

Bottom line: This helps improve patient health and satisfaction.

Today’s HCIT systems are considered business tools that help physicians analyze data and reveal insights for better decision making. Popular “big-box” HCIT systems try to be all things to all providers, yet they are tailored to hospitals and primary care physicians—many of whom typically see far fewer patients in a day than specialists do. This puts a major burden on specialists, who rely on different clinical and operational data to help maximize outcomes.

Specialists may see up to 60 patients a day – covering surgeries, follow-ups and everything in between. Generic HCIT systems fall short at that appointment volume. Combined with the fact that those systems make data entry inefficient, impede clinical workflows and lack business metrics, this is the major argument for specialty-focused HCIT solutions. Some groups acquired by hospitals or health systems have not adopted the integrated systems of their new parent companies. Instead, they stay with their specialty HCIT systems—interoperable with their parent companies’ technology—because those systems serve existing, proven workflows.

Data insights and a workflow makeover

Specialty HCIT systems that analyze a variety of data and provide practices with the knowledge to improve their performance will deliver the best outcomes for patients and practices. Analyzing operational data provides an understanding of how to deliver the best patient care at the lowest cost, thereby delivering optimal outcomes and increasing patient satisfaction levels.

Specialists should take the opportunity to re-evaluate their EHR and determine if their goals are helped or hindered by their current HCIT ecosystem. A productivity-boosting HCIT system can harness the power of data to deliver clinical and business applications, workflows, and insight through one user interface and make compliance with reporting requirements simple and straightforward.

Continue Reading

How a Digitally Connected Network Improves Specialty Referrals

Guest post by Chris Lukasiak, senior vice president, MyHealthDirect.

Chris Lukasiak

In the U.S., more than a third of patients are referred to a specialist each year, and specialist visits constitute more than half of outpatient visits. Referrals are the link that makes the connection between primary and specialty care. From 1999 through 2009 alone, the absolute number of visits resulting in a physician referral increased 159 percent nationally, from 41 million to 105 million. The volume and frequency of specialty referrals have steadily increased over the years and will only continue to grow. Yet despite this rise, the referral process itself has been a source of great frustration for years.

Specialty referrals are a complicated business. There are many moving parts and players that all have a crucial role to play within the process. By breaking it down and looking at exactly what a referral is, who is involved, and the challenges they face, we can then look to fix what is broken. What needs to be improved? And could there be a digital solution?

Let’s start from the very beginning by looking at the stakeholders and their unique interests and concerns.

Patient – The patient experiences a health concern and needs care to get it resolved. The primary physician doesn’t provide the full solution and refers them to a specialist with more expertise in the patient’s condition. This is where the referral occurs. Currently, the extent of the referral is often the physician handing the patient a phone number to call to schedule the appointment. It’s up to the patient to contact the specialist and follow through with the next step, which explains why 20 percent of patients never even schedule the referral appointment. (A sketch of what a digital alternative could look like follows the stakeholder breakdown below.)

Provider – There is more than one provider involved in the referral process: first the referring (or sending) provider and then the target (or receiving) provider. The referring physician is the provider recommending (referring) the patient to a specialist. The target provider is the specialist who has been recommended. For a health system or physician group, there are obvious financial and quality-of-care benefits when a patient is sent to a trusted provider within the network. When patients don’t go to their referral appointment, the health system or physician group loses in several ways. First, it loses control over providing comprehensive care to the patient. If a patient is readmitted to a hospital because of a failure to follow through on a referral appointment, the health system is penalized for the readmission; the penalty could result in CMS withholding up to 3 percent of the funding provided to the health system. The system also suffers in terms of the perception of its quality of care. And if a patient is not secured with a provider within the network, they may go to a competing system.

Plan – Health plans have several important considerations when a referral happens, and a vested interest on three fronts in ensuring the patient goes to the target provider:

1) The health plan benefits if the patient goes to a target provider within their network. Not only will patients be directed to providers that best meet their needs, but the plan also benefits when patients are referred to the providers in their Smart Network. These providers are trusted for superior care for the patient and reduced costs for the plan.
2) When a plan member doesn’t get the care they need to maintain good health, their likelihood of having major adverse events rises dramatically. This means they will end up in the ER or needing other expensive care, which represents big costs for the health plan.
3) The current approach to referrals often results in long lead times, which makes for a poor patient experience and can increase costs.
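Putting the stakeholder concerns above together, one way to picture a digitally connected referral is as a single shared record that gets booked at the point of care and is closed only once the outcome is known. The sketch below illustrates that idea; the statuses, fields and names are hypothetical and are not drawn from MyHealthDirect's product.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

# Hypothetical statuses for a closed-loop digital referral -- illustrative only
CREATED, BOOKED, COMPLETED, MISSED = "created", "booked", "completed", "missed"

@dataclass
class Referral:
    patient: str
    referring_provider: str
    target_specialist: str
    status: str = CREATED
    appointment: Optional[datetime] = None
    history: List[Tuple[datetime, str]] = field(default_factory=list)

    def book(self, slot: datetime) -> None:
        """Book the specialist appointment at the point of referral,
        instead of handing the patient a phone number to call later."""
        self.appointment = slot
        self.status = BOOKED
        self.history.append((datetime.now(), f"booked for {slot.isoformat()}"))

    def close_loop(self, attended: bool) -> None:
        """Record the outcome so the referring provider and the plan
        both see whether the patient actually received care."""
        self.status = COMPLETED if attended else MISSED
        self.history.append((datetime.now(), self.status))

# Example: the appointment is booked before the patient leaves the PCP's office
referral = Referral("Jane Doe", "Dr. Adams (primary care)", "Dr. Lee (cardiology)")
referral.book(datetime(2017, 9, 14, 10, 30))
referral.close_loop(attended=True)
print(referral.status, [event for _, event in referral.history])
```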

Continue Reading

Managing Denials in the Wake of ICD-10

Guest post by Lindy Benton, president and CEO, Vyne.

Lindy Benton

The world of denials management is a constantly shifting landscape, one that has changed dramatically with the onset of ICD-10. Now more than ever, denials management requires an organizational focus with built-in workflows for prevention, monitoring and tracking of claims through the system.

In the years leading up to ICD-10, providers were apprehensive about the potential drain it would place on both resources and reimbursement. CMS predicted that – with the onset of ICD-10 – denial rates would increase by 100 percent to 200 percent, days in A/R would grow by 20 percent to 40 percent and claims error rates would more than double. CMS warned that error rates could reach a high of 6 percent to 10 percent, significantly higher than the 3 percent average error rate with ICD-9.
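To put those predictions in perspective, the back-of-the-envelope sketch below translates them into cash-flow terms for a hypothetical hospital. Every input figure (the monthly charges, the baseline days in A/R and the midpoints chosen from CMS's ranges) is an assumption for illustration, not data from CMS or any provider.

```python
# Back-of-the-envelope impact estimate -- all input figures are hypothetical
monthly_charges = 10_000_000        # assumed $10M billed per month
baseline_days_in_ar = 50            # assumed days in A/R before ICD-10
ar_growth = 0.30                    # midpoint of CMS's predicted 20-40% growth
icd9_error_rate = 0.03              # ~3% average error rate under ICD-9
icd10_error_rate = 0.08             # midpoint of the warned 6-10% range

daily_charges = monthly_charges / 30
extra_ar_days = baseline_days_in_ar * ar_growth
cash_tied_up = extra_ar_days * daily_charges
extra_errored_charges = (icd10_error_rate - icd9_error_rate) * monthly_charges

print(f"Additional days in A/R: {extra_ar_days:.0f}")
print(f"Cash temporarily tied up in receivables: ${cash_tied_up:,.0f}")
print(f"Extra monthly charges hitting claim errors: ${extra_errored_charges:,.0f}")
```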

Providers also feared cash flow problems stemming from coding backlogs, expected to increase by at least 20 percent because of the complexity of the new coding system. “A typical turnaround time for claims processing of 45 to 55 days could end up being extended another 10 to 20 days,” cited Healthcare Payer News.

And the change has been momentous. With ICD-10, the number of diagnostic codes increased from 13,000 ICD-9 codes to 68,000 ICD-10 codes. The new system challenges providers to document conditions more specifically, supporting codes with thorough and accurate medical documentation.

Despite the gravity of the change, many providers say it has been a smooth transition thus far, with minimal delays in productivity and reimbursement. But as the industry moves through this period of adjustment, providers must continue to seek opportunities to protect revenue and generate cash flow for a successful claims management strategy in the wake of ICD-10.

Organizational Approach

ICD-10 requires an organizational focus on the management, prevention and defense of denials. Denials management is no longer an effort reserved for the revenue cycle alone; it involves all departments. For coders to complete a claim, pieces of information must be collected from multiple areas across the organization. For this reason, all departments should be educated on the part they play and on how cross-department collaboration can aid the process.

In preparing providers for ICD-10, the Healthcare Financial Management Association (HFMA) noted, “Claims denials will not strictly be a matter of clarification that can be handled by a nonclinical person in the billing office. Denials will raise questions about medical necessity or the clarity of medical documentation supporting a code; such questions will require input from a physician, nurse specialists or outside expertise.”

Workflow processes are also critical as hospitals work to achieve accurate coding and get bills out the door. Technologies that streamline hand-offs between departments can help reduce bottlenecks that often delay reimbursement. A work queue keeps denials moving, assigning and tracking accountability at each checkpoint and monitoring progress to ensure no claim falls through the cracks.
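As a rough illustration of that kind of work queue, the sketch below tracks each denial's current owner, logs hand-offs between departments and surfaces anything that has sat untouched too long. The statuses, routing and time threshold are assumptions, not a description of any specific vendor's product.

```python
from collections import deque
from datetime import datetime, timedelta

class DenialWorkQueue:
    """Illustrative denial work queue; fields and rules are assumptions."""

    def __init__(self, stale_after_days=5):
        self.queue = deque()
        self.stale_after = timedelta(days=stale_after_days)

    def add(self, claim_id, reason, department):
        """Log a new denial and assign it to the responsible department."""
        self.queue.append({
            "claim_id": claim_id,
            "reason": reason,
            "assigned_to": department,
            "last_touched": datetime.now(),
        })

    def hand_off(self, claim_id, department):
        """Track accountability as the denial moves between departments."""
        for item in self.queue:
            if item["claim_id"] == claim_id:
                item["assigned_to"] = department
                item["last_touched"] = datetime.now()

    def stalled(self):
        """Surface denials nobody has touched recently, so no claim
        falls through the cracks."""
        cutoff = datetime.now() - self.stale_after
        return [i for i in self.queue if i["last_touched"] < cutoff]

# Example: a medical-necessity denial routed from billing to a nurse specialist
wq = DenialWorkQueue()
wq.add("CLM-1042", "medical necessity unclear", "billing office")
wq.hand_off("CLM-1042", "nurse documentation specialist")
print(wq.stalled())  # empty until an item sits untouched past the threshold
```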

Continue Reading

How Healthcare Organizations Can Reduce Their Reliance on Paper

Guest post by Chris Click, senior healthcare solutions marketing manager, Nuance Communications.

Christopher Click

Many hospitals, clinics and healthcare organizations today talk about going paperless. In fact, according to a November 2016 research report from IDC, more than 40 percent of healthcare organizations report that they have a paper-reduction initiative in place.

Yet even hospitals that have achieved late-stage meaningful use status still receive and process high volumes of paper. This is especially true for important printing workflows, such as medical records, administrative files, admissions documents, prescriptions and pharmacy information. According to a recent survey by HIMSS Analytics, commissioned by Nuance, 90 percent of survey respondents reported some clinicians still use paper-based documents.

There is no escaping the fact that healthcare organizations remain tied to paper, at least for the near term. The IDC study, for instance, found that print volumes are expected to remain flat for the next two years before beginning to decline.

When you consider that this amount of paper is expensive (both in terms of actual printing costs as well as overall document management processes), hard to track, and poses serious security and compliance risks, you may wonder why so many healthcare organizations continue to rely on paper.

To help answer that question, we’ll take a closer look at the reasons cited in the IDC report. We’ll also offer a few best practices any healthcare organization can follow now to reduce its reliance on paper and address the challenges posed by manual, paper-based workflows.

Why Paper Use Continues

According to the IDC report, the top reasons hospitals, clinics and healthcare organizations continue to use paper include incompatible document management systems or technologies. This incompatibility is most notable between an organization and outside facilities, leaving paper processes as the default workaround.

Another reason is that many workflows still require paper documentation, most notably patient check-in/belongings forms, records requiring signatures, consent forms and more. Additionally, the majority of prescriptions and pharmacy records are still paper-based. For example, only 10 percent of responding hospitals indicated that prescriptions were electronic.

Lastly, healthcare organizations are large-scale consumers of fax technology: many hospitals report that they still receive and send up to 1,000 pages per month by fax. Interestingly, these hospitals acknowledge that while faxing may be an antiquated technology, they are behind in implementing newer technology and must continue to focus on what works for them.

Continue Reading