Guest post by Ken Perez, vice president of healthcare policy, Omnicell.
On May 23, the Department of Health and Human Services (HHS) released a report on individual market premium changes from 2013 to 2017 for the 39 states using the federal government’s healthcare.gov platform. The report provided a good gauge of the affordability of the ACA marketplaces.
The HHS report found that all 39 states experienced increases in individual market premiums since 2013. Average premiums rose during the four-year period by 105 percent, a $2,928 increase in the average annual premium. To put the 105 percent premium hike in perspective, it was more than 20 times the growth in the Consumer Price Index (CPI) and more than eight times the nation's healthcare inflation over the same period. While 16 states had premium increases under the national average of 105 percent, 20 states had premium increases between 105 percent and 200 percent. Moreover, three states—Alabama, Alaska and Oklahoma—saw premiums triple, rising more than 200 percent.
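To make the arithmetic concrete, here is the back-calculation implied by those two headline figures. This is my own check, not a number from the report, so the premiums below are approximations:

```latex
% If the $2,928 increase equals 105% of the 2013 average annual premium,
% the implied premiums are:
P_{2013} = \frac{\$2{,}928}{1.05} \approx \$2{,}789,
\qquad
P_{2017} = P_{2013} + \$2{,}928 \approx \$5{,}717
```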
A comparable analysis of the 11 states running their own marketplaces has not yet been conducted, but from 2016 to 2017, their average approved individual market rate increase was 19 percent, over nine times CPI growth and over five times U.S. healthcare inflation.
Multiple interrelated factors have driven these premium increases. Chief among them is lower-than-expected enrollment: an estimated 12 million to 15 million people—disproportionately young and healthy—who were expected to enroll in the marketplaces by the end of 2016 did not do so. Because of that shortfall and the relative non-participation of the young and healthy, the marketplaces have been left with an older, sicker risk pool, producing huge losses for many health plans in spite of the substantial premium increases described above. Consequently, in 2017, 80 insurers left the ACA marketplaces while 11 entered, a net decrease of 69.
The inordinate premium inflation of the marketplaces reflects a cycle that appears to be worsening, so much so that some have described it as a “death spiral.” As health insurers exit the marketplaces, competition decreases, which naturally leads to premium hikes, as well as to a narrowing of plan choices. The higher premiums and fewer choices dissuade people from signing up or cause current enrollees to drop out, further shrinking the risk pool.
If the ACA marketplaces prove to be unsustainable, then access to affordable healthcare plans for millions of Americans—regardless of the availability of premium credits—will be at risk.
During the past few months, hospital organizations have lobbied for changes to the American Health Care Act (AHCA). Their core concerns are the possible growth of uncompensated care as the uninsured population increases and, separately, the popularity of high-deductible plans among the insured, which raises questions about patients' ability to pay their hospital bills. However, as the recent HHS report compellingly shows, hospitals should not only be worried about possible ramifications of the AHCA—the serious, fundamental weaknesses of the ACA's health insurance marketplaces constitute a clear, present and increasing challenge to hospital finances.
Having been born with a heart condition, I have had a chance to see how healthcare has evolved, or stagnated in innovation, because of the inherent risk to the bottom line. Concerns about reduced revenue, patient risk and pressure from big pharma and insurers have kept the status quo in place. It's crazy to think that we can order food from our phones and yet can't even schedule our appointments online at most physicians' offices and hospitals. We have the most expensive and least effective healthcare system in the world; it's broken, and we need to fix it.
There is a lack of technology in healthcare as a whole. Think about when you go into a doctor's office and tell them what's wrong, or when you go to a hospital and nurses track your symptoms: at most hospitals and physicians' offices, they still write it on a piece of paper! What happens when the nurse or doctor can't read what's been written, or, worse, when that paper gets lost? To put that in perspective, hospital errors are the third leading cause of death in the U.S.
Where there is some technology, it is often difficult to use and not standardized, so if you go to an emergency room, that doctor will likely have to spend time trying to get your primary care doctor on the phone to better understand how to care for you. It's happened to me: ER doctors spent hours trying to track down my cardiologist to get a rundown on my medications and which tests should be run, all while I was lying there in pain waiting for care. Standardization of basic medical protocols needs to happen. Even better would be a shared database of medical protocols that AI could search to find the right match, or machine learning that works the way autocorrect and predictive typing do on your phone.
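As a toy sketch of that idea, here is what querying such a shared protocol database could look like, with simple keyword overlap standing in for the fancier matching a real system would use. Every protocol name, keyword and data point below is hypothetical:

```python
# Minimal sketch: score each protocol in a shared database by keyword
# overlap with the patient's context, then return the best match.
# All protocol names and data here are hypothetical illustrations.

PROTOCOLS = {
    "post-MI chest pain": {"chest pain", "cardiac", "beta-blocker"},
    "atrial fibrillation": {"palpitations", "cardiac", "warfarin"},
}

def best_protocol(patient_terms):
    """Return the protocol whose keywords best overlap the patient's terms
    (Jaccard similarity), loosely analogous to predictive-text matching."""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    scores = {name: jaccard(patient_terms, keywords)
              for name, keywords in PROTOCOLS.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]

# An ER doctor's quick notes on the patient, as a set of terms.
patient = {"chest pain", "beta-blocker", "cardiac"}
print(best_protocol(patient))  # -> ('post-MI chest pain', 1.0)
```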
Too much data
Today's doctors and healthcare providers receive copious amounts of data, whether from daily activity trackers, routine measurements, scans or DNA testing, that they must go through in order to properly diagnose a patient. Sometimes there is simply too much data to consider, and they are forced to discard some of it just to cut through the clutter. On top of all that, they must work out how the data correlates with your back pain, sleep issues or whatever other symptom is at issue, and then find the proper medication for you. All of this takes time away from developing a relationship with the patient and properly diagnosing the patient's problems. Let's dive into how machine learning and AI can help with this.
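As a rough sketch of that kind of triage, the snippet below ranks a patient's data streams by how strongly each correlates with the symptom of interest, so the most relevant signals surface first instead of drowning in clutter. The data and column names are synthetic:

```python
import pandas as pd

# Synthetic daily measurements for one patient (hypothetical columns).
data = pd.DataFrame({
    "steps":       [8000, 2000, 7500, 1500, 9000, 1000],
    "sleep_hours": [7.5, 5.0, 8.0, 4.5, 7.0, 4.0],
    "resting_hr":  [62, 75, 60, 78, 61, 80],
    "back_pain":   [1, 4, 1, 5, 2, 5],   # symptom severity, 1-5
})

# Rank every data stream by absolute correlation with the symptom,
# so a clinician sees the strongest signals first.
symptom = "back_pain"
ranking = (data.corr()[symptom]
               .drop(symptom)
               .abs()
               .sort_values(ascending=False))
print(ranking)
```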
We live in a world where data and deception go hand in hand. So many everyday activities – from online shopping and banking to emailing and paying bills – are governed by passwords, profiles and personal details.
And as people's phones, cars and homes get smarter and more connected, the number of ways criminals can try to access and abuse your personal information is only going to rise.
Most people rely heavily on passwords to protect their information. But as quickly as organizations and financial institutions create safer and safer systems, hackers are finding smarter ways to commit cybercrime, and there are more and more cases of identity theft.
The payments landscape
For debit and credit card purchases and online banking, suppliers are making a shift from chip and PIN to contactless and app-based payment technologies, but these still have one thing in common – a thief who steals your card or phone might still be able to access your cash or personal information.
Finger vein recognition
Biometric technology has been the focus of innovative new ways of authenticating people's identities. Biometrics include fingerprints, iris scanning and facial recognition, but it's finger vein recognition that looks set to shake up the way we secure our data.
Leading scientists at Hitachi, which patented the technology in 2005, have been developing new ways to incorporate VeinID into the everyday payments and personal data landscape.
How does it work?
The Hitachi sensor works by transmitting near-infrared light through the finger. This is partially absorbed by haemoglobin in your veins, which enables the device to capture your unique finger vein pattern profile. This is then matched with your account’s pre-registered profile to confirm your identity.
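Hitachi's actual matching algorithm is proprietary, but the enroll-and-verify flow described above can be sketched in a few lines. Everything below, from the feature vectors to the cosine-similarity measure and the threshold, is an illustrative assumption, not Hitachi's implementation:

```python
import math

# Sketch of the enroll-and-verify flow described above. In a real system
# the feature vector would be extracted from the near-infrared image of
# the finger; here we treat it as a plain list of numbers (hypothetical).

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

MATCH_THRESHOLD = 0.95  # illustrative value, not Hitachi's

def verify(live_scan, enrolled_template):
    """Compare a live finger vein scan against the account's
    pre-registered template and accept only close matches."""
    return cosine_similarity(live_scan, enrolled_template) >= MATCH_THRESHOLD

enrolled = [0.9, 0.1, 0.4, 0.8]                    # stored at registration
print(verify([0.88, 0.12, 0.41, 0.79], enrolled))  # True: same finger
print(verify([0.2, 0.9, 0.7, 0.1], enrolled))      # False: different pattern
```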
But what makes VeinID more safe and secure than other types of passwords and security options?
Your veins are unique
No two people, not even identical twins, have the same finger vein pattern. And while most people have unique fingerprints, you leave fingerprints on objects you touch, making it possible for criminals to lift and replicate them for their own use. Because your veins are inside your finger, there is no way for anyone else to know what the pattern looks like, let alone copy it.
Fingers can’t be stolen
Relying heavily on fingerprints has caused public concern in the past. When Apple launched Touch ID a few years ago, people worried about criminals cutting off people's fingers to gain access to their phones and personal data.
While those claims proved outlandish, finger vein recognition users can rest easy knowing that VeinID sensors only work with living tissue. If your finger is cut (or severed from your hand!), the veins collapse, meaning your unique pattern is lost. Obviously, this doesn't prevent a determined criminal from cutting off your finger, but at least, if they do, they won't be able to access your personal information.
Guest post by Abhinav Shashank, CEO and co-founder, Innovaccer.
For a long time, health insurance companies tended to overlook people who were likely to be high-cost. As the landscape changed with new regulations, insurers began offering policies in the individual market without inquiring about applicants' health status or excluding pre-existing conditions. Even so, many loopholes remain, and every administration has aimed, and continues to aim, at minimizing these gaps. The one good answer thus far: risk adjustment.
What is risk adjustment?
Over the years, risk adjustment has become a key mechanism used in healthcare to predict costs and ensure appropriate payments for Medicare Advantage plans, Part D plans and other health plans. Historically it was used only in Medicare and Medicaid, but lately it has become an actuarial tool to ensure that health insurance plans have adequate funding and face no financial hindrance in providing care to high-risk, high-need patients. Insurance companies and their plans are compared on the basis of the quality and services they offer, providing a strong foundation for value-based purchasing.
Why is risk adjustment so important?
Risk adjustment advocates fair payments to health insurance plans by judging them on their efficiency and encouraging the provision of high-quality care. Beyond that, here’s why risk adjustment is important:
Provides a neutral field where providers and payers can be compared to their peers on the basis of their quality and efficiency.
Combining risk scores and evidence-based models with risk adjustment helps providers and care teams design post-discharge plans with intense follow-ups.
With predictive analytics, risk adjustment models may be used to capture all the dimensions of relevant patient risk.
How is risk adjustment used in healthcare?
Healthcare risk adjustment methodologies can be used to account for changes in severity and case mix among patients over time. Risk adjustment has been critical in reducing “cherry picking” among health plans. The dimensions of risk in care broadly fall into three categories:
Health status
Patient health-related behavior
Social determinants
It's important to ensure that, in creating incentives to enroll high-cost individuals, plans have the resources to treat them efficiently and effectively without being overcompensated for the relatively healthy population. The methodology used to risk-adjust premiums varies depending on the following:
Patient population and their breakdown
Source of payment
Healthcare market regulation
On the macro level, unless a state combines its individual and small group markets, separate risk adjustment systems operate in each market. The Department of Health and Human Services (HHS) developed a risk adjustment methodology in which an individual risk score is assigned to each enrollee. Diagnoses are grouped into Hierarchical Condition Categories (HCCs), each assigned a numerical value, and these values are averaged to calculate the plan's average risk score. Payments and charges are then calculated by comparing each plan's average risk score to a baseline premium.
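As a heavily simplified sketch of that calculation, the snippet below averages per-enrollee scores into a plan risk score and compares it to a baseline premium. All HCC weights and dollar figures are made up, and the real HHS-HCC model also factors in demographics, metal levels and other adjustments:

```python
# Simplified illustration of the HHS-style calculation described above.
# HCC weights, enrollee lists, and premiums are made-up numbers.

HCC_WEIGHTS = {"HCC19_diabetes": 0.30, "HCC130_CHF": 0.35, "none": 0.0}
BASE_DEMOGRAPHIC_SCORE = 1.0

def enrollee_risk_score(hccs):
    """An enrollee's risk score: a base score plus the weights of the
    hierarchical condition categories their diagnoses map to."""
    return BASE_DEMOGRAPHIC_SCORE + sum(HCC_WEIGHTS[h] for h in hccs)

def plan_transfer(enrollees, baseline_premium):
    """Average enrollee scores into a plan risk score, then compute a
    payment (positive) or charge (negative) against the baseline premium."""
    scores = [enrollee_risk_score(e) for e in enrollees]
    plan_score = sum(scores) / len(scores)
    market_average_score = 1.0  # normalized so 1.0 = average risk
    return (plan_score - market_average_score) * baseline_premium

plan = [["HCC19_diabetes"], ["HCC130_CHF", "HCC19_diabetes"], ["none"]]
print(plan_transfer(plan, baseline_premium=4000.0))
# ~(1.317 - 1.0) * 4000: a payment of about $1,267 to this sicker-than-average plan
```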
Thanks to technological innovation, more and more healthcare facilities are now adopting the use of electronic health records (EHRs). Patients now have more opportunities to consult with their physicians about their medical records. Increased access to EHRs also means that providers will now be able to easily share patient information with other providers. The goal of increasing access to medical records is to improve the continuity of care, as well as enhance patient safety.
As more patients are able to access their records, they can improve the accuracy of the information those records contain: by asking questions about their medical information, by identifying inaccuracies and by supplying additional information that improves the correctness of the data. Incorporating feedback from patients themselves means that patients play a crucial role in improving the quality of information in their medical records.
The rewards of keeping up with your medical records are quite obvious.
First, it is the best way to ensure that your physician understands what you communicate to them, and a good way for doctors to confirm their own understanding. Even though the benefits are clear, many people are reluctant to request their medical records. Worse still, countless individuals do not know that they can. Every individual is entitled to complete access to their chart from any medical facility that has ever dispensed care to them.
Not only should you share information with your doctors; the information you give also makes a difference in how you respond to the treatment prescribed. Accurate information improves your chances of successfully complying with the prescribed therapies, which will consequently allow you to recover and heal in the shortest time possible.
What is contained in your medical records?
There is a difference between your official medical records and the scribbled notes typically handed to you after a consultation. Most scribbled notes contain a generic outline of your symptoms and a short prescription, often written in a code that many individuals cannot understand. These are not your medical records.
Your official medical records contain all the juicy details of your medical journey: your lab results, physician's notes, past and present allergies and reactions to medicines, blood pressure stats and basically anything that concretely makes up your entire health profile.
T-Mobile recently became the first cell phone carrier to offer free in-flight Wi-Fi (in support of Wi-Fi texting, as cellular signals are still not allowed) to all its customers. Admittedly, this was technically on the strength of partnering with a third-party platform, Gogo, but the carrier gets the glory of being first among its big four peers to take even this step.
In-flight Wi-Fi, Wi-Fi calling, and similar services aren’t necessarily new technology, but having support for limited internet browsing and texting, all delivered through one of the top carriers in the nation, makes for a reasonably good elevator pitch—especially if you happen to be a T-Mobile customer. But the importance of the development isn’t just the novelty of the technology or the value of the service on offer; it is planting a shining pink flag in the market and staking that claim of being “first.”
Early Adoption, Arrested Development
Being first hasn’t lost its luster yet, even in a time when consumer expectations are sometimes a generation or two ahead of current technology. Hospitals and their leadership recognize this, and so, despite uncertainty on everything from insurance market regulations to the future of EHR integration, many are taking strides to do as T-Mobile has done — and find a way to get there first on a variety of issues important to consumers. And like T-Mobile, being first doesn’t have to mean getting into the weeds of proprietary innovation and product development—although plenty of larger chains and clinics do take that route; for many hospitals, being first can be accomplished through strategic partnerships with tech-centric companies.
If there is one lesson out of Silicon Valley that has entered the American zeitgeist, it is that being the first out with something can give a company, product or even team of creatives a lot of leeway in terms of going on to iterate, improve, and generally tinker. But on the healthcare front, we see how the drive to be first—or even keep pace with the rest of the industry—can create a “hurry up and wait” situation where meaningful progress sometimes lags fanfare or technology.
That is why the top tech trends in healthcare don’t change much year to year; end users, hospital administrators, and tech developers are all still trying to figure out what works, what works best, and how to integrate new tools into the clinical workflow, the patient experience, and the regulatory environment governing it all.
That is the story of EHRs in a nutshell: a good idea, a rush to adoption (both willing and coerced), and then a lengthy period of reiteration as all stakeholders struggle to recreate or holistically reconsider the context in which this new system can, and should, operate. But the rush to adopt first and configure later isn't limited to high technology in the healthcare sector; it pretty well describes the legal environment surrounding health insurance.
Industry Leadership: Being First or Being Best?
From how it affects patients to what it is still trying to influence in the provider space, the conversation about care and coverage is still shepherded primarily by fear, secondarily by outrage, and in most other respects by confusion. So it looks like we’ll be shopping the exchanges for a while longer, even under President Trump’s watch.
After gazing into the abyss that was Trumpcare, the still-evolving status quo that is Obamacare is more popular than ever. Here again, the power of being first seems to provide some residual sticking power to a law frequently and publicly dragged through the mud by people and organizations with at least as much visibility and influence as a company like T-Mobile.
At the beginning of their existence, electronic health records (EHRs) were used primarily as document management systems. Now they have realigned their objectives, and their value to the physicians and practices they serve, to focus on data intelligence. If specialty practices want to stay independent, they need to continue to evolve, prioritize value-based care and stay profitable. Moreover, they need the right partners to help enhance operational efficiency, increase patient engagement and achieve better clinical outcomes. As such, the scope of the EHR's responsibility for the practice's health, growth and sustainability has increased exponentially.
How will specialty practices ensure their future? By leveraging the power of the clinical and operational data in their EHR and supplemental business applications, working together within the healthcare IT (HCIT) ecosystem. Businesses across all industries analyze data to measure performance, and metrics are the foundation of any successful business; physician groups are no exception. Metrics should be the driving force behind every major decision intended to boost productivity. Physicians are not data scientists, but by utilizing next-generation HCIT systems, they can employ technology that streamlines the decision-making process.
Challenges turn into opportunities
According to the Centers for Medicare and Medicaid Services (CMS), 171,000 physicians who did not collect and use data to comply with government regulations are looking at a three percent Meaningful Use penalty in 2017. With value-based care requirements now playing a critical role in care and outcomes, upgrading data platforms and capabilities should be the number one priority for complying with new industry standards. Data-driven HCIT solution providers can prepare specialty practices for these coming changes, helping collect and analyze data to ensure effective treatment plans at lower costs.
Bottom line: This helps improve patient health and satisfaction.
Today's HCIT systems are considered business tools that help physicians analyze data and reveal insights for enhanced decision making. Popular “big-box” HCIT systems try to be all things to all providers, yet they are tailored to hospitals and primary care physicians—many of whom typically see far fewer patients in a day than specialists do. This puts a major burden on specialists, who rely on different clinical and operational data to help maximize outcomes.
Specialists potentially see up to 60 patients a day, covering surgeries, follow-ups and everything in between. Generic HCIT systems fall short at that appointment volume. Combined with the fact that those systems make data entry inefficient, impede clinical workflows and lack business metrics, this is the major argument for specialty-focused HCIT solutions. Some groups acquired by hospitals or health systems have not adopted the integrated systems of their new parent companies. Instead, they stay with their specialty HCIT systems—interoperable with their parent companies' technology—because those systems serve existing, proven workflows.
Data insights and a workflow makeover
Specialty HCIT systems that analyze a variety of data and provide practices with the knowledge to improve their performance will deliver the best outcomes for patients and practices. Analyzing operational data provides an understanding of how to deliver the best patient care at the lowest cost, thereby delivering optimal outcomes and increasing patient satisfaction levels.
Specialists should take the opportunity to re-evaluate their EHR and determine if their goals are helped or hindered by their current HCIT ecosystem. A productivity-boosting HCIT system can harness the power of data to deliver clinical and business applications, workflows, and insight through one user interface and make compliance with reporting requirements simple and straightforward.
Guest post by Chris Lukasiak, senior vice president, MyHealthDirect.
In the U.S., more than a third of patients are referred to a specialist each year, and specialist visits constitute more than half of outpatient visits. Referrals are the link that makes the connection between primary and specialty care. From 1999 through 2009 alone, the absolute number of visits resulting in a physician referral increased 159 percent nationally, from 41 million to 105 million. The volume and frequency of specialty referrals have steadily increased over the years and will only continue to grow. Yet despite this rise, the referral process itself has been a great frustration for years.
Specialty referrals are a complicated business. There are many moving parts and players that all have a crucial role to play within the process. By breaking it down and looking at exactly what a referral is, who is involved, and the challenges they face, we can then look to fix what is broken. What needs to be improved? And could there be a digital solution?
Let’s start from the very beginning by looking at the stakeholders and their unique interests and concerns.
Patient – The patient experiences a health concern and needs care to resolve it. The primary physician doesn't provide the full solution and refers them to a specialist with more expertise in the patient's condition. This is where the referral occurs. Currently, the extent of the referral is often the physician handing the patient a phone number to call to schedule the appointment. It's up to the patient to contact the specialist and follow through with the next step, which explains why 20 percent of patients never even schedule the referral appointment.
Provider – There is more than one provider involved in the referral process: first the referring (or sending) provider, then the target (or receiving) provider. The referring physician is the provider recommending (referring) the patient to a specialist; the target provider is the specialist who has been recommended. For a health system or physician group, there are obvious financial and quality-of-care benefits when a patient is sent to a trusted provider within the network. When patients don't go to their referral appointment, the health system or physician group loses in several ways. First, it loses control over providing comprehensive care to the patient. If a patient is readmitted to a hospital because of a failure to follow through on a referral appointment, the health system is penalized for the readmission; the penalty could result in CMS withholding up to 3 percent of the funding provided to the health system. The system also suffers in terms of the perception of its quality of care. And if a patient is not secured with a provider within the network, they may go to a competing system.
Plan – Health plans have several important considerations when a referral happens with a vested interest on three fronts to ensure the patient goes to the target provider:
1) The health plan benefits if the patient goes to a target provider within their network. Not only will patients be directed to providers that best meet their needs, but the plan also benefits when patients are referred to the providers in their Smart Network. These providers are trusted for superior care for the patient and reduced costs for the plan.
2) When a plan member doesn’t get the care they need to maintain good health, their likelihood of having major adverse events rises dramatically. This means they will end up in the ER or needing other expensive care, which represents big costs for the health plan.
3) The current approach to referrals often results in long lead times, which makes for a poor patient experience and can increase costs.