One of the few national issues the government can agree on is bringing down astronomical drug prices in the United States. The Senate Finance Committee and House Oversight Committee began to take on drug prices by holding two hearings this year with leading pharmaceutical companies, with a third planned for early April to examine pharmacy benefit managers’ role in the issue after they were called out as major contributors in previous hearings.
During the February hearing, the biggest players in the industry testified before the committee, and it became clear that big pharma and the government are having trouble agreeing on a solution.
There has been much discussion around the critical issue of drug pricing in America, but a few innovators are now shifting the conversation to how precision medicine offers hope for a solution. Therapeutics carry the risk of not working effectively for some patients and of causing adverse side effects. Precision medicine, however, stratifies patients based on their disease biology and matches each patient group to the drugs that target their specific disease.
The drug prescribed will be more likely to work effectively, and the cost of treating the patient group will decrease. While the House Committee on Oversight and Reform’s investigation may eventually lead to legislation to mitigate price hikes, precision medicine technology already exists that can be a key driver in deflating drug spending for patients and payers and, thus, reducing drug prices. Two ways this type of technology can help to lower drug costs are cutting spending on ineffective treatments and motivating the industry to develop and market personalized treatments.
The healthcare industry currently wastes roughly $2.5 billion annually on ineffective treatments. Take, for example, the 1.3 million Americans living with rheumatoid arthritis (RA): because autoimmune diseases like RA vary significantly from patient to patient, blockbuster therapies, which are “targeted” therapies and therefore treat only one pathway of the disease, are not effective for 66 percent of the patients taking them.
This means that patients who see little or no benefit from these treatments are still paying up to $38,000 annually, a significant financial and emotional burden on patients and their loved ones. As medicine advances, it is becoming more apparent that giving these drugs to everyone diagnosed with the respective diseases no longer works; they need to be prescribed to sub-groups of patients based on their specific disease biology to be truly effective.
Through RNA data analysis, precision medicine can both help patients with autoimmune diseases like RA find targeted treatments that they are more likely to respond to before treatment is prescribed and direct pharmaceutical companies to develop targeted drugs specifically for those who do not respond to currently available therapies.
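To make the stratification idea concrete, here is a minimal Python sketch of how patients might be grouped by RNA expression profile. The random expression matrix, the choice of k-means, and the number of subtypes are illustrative assumptions, not the proprietary methods of any company mentioned here.

```python
# Minimal sketch: stratifying patients by RNA expression profile.
# The data and cluster count are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Rows are patients, columns are (hypothetical) gene expression levels.
expression = rng.normal(size=(100, 20))

# Group patients into subtypes based on their expression profiles.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
subtype = kmeans.fit_predict(expression)

# Each subtype could then be matched to the targeted therapy most likely
# to work for that disease biology.
for label in np.unique(subtype):
    print(f"Subtype {label}: {np.sum(subtype == label)} patients")
```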
Thanks to the advent of new technologies, our daily lives are changing in irreversible ways. Everyday routines are easier to manage because smart solutions speed up the pace of life and make it more convenient.
Healthcare is one field where technology is expected to revolutionize the treatment and research methods we are accustomed to. Here are some of the technological trends that will make healthcare more advanced in 2019.
The Internet of Medical Things
The notion of the Internet of Medical Things (IoMT) is tightly connected with wearables. This technology is designed to transmit patient data through various sensors and gadgets attached to the patient’s clothes or directly to the body. Fitness trackers and smart sensors are designed to measure blood pressure, glucose level, pulse, heart rate and more. Along with that, they can count the calories you’ve burned and the miles you’ve walked.
That may not sound especially innovative in 2019. What makes IoMT one of the most progressive trends, however, is that the gathered data can be used in many new ways. Preventive medicine, for example, can benefit greatly from IoMT: research becomes more accurate and timely, and it is even possible to prevent epidemics using the statistics gathered this way.
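As a rough illustration of how a single reading might travel from a wearable to a monitoring service, here is a minimal Python sketch. The endpoint URL, the payload fields, and the patient identifier are all placeholders, not a real device API.

```python
# Minimal sketch of an IoMT-style upload: a wearable reading sent to a
# hypothetical monitoring service. Endpoint and fields are placeholders.
from datetime import datetime, timezone

import requests  # third-party: pip install requests

reading = {
    "patient_id": "demo-patient-001",  # hypothetical identifier
    "recorded_at": datetime.now(timezone.utc).isoformat(),
    "heart_rate_bpm": 72,
    "systolic_mmHg": 118,
    "diastolic_mmHg": 76,
    "steps": 8432,
}

# A real deployment would use TLS plus proper device authentication.
response = requests.post(
    "https://example.com/iomt/readings",  # placeholder endpoint
    json=reading,
    timeout=10,
)
print(response.status_code)
```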
Telehealth
Well … yes, consulting a doctor through your telly looks like a scene from one of those futuristic novels imagining what the world would look like in the 21st century. In fact, it is what we have now. Modern technologies let us forget about hours spent in a clinic waiting for the doctor to call us in, and about waiting days for test results.
Now you can consult any doctor in the world with a computer and an Internet connection. Imagine that you needed to see a reputable specialist in another country. It would be highly inconvenient to travel all the way there just for a consultation: you would have to spend a lot of time, you might need assistance getting around, and it would be costly.
Today, you can contact your doctor from anywhere in the world and get a consultation. It will not work in emergencies, but it works well for urgent yet minor issues. It is also a good option for those who live in remote rural areas, or who need a highly specialized doctor, to receive timely medical assistance.
The technology can also allow people to consult doctors more frequently, which will improve the overall health of the population and build better relationships between patients and their doctors.
By Scott E. Rupp, publisher, Electronic Health Reporter.
The healthcare technology world is ablaze, on FHIR. New proposed standards for interoperability are being established to allow health systems to share information and facilitate patient access to data, in large part through a structure known as FHIR.
This “FHIR” the market is speaking of is Fast Healthcare Interoperability Resources, an interoperability standard for electronic exchange of healthcare information. FHIR was developed by Health Level Seven International (HL7), a not-for-profit that develops and provides frameworks and standards for the sharing, integration and retrieval of clinical health data and other electronic health information.
FHIR emerged in 2014 as a draft standard for trial use to enable health IT developers to more quickly and easily build applications for EHRs and to exchange and retrieve data faster from applications. FHIR soon received support from EHR vendors like Epic, Cerner and AthenaHealth. Shortly thereafter, the Argonaut Project emerged to move FHIR forward, and in February 2017, FHIR became a full data exchange standard.
FHIR is interoperability
FHIR is built on the concept of interoperability and modular components that can be assembled into working systems to try to resolve clinical, administrative and infrastructural problems in healthcare.
FHIR provides software development resources and tools for administrative concepts, such as patients, providers, organizations and devices, as well as a variety of clinical concepts including problems, medications, diagnostics, care plans and financial issues, among others. FHIR is designed specifically for the web and provides resources and foundations based on XML, JSON, HTTP, Atom and OAuth structures.
FHIR can be used in mobile phone applications, cloud communications, EHR-based data sharing and among institutional healthcare providers.
According to HL7, FHIR aims to simplify implementation without sacrificing information integrity. FHIR “leverages existing logical and theoretical models to provide a consistent, easy to implement and rigorous mechanism for exchanging data between healthcare applications. FHIR has built-in mechanisms for traceability to the HL7 RIM and other important content models. This ensures alignment to HL7’s previously defined patterns and best practices without requiring the implementer to have intimate knowledge of the RIM or any HL7 v3 derivations.”
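To show what the web-based exchange described above looks like in practice, here is a minimal Python sketch of a RESTful FHIR read returning JSON. The server base URL and bearer token are placeholders; a production client would obtain credentials through OAuth 2.0 (for example, via SMART on FHIR).

```python
# Minimal sketch of a FHIR RESTful read over HTTP/JSON.
# The base URL and token are placeholders, not a real server.
import requests  # third-party: pip install requests

FHIR_BASE = "https://fhir.example.org/baseR4"  # placeholder FHIR server
headers = {
    "Accept": "application/fhir+json",
    "Authorization": "Bearer <access-token>",  # placeholder credential
}

# GET [base]/Patient/[id] is the standard FHIR "read" interaction.
resp = requests.get(f"{FHIR_BASE}/Patient/example", headers=headers, timeout=10)
patient = resp.json()
print(patient.get("resourceType"), patient.get("id"))
```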
Health sector buy-in
The healthcare sector has clearly bought into FHIR, primarily because of interoperability challenges.
“Sharing data between different health systems has required significant investment of IT resources on one-off projects,” said Nilesh Chandra, healthcare expert at PA Consulting. “As the needs for data sharing have increased, hospital IT departments have been swamped with demand for all of this custom integration.
“FHIR and similar standards are an attempt at standardizing data integration, to make it easier to connect EHR systems and easily extract or upload data into them, based on reusable IT components,” added Chandra. “That said, FHIR is an important step in the right direction, but is not the panacea for all health IT integration issues.”
FHIR uses a set of commonly used medical concepts termed “resources.” These resources are used across many different types of companies and organizations, but they mean the same thing everywhere. Examples would be a blood pressure reading or an MRI scan. Those resources are held in EHRs, smartphones, health information exchange databases and so on. FHIR also allows those elements to be mined, since they are tagged in a consistent way under the FHIR standard.
“The complex part is done by individual systems that don’t have the same operating system,” said Jason Reed, PharmD blog founder. “Because they can pull that tag then they pull it and exchange it with other entities. They only show the tag and not the other code or structures they had to use to get to that tag.”
While consolidated clinical document architecture allows a group of healthcare items to be sent together, this is essentially like sending an electronic PDF, Reed said. Other systems that have different operating systems can’t break that down unless they use the same operating system.
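As a concrete example of the tagged resources Reed describes, here is a sketch of a blood pressure reading expressed as a FHIR Observation, written as a plain Python dictionary. The LOINC codes follow the common vital-signs pattern; treat the exact profile details as an approximation rather than a complete implementation.

```python
# Minimal sketch of a FHIR "resource": a blood pressure reading as an
# Observation. Codes follow the usual vital-signs pattern (assumption).
blood_pressure = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "85354-9",
                         "display": "Blood pressure panel"}]},
    "subject": {"reference": "Patient/example"},
    "component": [
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
         "valueQuantity": {"value": 118, "unit": "mmHg"}},   # systolic
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8462-4"}]},
         "valueQuantity": {"value": 76, "unit": "mmHg"}},    # diastolic
    ],
}

# Because every system tags the reading with the same codes, another system
# can find the values without knowing anything about the sender's internals.
for part in blood_pressure["component"]:
    code = part["code"]["coding"][0]["code"]
    print(code, part["valueQuantity"]["value"], part["valueQuantity"]["unit"])
```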
FHIR’s culmination
All of this stems from the recognition that digital health data can improve outcomes and lower costs, even though the reality so far has been less than ideal. For example, during the economic stimulus in 2009, systems were designed before modern web standards for storing and exchanging data were ubiquitous. The industry was caught in the middle of a technical revolution and spent its cash before the best new practices were available, said Nick Hatt, senior developer at Redox.
By Danny Allan, vice president of product strategy, Veeam.
The healthcare industry is in the midst of a transformation with the widespread use of electronic health records (EHR), the increasing reliance on connected devices, and the move to cloud infrastructures. Accessing health records for accurate communications, diagnosis and treatment has never been easier, but protecting them may require a different treatment plan.
Traditionally, healthcare IT teams have faced a long list of time-sensitive spending priorities, leaving administrative investments like data protection and disaster recovery languishing at the bottom of the pile. However, with HIPAA and HITECH, the growing volume and influx of data, and critical patient care systems relying more on IT, a disaster recovery plan is moving closer to the top of the priority list. But one question still arises: How prepared are healthcare providers to keep this information protected and continuously available to ensure the delivery of the next generation of patient care?
Disaster recovery is the beating heart of uptime
The healthcare industry faces constant pressure today to have patient information available 24/7/365 for providers and individuals. Moreover, consumers rely on accessing their private data to track appointments and treatment plans, which means it is vital that the data be kept safe and well managed. While infrastructure costs can be reduced with the use of cloud, IT still must ensure that service level agreements are met. And so, having a backup and disaster recovery plan becomes crucial, whether planning for potential power outages, accidental file deletion or natural disasters.
These contingency plans are designed not only to maintain uptime and the availability of protected health information (PHI) and to respond during emergencies, but also to comply with HIPAA regulations that require healthcare organizations to have a plan for handling natural disasters, crises and data security.
One hospital’s disaster recovery plan revitalized
Rochelle Community Hospital is a not-for-profit hospital serving more than 20,000 patients each year in northern Illinois. It is the only hospital within a 30-mile radius, making it an essential part of the community, especially if a natural disaster were to strike.
Like most hospitals and healthcare organizations, Rochelle Community Hospital used to follow a traditional DR plan for backing up its data using the 3-2-1 rule: three copies of data, two different media, one copy offsite. However, after a very close call with a powerful tornado in 2015, which missed the hospital by only two miles, it became clear to the IT team that they couldn’t properly protect data and maintain operations with their existing disaster recovery plan.
At that time, Rochelle Community Hospital stored its offsite copy in a data center near the hospital – a clearly recognized mistake, because if the tornado had hit the hospital, it could have taken out that center as well, leaving the organization stranded without data. Patient care would have suffered, as doctors would not have had real-time access to EHRs.
Realizing this, the Rochelle Community Hospital IT team changed its disaster recovery goal to focus on maintaining the availability of its data, specifically patient EHRs, during a natural disaster or emergency. Its biggest challenge was finding a reliable solution that would stay inside its budget.
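For readers unfamiliar with the 3-2-1 rule mentioned above, the following minimal Python sketch shows one way to check a backup inventory against it. The inventory is invented for illustration and is not Rochelle Community Hospital’s actual configuration.

```python
# Minimal sketch: checking a backup inventory against the 3-2-1 rule
# (three copies, two media types, one copy offsite). Illustrative only.
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # e.g. "primary SAN", "tape library", "cloud region"
    media: str      # e.g. "disk", "tape", "object storage"
    offsite: bool

def satisfies_3_2_1(copies):
    enough_copies = len(copies) >= 3                      # three copies of the data
    enough_media = len({c.media for c in copies}) >= 2    # on two different media
    has_offsite = any(c.offsite for c in copies)          # with one copy offsite
    return enough_copies and enough_media and has_offsite

inventory = [
    BackupCopy("primary SAN", "disk", offsite=False),
    BackupCopy("local tape library", "tape", offsite=False),
    BackupCopy("data center two miles away", "disk", offsite=True),
]

# Passes on paper, but as the tornado story shows, an "offsite" copy only a
# couple of miles away may not survive the same disaster.
print(satisfies_3_2_1(inventory))
```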
By Ben Flock, chief healthcare strategist, TEKsystems.
Healthcare professionals know that blockchain is coming, but there is still some apprehension associated with the technology. The cryptocurrency industry pioneered this technology, and the results have been highly impressive. When it comes to the healthcare industry, however, there is a lack of proven use cases, which has delayed blockchain’s widespread adoption.
To pull back the curtain on the reluctance to adopt blockchain technology, TEKsystems partnered with HIMSS Analytics to host a focus group of business and technology leaders from the payer, provider, pharma and public sector. The goal: to better understand customer needs and business challenges when it comes to actually implementing blockchain.
Findings revealed that, as most in the industry already know or suspect, there is a limited overall understanding of blockchain technology. However, it seems that this limited knowledge is the foundation for most of the apprehension toward widely adopting the technology. Additional roadblocks that contribute to this apprehension include the lack of impactful use cases, fears of what the unspecified governance of data could mean for compliance, security concerns and industry politics, among others.
There is good news—those who have a basic understanding of blockchain exhibit less apprehension and a more cautious exuberance toward adoption of the technology. As understanding of blockchain grows and more practical examples of its benefits are found, the healthcare industry will become more open to implementing blockchain solutions.
During the focus group, participants discussed proven use cases for blockchain that could be used as industry examples to help increase the general understanding of blockchain technology. The group identified three main use cases that could be implemented in the near term after a short testing period: provider directory updates, expediting the provider credentialing process and prior authorization.
A provider directory was the first use case identified by the focus group. Insurance companies must provide patients with timely, accurate provider contact information and new-patient availability. While the Centers for Medicare & Medicaid Services (CMS) regulates provider directory services, many directories are inefficient, costly and often laden with manual processes. With blockchain, the provider ledger could be maintained through a proactive, structured, perpetual process enabling open and direct access to provider information on an as-needed basis. Because provider directory information is already public record, it’s a high-result, low-risk proof-of-concept project.
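To illustrate the ledger concept behind the provider directory use case, here is a toy Python sketch of an append-only, hash-chained log of directory updates. It is a conceptual illustration only, not a production blockchain or any vendor’s design; the NPI and fields are made up.

```python
# Toy illustration of a hash-chained ledger of provider directory updates.
import hashlib
import json
import time

class ProviderDirectoryLedger:
    """Append-only log where each update is linked to the previous one."""

    def __init__(self):
        self.blocks = []

    def add_update(self, provider_id, change):
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {
            "provider_id": provider_id,
            "change": change,              # e.g. a new phone number or address
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        self.blocks.append(block)
        return block

    def verify(self):
        # Any tampering with an earlier update breaks the hash chain.
        prev_hash = "0" * 64
        for block in self.blocks:
            body = {k: v for k, v in block.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if block["prev_hash"] != prev_hash or block["hash"] != recomputed:
                return False
            prev_hash = block["hash"]
        return True

ledger = ProviderDirectoryLedger()
ledger.add_update("npi-1234567890", {"phone": "555-0100", "accepting_new_patients": True})
ledger.add_update("npi-1234567890", {"address": "123 Main St, Rochelle, IL"})
print(ledger.verify())  # True until any block is altered
```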
By Scott E. Rupp, publisher, Electronic Health Reporter.
On March 21, HIMSS representatives Tom Leary, vice president of government affairs, and Jeff Coughlin, senior director of federal and state affairs, hosted a roundtable with members of the media to peel back a few layers of the onion on the newly proposed ONC and CMS rules and explain some of the potential ramifications of the regulations should they be approved.
The proposed CMS regulation attempts to advance interoperability from the patient perspective by putting patients at the center of their healthcare and ensuring that they can access their health information electronically without special effort.
ONC’s proposed regulation calls on the healthcare community to adopt standardized application programming interfaces (APIs) and presents seven reasonable and necessary conditions that do not constitute information blocking.
According to HIMSS’s assessment, both proposals leave room for interpretation, but the organization had not yet formed a complete response to either as of this writing.
However, Leary said: “It’s important to emphasize that all sectors of the healthcare ecosystem are included here. The CMS rule focuses on the payer world. The ONC rule touches on vendors and providers. All sectors really are touched on by these rules.”
With both rules, ONC and CMS are trying to use every lever available to them to push interoperability forward and are placing patients at the center, Coughlin said. The healthcare sector got a taste of how CMS plans to empower patients through its recent MyHealthEData initiative, but the current proposal puts more specifics around the agency’s intentions. Likewise, the ONC rule attempts to define the value of the taxpayers’ investment in the EHR incentives paid out under the recent meaningful use program.
Key points of the rules
Some key points to consider from the rules: APIs have a role to play in the future development of the sector and are seen as a real leveler of the playing field, giving patients more control of their information, Coughlin said.
HHS is also focusing on transparency, particularly pricing transparency. For example, there is movement toward possibly collecting chargemaster data from hospitals and, perhaps, publishing the negotiated rates between hospitals and payers.
What happens now that the rules are out? According to HIMSS, educating members is the first step toward understanding the rules and responding to the federal agencies. “What we’ve done is focus on educating HIMSS members in briefings,” Coughlin said. “Trying to get early feedback and early impressions from members, convening weekly conference calls to address parts of the rule. Once we have critical mass, then we work with executive leadership to make sure what we are hearing from membership is reflected across the membership.”
Looking into the future?
For health systems, the broad exchange of data likely remains a concern. Data exchange within the ONC rule impacts providers and health systems in a number of ways, especially in regard to the costs of compliance to meet all of the proposed requirements.
HIMSS representatives are not currently gazing into a crystal ball, or if they are (they are), they’re not yet ready to tip their hand regarding what the organization intends to pursue through its messaging on behalf of its members.
“We’re not in a place to see where we are going to land,” Coughlin said. “We are hearing from our members about the complexities of rules and what’s included. It’s hard to overestimate how complex this is. ONC and CMS in designing broader exchange of information is something that speaks very well of them, but (this is) complex in interpretation and implementation.”
With the information blocking exceptions, the default is broader sharing of information across the spectrum. More information has to be shared, and expectations need to be defined, they said. From HIMSS’ perspective, compliance is the primary issue for its members. The question that needs answering is what kind of burden is being placed on health systems and providers. Leary is confident HIMSS will spend a good bit of ink in its response citing potential concerns over information blocking and what it might mean.
“It will be helpful for the community to have examples and use cases for what’s included especially for exceptions for information blocking,” Coughlin said. “We need examples to clearly define the difference between health information exchange and health information network.”
By Abhinav Shashank, CEO and co-founder, Innovaccer.
What do Super Bowls, banking transactions, and online search results have in common?
As an ardent supporter, a concerned customer, and a curious observer, I witness all three of them in real time. I want the best experience every time I am the end user, and so does everyone else. In this day and age, that shouldn’t be an unrealistic expectation. We should be able to know the score in real time and, in the same way, see our credit card transactions as they happen.
Why doesn’t my healthcare data show the complete picture?
Ironically, for healthcare organizations, real-time updates are not always available when making decisions that can potentially impact patients throughout their lives. Traditionally, many solutions were not even built to optimize the time that providers spend with their patients. Rather, they were built only to ingest data in electronic formats, evaluate macro-level performance trends and, in the best-case scenario, provide top stakeholders with financial trends in a concise manner.
Though most organizations today have business intelligence (BI) infrastructures in place, most of the insights generated through them are only good for analyzing things in retrospect and do not really assist providers in the moment of care.
Activated data is the backbone of healthcare technology
It’s one thing to know what is wrong; it is another to have a way of addressing it. For instance, notes from the last appointment with a patient give care teams only half of the story. Unless care providers have a holistic pool of information about where the patient has been across the care continuum, they cannot initiate personalized care plans or deliver evidence-based care.
Healthcare leadership should look to activate data from different facilities in a bid to maximize the knowledge base of their providers. Once they have all the data points, they can begin to run customized analytics to support clinical decision-making.
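As a simple illustration of what activated data might look like once it is pulled together, here is a minimal Python sketch that merges events from several hypothetical facilities into one longitudinal record and runs a basic point-of-care check against it. The feed, fields, and A1c threshold are illustrative assumptions.

```python
# Minimal sketch: merging data points from multiple facilities into one
# longitudinal view per patient. Records and threshold are illustrative.
from collections import defaultdict

ehr_feed = [
    {"patient_id": "P001", "source": "Hospital A", "type": "a1c", "value": 8.1, "date": "2019-01-12"},
    {"patient_id": "P001", "source": "Clinic B", "type": "a1c", "value": 7.4, "date": "2019-03-02"},
    {"patient_id": "P001", "source": "Pharmacy C", "type": "fill", "value": "metformin", "date": "2019-03-05"},
]

# Build a single longitudinal record per patient, ordered by date.
longitudinal = defaultdict(list)
for event in ehr_feed:
    longitudinal[event["patient_id"]].append(event)
for events in longitudinal.values():
    events.sort(key=lambda e: e["date"])

# A basic point-of-care check a care team could run on the unified view:
# flag patients whose most recent A1c is above a (hypothetical) 7.0 target.
for patient_id, events in longitudinal.items():
    a1c_readings = [e for e in events if e["type"] == "a1c"]
    if a1c_readings and a1c_readings[-1]["value"] > 7.0:
        print(f"{patient_id}: latest A1c {a1c_readings[-1]['value']} is above target")
```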
By Bill Kotraba, vice president, healthcare solutions and strategy, Information Builders.
Data has long been a popular topic in healthcare, and it is even more so after this year’s HIMSS. The industry is buzzing about the joint CMS and ONC announcement, which proposes a framework to improve interoperability and support seamless, secure access to health information. The pressure is on for healthcare organizations to tackle their data as the two agencies strive to give patients the ability to leverage personal information in various applications. And this pressure will only increase as we look to the future, making it even more imperative that payers and providers address the issue now.
Beyond interoperability
Look more closely, and you will see that with their recent announcement CMS and the ONC are focusing on healthcare organizations’ ability to manage data across the enterprise. Historically, healthcare has worked from siloed applications and data sources with light integration using interface engines. Recently, healthcare organizations have pinned their hopes on leveraging data effectively through huge investments in new EHR platforms. The reality, pointed out by government officials at HIMSS in Orlando, is that this still results in significant challenges for healthcare organizations to manage information across the data value chain.
Although not part of their proposed framework, CMS and the ONC point out the need for better patient mastering across data sources. Organizations hoped their investment in a centralized EHR platform would solve this but that has proven to not be the case. In addition to patient data, healthcare organizations face challenges in mastering physician data, which can have wide impact, including on value-based care initiatives. The joint proposal also highlights that the ability to push back accurate, cleansed data to source systems is critical.
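To give a feel for the patient mastering problem the agencies highlight, here is a minimal Python sketch of matching records across two sources using simple string similarity. Real master data management tools use far richer probabilistic matching; the fields, names, and threshold here are assumptions for illustration.

```python
# Minimal sketch of patient mastering: deciding whether two records from
# different systems refer to the same person. Illustrative assumptions only.
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude string similarity in [0, 1]; real tools use richer matching."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_patient(rec_a, rec_b, threshold=0.85):
    # Require an exact date-of-birth match plus a near-identical name.
    name_score = similarity(rec_a["name"], rec_b["name"])
    return rec_a["dob"] == rec_b["dob"] and name_score >= threshold

ehr_record = {"name": "Jonathan Smith", "dob": "1975-04-02"}
claims_record = {"name": "Jonathon Smith", "dob": "1975-04-02"}  # typo in one system
other_record = {"name": "Joan Smythe", "dob": "1982-11-19"}

print(likely_same_patient(ehr_record, claims_record))  # True: same DOB, near-identical name
print(likely_same_patient(ehr_record, other_record))   # False: different person
```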
Healthcare needs a unified approach
Using FHIR to stop data blocking and push the industry towards a standards-based approach will help, but it’s not sufficient for the data challenges facing healthcare organizations. In addition to tackling the issues pointed out at HIMSS, healthcare organizations must: