To meet the challenges ahead, the healthcare industry is looking to the IT sector for the best tools and equipment. As demand for better treatment and diagnostic procedures continues to rise, healthcare organizations, especially hospitals, need to upgrade their infrastructure to deliver the best results.
Big data, demand for better therapeutic methods, and increasing management-side complexity are challenges that clinics and hospitals will have to address. Automation is nothing new in this respect, but it demands wider adoption among healthcare organizations that struggle with outdated equipment and lackluster patient information management.
It is therefore imperative for these organizations to look into hospital management systems and how they can help streamline both routine and complex operations.
Automation saves costs
Automation points the way to the future of healthcare technology. One thing is certain: there will be heavy reliance on automated systems in areas such as healthcare denial management and revenue accounting. With an effective software product, a hospital can cut operational costs significantly, channeling the savings toward the development of better facilities and the procurement of advanced equipment.
Automation lightens the workload
Hospital staff have a lot on their plates. More often than not, they must handle routine tasks such as validating patient data and organizing large volumes of information. Applying intelligent solutions to these everyday responsibilities lightens the workload on your staff so they can focus on more important functions.
Automation streamlines medical billing
Another high point of effective hospital management software is that it allows an organization to make accurate computations for its patients. This was a persistent challenge for hospitals back when accounting software was far less sophisticated than it is now. With recent innovations in modern tech, hospitals can reduce the amount of paperwork in accounting and bill their patients with far less risk of dispute.
The healthcare industry is in a period of great uncertainty, with major questions looming around how regulations, standards and reimbursements – particularly regarding care quality and interoperability – will be changing for hospitals in the coming year. One thing is clear though: In order to provide the efficient and high-quality care needed to meet patient expectations, hospitals need to focus on the intelligent application of new technologies. Here are four trends that will influence healthcare IT in 2018:
The opioid epidemic will trigger growth in investments around patient and staff safety
The growing opioid epidemic now causes nearly 100 deaths each day, and is projected to cause 500,000 deaths over the next decade, primarily due to overdoses. That is not only putting pressure on hospitals to reevaluate how they use opioid medications and monitor patients once back in the community, but it is also forcing them to address the physical safety of staff and patients. This is because the opioid epidemic has led to an increase in violent crimes in healthcare facilities. Emergency departments in particular are under heavy strain, with more patients presenting with addiction symptoms, compounding wait times and leading to more patient disputes. Hospitals will have to invest significantly more in technologies to protect staff and patients, such as patient monitoring solutions and staff duress systems to prevent potentially dangerous patients from harming themselves or others.
Big data advancements will pave the way for the rise of predictive and prescriptive analytics
Regardless of how the major causes of uncertainty affecting the healthcare industry – such as the future of the Affordable Care Act – resolve themselves, it is certain that there will be no return to the pre-ACA era. As healthcare industry writer and consultant Edgar Wilson has pointed out in the context of primary care, the expansion of insurance coverage did not magically create more capacity. It challenged hospitals to find new ways to serve more patients, more personally, without adding cost. Hospitals will continue to look for practical ways to improve their efficiency by leveraging data to better predict patient care requirements and demand for medications and equipment. The benefits of these predictive analytics capabilities are enormous.
According to a February 2017 report by the Society of Actuaries, 93 percent of healthcare providers said predictive analytics is important to the future of their business, and 57 percent believe predictive analytics will save their organization 15 percent or more over the next five years. In addition to predictive analytics, prescriptive analytics will have a growing impact. Ongoing advancements in the collection, aggregation and analysis of data will provide hospitals with greater operational insights, enabling them to optimize staffing levels and other aspects of operations while enabling staff members to deliver more effective, targeted care.
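As a concrete, if simplified, illustration of what such predictive capability looks like, the sketch below scores 30-day readmission risk with a logistic model. The feature names and coefficients are invented for this example; a real system would fit them to historical encounter data rather than hard-code them.

```python
import math

def readmission_risk(patient: dict) -> float:
    """Return a probability-like 30-day readmission risk score in (0, 1)."""
    # Invented coefficients for illustration; a real model would learn
    # these from historical encounter and claims data.
    intercept = -3.0
    z = intercept
    z += 0.8 * (1 if patient.get("age", 0) > 65 else 0)
    z += 0.5 * patient.get("prior_admissions_12mo", 0)   # per prior admission
    z += 0.4 * patient.get("chronic_conditions", 0)      # per condition
    z += 0.3 * (1 if patient.get("lives_alone") else 0)
    return 1 / (1 + math.exp(-z))  # logistic link

high = readmission_risk({"age": 78, "prior_admissions_12mo": 3,
                         "chronic_conditions": 4, "lives_alone": True})
low = readmission_risk({"age": 40, "prior_admissions_12mo": 0,
                        "chronic_conditions": 0, "lives_alone": False})
```

A hospital could use such scores to flag high-risk patients for extra discharge planning, which is the operational payoff the survey respondents are describing.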
Staffing shortages combined with rising care expectations will drive adoption of AI and automation
Guest post by Alexandra Roden, content editor, Connexica.
Just a few years ago, big data and the Internet of Things (IoT) were largely unheard-of terms. This year they continue to revolutionize technology and the ways in which we acquire and process data, but what do they mean for the healthcare industry?
Xenon Health describes IoT as “a phenomenon through which the operational aspects of the physical world become increasingly integrated with digital platforms, enabling information to move seamlessly toward the computational resources that are able to make sense of it.” Essentially, IoT goes hand-in-hand with the mobile age and the diversity of data that is currently being retrieved from agile and mobile locations.
Big data is a related concept – it addresses the ever-increasing amounts of data that are created every second of every day and recognizes that these figures will only continue to grow. For example, in the “social media minute,” every single minute 277,000 tweets are sent, WhatsApp users share 347,222 photos and Google receives more than 4,000,000 search queries. These figures are remarkable even for those of us caught up in the social media hype, and most shocking of all is the realization that the global Internet population now represents 2.4 billion people. That’s a lot of people creating a lot of data – the question now is how we can utilize this data in a meaningful way.
IoT has revolutionized many industries and will continue to do so in the foreseeable future, but what about healthcare? Organizations within this industry tend to adopt new technologies slowly, relying upon solid evidence and demonstrable impact and efficiency before committing to any such change. The shift towards IoT is, however, beginning to take place, and increasing amounts of available patient data are beginning to inform decision-making processes within this sector.
By Darin M. Vercillo, MD, chief medical officer and co-founder, Central Logic.
Healthcare has been changing rapidly for the last 60 years and advances have now reached record speed, including in the realm of data intelligence. In trying to keep pace as well as to protect and advance their own businesses, many processes and systems have understandably been organized into silos. That era must come to a close.
Care coordination teams need rich, shared data and must now be connected. Hospitals, clinics, home health care workers, primary care physicians, vendors, and others must speak with each other, in the same language, and completely share patient data with an open, collaborative attitude. The industry is all abuzz with this uncharted territory called interoperability. It is clear that data warehouses, now bursting with valuable information, must be streamlined for three very simple reasons: patient safety, cost-effective healthcare delivery and overall population health management. A happy byproduct when data intelligence becomes actionable and systems work collaboratively is a financial benefit, but as a physician, I believe excellent patient care always wins the day, and should be the driving factor.
At the risk of this being looked at as “just a financial issue,” consider also that hospitalization is generally a marker for severe illness. Our goal is a healthier population. As we (patients and providers) succeed collectively with hospital treatment and post-acute care, then re-admissions will naturally decrease, and patients will live healthier, more satisfied lives. Ultimately, this is our goal.
Appropriate, timely sharing of vital patient information will not only address re-admission rates that have clearly become egregious, but will also better inform decision-making at the point of care. Without a keen eye to patient safety and success, it is too easy for details to slip through the cracks. All too often, history has demonstrated that hand-off points are the riskiest for failures in patient care.
Nearly everyone has a story where the current system has failed patients — just ask Jennifer Holmes, our CEO. Her father’s healthcare team made an error in medication that ultimately cost him his life. Similar medication errors can be avoided, and duplicate testing reduced, when a patient’s entire care coordination team has visibility into the data – all the data – to improve care efficiencies and diagnoses.
But all this sharing and playing nice in the sandbox is easier said than done.
Guest post by Lucy Doyle, Ph.D., vice president, data protection, information security and risk management, McKesson, and Karen Smith, J.D., CHC, senior director, privacy and data protection, McKesson.
Today there are opportunities and initiatives to use big data to improve patient care, reduce costs and optimize performance, but there are challenges that must be met. Providers still have disparate systems, non-standard data, interoperability issues and legacy data silos, as well as the implementation of newer technologies. High data quality is critical, especially since the information may be used to support healthcare operations and patient care. The integration of privacy and security controls to support safe data handling practices is paramount.
Meeting these challenges will require continued implementation of data standards, processes, and policies across the industry. Data protection and accurate applications of de-identification methods are needed.
Empowering Data Through Proper De-Identification
Healthcare privacy and security professionals field requests to use patient data for a variety of use cases, including research, marketing, outcomes analysis and analytics for industry stakeholders. The HIPAA Privacy Rule established standards to protect individuals’ individually identifiable health information by requiring safeguards to shield the information and by setting limits and conditions on the uses and disclosures that may be made. It also provided two methods to de-identify data, providing a means to free valuable de-identified patient level information for a variety of important uses.
Depending on the methodology used and how it is applied, de-identification yields quality data that is highly usable, making it a valuable asset to the organization. One of the HIPAA-approved methods to de-identify data is the Safe Harbor method. This method requires removal of 18 specified identifiers, protected health information related to the individual or their relatives, employers or household members. The 18th element requires removal of any other unique characteristic or code that could lead to identifying an individual who is the subject of the information. While determining that the Safe Harbor criteria have been met appears fairly straightforward, doing so properly requires a thorough understanding of how to address certain components, which can be quite complex.
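To make the Safe Harbor idea concrete, here is a minimal, purely illustrative sketch of field-level redaction. The record field names, and which of the 18 identifier categories are covered, are assumptions made for this example; a real implementation must address all 18 categories, including identifiers buried in free text, which is considerably harder.

```python
# Structured fields treated here as direct identifiers (a subset of the
# 18 Safe Harbor categories; names, addresses, dates, SSNs, etc.).
SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "birth_date",
}

def safe_harbor_redact(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    out = {}
    for field, value in record.items():
        if field in SAFE_HARBOR_FIELDS:
            continue  # drop direct identifiers outright
        if field == "age" and isinstance(value, int) and value > 89:
            out[field] = "90+"  # Safe Harbor: ages over 89 must be aggregated
        elif field == "zip":
            # Simplified: Safe Harbor permits only the first 3 ZIP digits,
            # and only where that area holds more than 20,000 people.
            out[field] = str(value)[:3] + "00"
        else:
            out[field] = value
    return out

record = {"name": "Jane Doe", "ssn": "000-00-0000", "age": 93,
          "zip": "60614", "diagnosis_code": "E11.9"}
deidentified = safe_harbor_redact(record)
```

Note that even "kept" fields get transformed: ages over 89 are aggregated and ZIP codes truncated, two of the subtleties that make Safe Harbor less straightforward than it first appears.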
The second de-identification method is the expert method. This involves using a highly skilled specialist who utilizes statistical and scientific principles and methods to determine the risk of re-identification in rendering information not individually identifiable.
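One statistical principle such an expert might apply is k-anonymity: every combination of quasi-identifiers in a released dataset should be shared by at least k records, so that no individual is uniquely distinguishable. The toy check below, with invented records, only illustrates the idea and is in no way a substitute for a full expert determination.

```python
from collections import Counter

def min_group_size(records: list, quasi_identifiers: list) -> int:
    """Size of the smallest equivalence class over the quasi-identifiers."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Invented, already-generalized records (age bands, 3-digit ZIP prefixes).
records = [
    {"age_band": "60-69", "zip3": "606", "sex": "F"},
    {"age_band": "60-69", "zip3": "606", "sex": "F"},
    {"age_band": "60-69", "zip3": "606", "sex": "M"},
    {"age_band": "70-79", "zip3": "606", "sex": "M"},
]

k = min_group_size(records, ["age_band", "zip3", "sex"])
# k == 1: two records are unique on these fields, so this release would
# fail a k >= 2 requirement and need further generalization or suppression.
```

An expert would combine checks like this with knowledge of what external datasets an attacker could link against before declaring the re-identification risk "very small."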
We need to encourage and support educational initiatives within our industry so more individuals become proficient in these complex techniques. At McKesson, we are educating our business units so employees can better understand and embrace de-identification and the value it can provide. This training gives them a basic understanding of how to identify and manage risks as well as how to ensure they are getting quality content.
Embracing Social Media and New and Improved Technologies
One of the challenges we face today in de-identifying data is adapting our mindset and methodologies to incorporate new emerging technologies and the adoption of social media. It is crucial to understand how the released data could potentially be exposed by being combined with other available data. New standards are needed.
While de-identifying data can be challenging and complex, the task is made easier when we remember and adhere to our core directive to safeguard data. With this in mind, incorporating new technologies is part of an ongoing process of review.
When done properly, de-identification enables high quality, usable data, particularly when the expert method is used. De-identification should not be viewed as an obstacle to data usage, but rather as a powerful enabler that opens the door to a wealth of valuable information.
HIMSS organizers, in preparation for the annual conference and trade show and as a way to rally attendees around several trending topics for the coming show, asked the healthcare community how it feels about several key issues. I’ve reached out to readers of this site so they can respond to what they see as the future of healthcare innovation, data security, patient engagement and big data.
Their responses follow.
Do you agree with the following thoughts? If not, why; what’s missing?
Sean Benson, vice president of innovation, clinical solutions, Wolters Kluwer Health: Future innovations in health IT, big data in particular, will focus on the aggregation and transformation of patient data into actionable knowledge that can improve patient and financial outcomes. The volume of patient data contained within disparate clinical systems continues to expand. This siloed data often forces physicians to act on fragmented and incomplete information, making it difficult to apply the latest evidence. Comprehensive solutions will normalize, codify and aggregate patient data in a cloud system and run it against clinical scenarios to create evidence-based advice that is then delivered directly to the point of care via a variety of mobile devices. This will empower physicians with patient-specific knowledge based on the latest medical evidence delivered to the point of care in a timely, appropriate manner, ultimately resulting in higher quality treatment and more complete care.
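The normalize-codify-aggregate-advise pipeline described above can be sketched in deliberately toy form. The local test names, the mapping onto a standard LOINC code, and the clinical threshold are all assumptions made for illustration, not clinical guidance or any vendor's actual implementation.

```python
# Hypothetical alias table: local lab names from two source systems
# mapped onto a standard code (4548-4 is the LOINC code for HbA1c).
LOINC_ALIASES = {"a1c": "4548-4", "hba1c": "4548-4"}

def normalize(results: list) -> dict:
    """Map locally named lab results onto standard codes, keeping the latest."""
    normalized = {}
    for r in sorted(results, key=lambda r: r["date"]):  # ISO dates sort correctly
        code = LOINC_ALIASES.get(r["test"].lower(), r["test"])
        normalized[code] = r["value"]  # later dates overwrite earlier ones
    return normalized

def advise(labs: dict) -> list:
    """Run aggregated data against a (made-up) clinical rule."""
    advice = []
    if labs.get("4548-4", 0) >= 9.0:  # illustrative threshold only
        advice.append("HbA1c elevated: review diabetes management plan")
    return advice

labs = normalize([
    {"test": "A1C", "value": 9.4, "date": "2018-01-10"},    # system A naming
    {"test": "HbA1c", "value": 8.1, "date": "2017-06-02"},  # system B naming
])
recommendations = advise(labs)
```

The point of the sketch is the shape of the pipeline: without the normalization step, the two systems' differently named results would never be recognized as the same measurement, and the advice rule would fire on stale or missing data.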
Susan Reese, MBA, RN, CPHIMS, chief nurse executive, Kronos Incorporated: Gamification — the trend of creating computer-based employee games and contests for the purpose of aligning employee productivity with the organization’s goals — is currently a popular topic with business leaders and IT. For proof, consider that Gartner recently projected that by 2015, 50 percent of all organizations will be using gamification of some kind, and that by 2016, businesses will spend a total of $2.6 billion on this technology.
With numbers like these, it is clear that gaming is serious business and that it is here to stay. But at this point, you may be asking yourself, “Could gamification work in my healthcare environment? What potential benefits could it have?”
Today, many healthcare organizations are looking to the future and considering gamification as a way to increase employee engagement, collaboration, and productivity as well as to align their behavior with larger business goals – but they don’t know how to do it quite yet. Also, gamification can be a delicate decision, complete with advantages and risks. After all, employees’ day-to-day work responsibilities and careers are not games and can’t be trivialized. Healthcare organizations must be careful to avoid sending the wrong message to their workforce, or the whole program could backfire, leading to negative consequences.
Mike Lanciloti, vice president of product management and marketing, Spectralink: In today’s digital age, healthcare IT needs to come a long way to get up to speed in innovation and connectivity. However, as we begin to see mobile play a larger role in the industry, healthcare is moving the needle on innovation as well.
The mobile revolution has picked up in healthcare for both health IT professionals and patient care, primarily as healthcare providers find ways to utilize smartphones, mobile devices and Wi-Fi networks to improve the communication and efficiency of their workforce.
Through mobile devices, clinicians have the ability to access what they need, when they need it. Mobile devices ensure nurses and mobile staff are equipped with the right technology to promote timely, efficient and reliable communication. This not only allows healthcare professionals to perform their jobs more effectively but also helps deliver a higher quality of patient care.
The growing mobile trend does present several questions for the industry. Hospital managers are quickly learning that an influx of smartphones into the hospital setting can become a larger problem than anticipated. Not only do personal devices lack the security required for enterprise-owned devices, they pose other risks, calling into question issues surrounding encryption, authorized access and mobile security. Personal phones aren’t designed to be equipped with the same encryption capabilities as enterprise-owned mobile devices.
Dell unveils findings from its first Global Technology Adoption Index (GTAI), uncovering how organizations are truly using security, cloud, mobility and big data to drive success. The market research surveyed more than 2,000 global organizations and found that security is the biggest concern in adopting cloud, mobility and big data. Furthermore, while 97 percent of organizations surveyed use or plan to use cloud and nearly half have implemented a mobility strategy, big data adoption is trailing as approximately 60 percent of organizations surveyed do not know how to gain its insights.
“We know that security, cloud, mobility and big data are the top IT priorities in all industries, but we need a deeper understanding of the practical realities of how companies are using these technologies today and what, if anything, is preventing them from unleashing their full potential,” said Karen Quintos, chief marketing officer, Dell. “This research cuts through the hype and provides a clearer roadmap for how Dell can enable our customers to thrive.”
“Despite mounting security risks and increased reliance on the Internet and technology to run their businesses, many small and midsize organizations are underprepared to deal with today’s security threats, let alone those of the future,” said Laurie McCabe, partner, SMB Group. “These companies know that disruptive technologies like cloud, mobility and big data can drive innovation and create competitive advantage. But it’s often difficult for them to take a strategic approach and overcome security concerns in order to fully harness the potential.”
Security Concerns Are Creating Big Barriers
The Dell GTAI found that IT decision-makers still consider security the biggest barrier for expanding mobility technologies (44 percent), using cloud computing (52 percent) and leveraging big data (35 percent). While security concerns are holding organizations back from further investing in major technologies, a lack of readily available security information is similarly preventing organizations from being prepared during a security breach. Only 30 percent of respondents said they have the right information available to make risk-based decisions, and only one in four organizations surveyed actually has a plan in place for all types of security breaches.
The security barrier becomes even more serious as the C-suite becomes less engaged. Only 28 percent of organizations polled have a C-suite mindset that is fully engaged with security initiatives. However, in organizations where executive leadership is involved in security, confidence is markedly increased. Among organizations that are very confident in their security, 84 percent of senior leaders are fully or somewhat engaged, compared to only 43 percent of senior leaders at organizations who are not confident in their security.
Other significant Dell GTAI security findings include:
MeriTalk, a public-private partnership focused on improving the outcomes of health and government IT, announces the results of its new study, “FutureCare: Cloud, Big Data, Mobile, and Social Optimize the EMR.” The report, sponsored by EMC Corporation, explores how FutureCare-enabling technologies — cloud, big data, mobile and social — are driving profound change and how deployment of these tools can help optimize electronic health records for improved patient care coordination.
The report purports to reveal that while many providers have implemented or plan to implement these technologies in the next two years, 96 percent of healthcare organizations say their infrastructure is not fully prepared for the evolution of their EHR today.
Health IT leaders have started to adopt FutureCare technologies. Two-thirds of healthcare providers run EHR applications in the cloud, with the majority currently using private cloud models (49 percent), followed by hybrid and public clouds (35 percent).
Healthcare providers are also using big data and analytics in conjunction with their EHR with 50 percent saying big data is helping them to reduce re-admissions and track and evaluate patient outcomes more effectively. Providers are also using big data to conduct cost/benefit analysis to reduce project risk (46 percent), manage clinical and IT staffing levels (38 percent) and prescribe preventive care (24 percent).
Guest post by Anil Jain, MD, FACP, chief medical officer, Explorys, and staff, Department of Internal Medicine, Cleveland Clinic.
Nearly every aspect of our lives has been touched by advances in information technology, from searching to shopping and from calling to computing. Given the significant economic implications of spending 18 percent of our GDP, and the lack of a proportional impact on quality, there has been a concerted effort to promote the use of health information technology to drive better care at a lower cost. As part of the 2009 American Recovery and Reinvestment Act (ARRA), the Health Information Technology for Economic and Clinical Health (HITECH) Act incentivized the acquisition and adoption of the “meaningful use” of health IT.
Even prior to the HITECH Act, patient care had been profoundly impacted by the use of health information technology. Over the last decade we had seen significant adoption of electronic health records (EHRs), use of patient portals, creation of clinical data repositories and deployment of population health management (PHM) platforms — this has been accelerated even more over the last several years. These health IT tools have given rise to an environment in which providers, researchers, patients and policy experts are empowered for the first time to make clinically enabled, data-driven decisions not only at the population level but also at the individual level. Not only did the 2010 Affordable Care Act (ACA) reform insurance, but it also created incentive structures for payment reform models for participating health systems. The ability to assume risk on reimbursement requires leveraging clinical and claims data to understand the characteristics and needs of the contracted population. With this gradual shift of risk moving from health plans and payers to the provider, the need to empower providers with health IT tools is even more critical.
Many companies such as Explorys, a big data health analytics company spun out from the Cleveland Clinic in 2009, experienced significant growth because of the need to integrate, aggregate and analyze large amounts of information to make the right decision for the right patient at the right time. While EHRs are the workflow tool of choice at the point-of-care, an organization assuming both the clinical and financial risk for its patients/members needs a platform that can aggregate data from disparate sources. The growth of value-based care arrangements is increasing at a staggering rate – many organizations estimate that by 2017, approximately 15 percent to 20 percent of their patients will be in some form of risk-sharing arrangement, such as an Accountable Care Organization (ACO). Already today, there are several hundred commercial and Medicare-based ACOs across the U.S.
There is no doubt that there are operational efficiencies gained in a data-driven health system, such as better documentation, streamlined coding, less manual charting, scheduling and billing, etc. But the advantages of having data exhaust from health IT systems, when done with the patient in mind, extend to clinical improvements in care as well. We know that data-focused health IT is a necessary component of the “triple aim.” Coined by Dr. Donald Berwick, former administrator of the Centers for Medicare and Medicaid Services (CMS), the “triple aim” consists of the following goals: 1) improving the health and wellness of the individual; 2) improving the health and wellness of the population; and 3) reducing the per-capita health care cost. To achieve these noble objectives, providers need to use evidence-based guidelines to do the right thing for the right patient at the right time; provide transparency to reduce unnecessary or wasteful care across patients; provide predictive analytics to prospectively identify patients from the population that need additional resources; and finally, use big data to inform and enhance net new knowledge discovery.