By James D’Arezzo, CEO of Condusiv Technologies.
While advancements in data collection, analytics and electronic health records can lead to better healthcare, they also create challenges for the healthcare sector's IT departments, because serious obstacles exist in terms of systems management and capacity. The sheer amount of healthcare data has skyrocketed from even a decade ago. Through advanced predictive analytics, this data can save lives by fostering the diagnosis, treatment and prevention of disease at a highly personalized level.
To maximize the benefits all of this information can offer, healthcare organizations will need to make significant investments in data storage and infrastructure. Yet with simple software fixes, many healthcare IT departments could free up as much as half of their I/O bandwidth, effectively doubling the capacity their budgets already pay for, by using the infrastructure already in place more efficiently.
The health data tsunami
Healthcare institutions must comply with 629 different regulatory mandates across nine domains, at a cost to the average community hospital of between $7.6 million and $9 million. Much of that spending is associated with meaningful-use requirements, the government standards for how patient records and data are stored and transmitted. In 2016 alone, the average hospital spent $760,000 on meaningful-use requirements and invested an average of $411,000 in hardware and software upgrades for its records systems.
Because of the demands of healthcare record-keeping and continued advancements in medical technology, IT spending is rising rapidly. At the same time, medical research and development is booming to the point that institutions can't keep up with the amount of data that needs to be stored and analyzed. Pharmaceutical and healthcare systems developers are also affected by the gap between data acquisition and analysis, as life sciences companies launch products faster and in a greater number of therapy areas.
This fast-paced technological evolution places even more pressure on healthcare IT departments to deliver both innovation and efficiency.
Performance degradation occurs over time as the efficiency of input/output (I/O) — the movement of data between the storage layer and the compute and presentation layers — declines. This degradation is particularly prevalent in the Windows environment. Fortunately, targeted software solutions exist that can improve system throughput by up to 50 percent without additional hardware.
If I/O drags, performance across the entire system slows. This primarily impacts machines running Microsoft SQL Server, one of the most widely used databases in the world, and the Windows operating system is also notoriously inefficient with I/O. In fact, I/O degradation is far more common than most organizations realize: more than a quarter of organizations surveyed last year reported that poor performance from I/O-heavy applications was slowing their systems down.
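The article doesn't prescribe a diagnostic method, but as a rough illustration of how an administrator might spot I/O drag before buying hardware, here is a minimal Python sketch that times synchronous small-block writes to disk. The function name and parameters are hypothetical choices for this example, not anything from the vendors or tools mentioned above; rising per-block latency on the same workload over time is one simple symptom of the degradation described here.

```python
# Illustrative sketch: measure average synchronous write latency,
# a basic indicator of storage I/O health.
import os
import tempfile
import time


def measure_write_latency(block_size: int = 4096, blocks: int = 256) -> float:
    """Write `blocks` blocks of `block_size` bytes, forcing each to disk,
    and return the average per-block latency in milliseconds."""
    payload = os.urandom(block_size)
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(payload)
                f.flush()
                os.fsync(f.fileno())  # force the block to physical storage
        elapsed = time.perf_counter() - start
    finally:
        os.remove(path)
    return (elapsed / blocks) * 1000.0


if __name__ == "__main__":
    latency_ms = measure_write_latency()
    print(f"Average synchronous 4 KiB write latency: {latency_ms:.3f} ms")
```

Tracking a number like this across weeks on the same server gives a crude baseline: if the same test gets steadily slower, the bottleneck is in the I/O path rather than CPU or memory.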