Tag: Condusiv Technologies

Measles Epidemic Exposes Performance Gaps In U.S., Global Health Data Analytics

Measles continues to spread in the United States as health officials seek to stem the worst outbreak of the disease in decades. More than 700 cases have now been reported, about half of them involving children under the age of five.1

James D’Arezzo, CEO of Condusiv Technologies, says, “It is the job of our healthcare database networks to map a situation like this in order to help caregivers control it.” D’Arezzo, whose company is the world leader in I/O reduction and SQL database performance, adds, “Unfortunately, some pieces of this network are missing, and a number of others don’t work very well.”

Experts in the field agree. According to a recent report by a team of scientists led by the National Institutes of Health, analysis of data derived from electronic health records, social media and other sources has the potential to provide more timely and detailed information on infectious disease outbreaks than traditional methods, but significant challenges remain to be overcome. Big data offers a “tantalizing opportunity” to predict and track infectious outbreaks, yet healthcare’s ability to use it for such purposes is decades behind that of fields like climatology and marketing.2

Nonetheless, progress in data sharing has been made. State, local, and territorial health departments now have access to healthcare-associated infections data reported in their jurisdictions to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN). Thirty-three states and the District of Columbia now use NHSN for that purpose.3

However, D’Arezzo notes, this data originates in a multitude of far-flung healthcare organizations’ IT systems. To be usable, it must be pulled together through millions of individual input-output (I/O) operations. The system’s analytic capability depends on the efficiency of those operations, which in turn depends on the efficiency of the computer’s operating environment. The most widely used operating system, Microsoft Windows, is in many ways the least efficient; the average Windows-based system pays a 30 percent to 40 percent penalty in overall throughput capability due to I/O degradation.4
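
To see why I/O efficiency, rather than raw compute power, so often becomes the limiting factor when records are pulled from many remote systems, consider a rough back-of-the-envelope calculation. The figures below (record count, per-operation latency, batch size) are illustrative assumptions, not measurements from the article:

```python
# Rough illustration: when analytics must touch millions of records spread
# across remote systems, per-operation I/O latency dominates total run time.
# All numbers here are assumed for the sake of the example.

RECORDS = 5_000_000            # hypothetical records to aggregate
PER_OP_LATENCY_S = 0.002       # assumed 2 ms per individual I/O round trip
BATCH_SIZE = 1_000             # records retrieved per batched I/O request

one_by_one = RECORDS * PER_OP_LATENCY_S              # one I/O per record
batched = (RECORDS / BATCH_SIZE) * PER_OP_LATENCY_S  # one I/O per batch

print(f"One I/O per record : {one_by_one / 3600:5.2f} hours")
print(f"Batched I/O        : {batched / 60:5.2f} minutes")
```

Under these assumptions, the same analysis takes hours when every record requires its own round trip but only seconds when requests are batched, which is why inefficient I/O handling drags down the whole analytic pipeline.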

Optimizing System Infrastructure Is Healthcare Data’s Most Urgent Issue

In a drive for standardization, the proposed U.S. Core Data for Interoperability1, mandated by the 21st Century Cures Act2, calls for the identification and refinement of a basic set of clinical data to be required for all electronic health records. The American Medical Informatics Association, however, recommends that healthcare researchers and providers concentrate on sharing data as quickly as possible, deferring the creation of standards until sometime in the future.3 “This is a false choice and a distraction,” said James D’Arezzo, CEO of Condusiv Technologies.

D’Arezzo, whose company is the world leader in I/O reduction and SQL database performance, adds, “What the healthcare industry really needs to focus on is enabling its heavily overburdened IT infrastructure to do the job it’s being asked to do.”

The industry’s preoccupation with interoperability and standardization, notes D’Arezzo, is perfectly understandable. Turf wars over proprietary interfaces and protocols are having a major impact on healthcare IT budgets.4 Incompatible electronic health records contribute significantly to the fact that computerized record keeping consumes more than 50 percent of the average physician’s workday, which now stretches to more than 11 hours.6 Healthcare organizations struggling to process this tsunami of data are frustrated by the number and variety of analytics tools they are forced to use.6

Supporting all this activity, however—unnoticed and, says D’Arezzo, dangerously neglected—is the basic computational machinery itself. Data analytics requires a computer system to access multiple and often far-flung databases, pulling information together through millions of individual input-output (I/O) operations. The system’s analytic capability is dependent on the efficiency of those operations, which in turn is dependent on the efficiency of the computer’s operating environment.

According to experts, the most widely used operating system, Microsoft Windows, is in many ways the least efficient. In any storage environment, from multi-cloud deployments to a PC hard drive, Windows penalizes performance because of inefficiencies in the handoff of data from server to storage, a problem that, left untreated, worsens over time. The average Windows-based system pays a 30 percent to 40 percent penalty in overall throughput capability because of I/O degradation.7
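
The shape of the I/O itself matters as much as the volume. As a rough illustration of why many small, scattered writes cost more than fewer large ones, the sketch below writes the same total amount of data both ways and compares throughput. It is a minimal, generic microbenchmark under assumed file names and chunk sizes, not Condusiv’s methodology or a Windows-specific measurement:

```python
# Minimal microbenchmark: write the same total volume as many 4 KB chunks and
# as fewer 1 MB chunks, then compare throughput. Paths and sizes are arbitrary.
import os
import time

TOTAL_BYTES = 256 * 1024 * 1024   # 256 MB written in each scenario
SMALL_CHUNK = 4 * 1024            # 4 KB writes: many individual I/O operations
LARGE_CHUNK = 1024 * 1024         # 1 MB writes: far fewer I/O operations

def timed_write(path: str, chunk_size: int) -> float:
    """Write TOTAL_BYTES to path in chunks of chunk_size; return MB/s."""
    payload = os.urandom(chunk_size)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(TOTAL_BYTES // chunk_size):
            f.write(payload)
        f.flush()
        os.fsync(f.fileno())      # force the data all the way to the device
    elapsed = time.perf_counter() - start
    os.remove(path)
    return (TOTAL_BYTES / (1024 * 1024)) / elapsed

if __name__ == "__main__":
    small = timed_write("io_test_small.bin", SMALL_CHUNK)
    large = timed_write("io_test_large.bin", LARGE_CHUNK)
    print(f"4 KB writes : {small:8.1f} MB/s")
    print(f"1 MB writes : {large:8.1f} MB/s")
    print(f"Throughput lost to small, fragmented I/O: {100 * (1 - small / large):.0f}%")
```

The exact gap varies by hardware and file system, but the pattern, many small operations costing measurably more than a few large ones, is the same behavior that fragmented I/O imposes on busy database servers.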

Precision Medicine, Electronic Health Records Hinge On Data Storage and Analysis

By James D’Arezzo, CEO of Condusiv Technologies.

While advancements in data collection, analytics and electronic health records can lead to better healthcare, they also create challenges for the healthcare sector’s IT departments, which face serious obstacles in systems management and capacity. The sheer volume of healthcare data has skyrocketed compared with even a decade ago. Through advanced predictive analytics, this data can save lives by fostering the diagnosis, treatment and prevention of disease at a highly personalized level.

To maximize the benefits all of this information can offer, healthcare organizations will need to make significant investments in data storage and infrastructure. In the meantime, with simple software fixes, many healthcare IT departments could free up half their bandwidth, effectively doubling what their existing IT budgets deliver, by making more efficient use of the infrastructure already in place.

The health data tsunami

Healthcare institutions must comply with 629 discrete regulatory requirements across nine domains, at a cost to the average community hospital of between $7.6 million and $9 million. Much of that spending is associated with meaningful use requirements, the government standards for how patient records and data are stored and transmitted. In 2016 alone, the average hospital spent $760,000 on meaningful use requirements and invested an average of $411,000 in hardware and software upgrades for its records systems.

Because of the demands of healthcare record-keeping and continued advancements in medical technology, IT spending is rising sharply. At the same time, medical research and development is booming to the point that institutions cannot keep up with the volume of data that must be stored and analyzed. Pharmaceutical and healthcare systems developers are also affected by the gap between data acquisition and analysis. Life sciences companies are launching products faster and in a greater number of therapy areas.

This fast-paced technological evolution places even more pressure on healthcare IT departments to deliver both innovation and efficiency.

I/O performance

Performance degrades over time as the input/output (I/O) movement of data between the storage layer and the compute and presentation layers becomes less efficient. This degradation is particularly prevalent in the Windows environment. Fortunately, targeted software solutions exist that can improve system throughput by up to 50 percent without additional hardware.

If I/O drags, performance slows across the entire system, which primarily affects systems running Microsoft SQL Server (the most widely used database platform in the world). The Microsoft operating system is also notoriously inefficient with I/O. In fact, I/O degradation is far more common than most organizations realize: more than a quarter of organizations surveyed last year reported that poor performance from I/O-heavy applications was slowing their systems down.
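
One concrete way to check whether I/O is the drag on a SQL Server instance is to look at the I/O stall times the engine already records. The sketch below is a minimal example that queries the built-in sys.dm_io_virtual_file_stats view through the pyodbc driver; the connection string, the 20 ms threshold and the reporting format are assumptions for illustration, not part of any vendor’s tooling:

```python
# Report average I/O stall (latency) per operation for every database file on
# a SQL Server instance, using the engine's own sys.dm_io_virtual_file_stats.
# The connection string is a placeholder and must be adapted to your server.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-sql-server;DATABASE=master;Trusted_Connection=yes;"
)

QUERY = """
SELECT DB_NAME(vfs.database_id)                     AS database_name,
       mf.physical_name                             AS file_name,
       vfs.num_of_reads + vfs.num_of_writes         AS total_io,
       vfs.io_stall_read_ms + vfs.io_stall_write_ms AS total_stall_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id;
"""

def report_io_stalls() -> None:
    """Print average stall per I/O for each file; flag anything over 20 ms."""
    with pyodbc.connect(CONN_STR) as conn:
        for db_name, file_name, total_io, stall_ms in conn.execute(QUERY):
            if total_io == 0:
                continue
            avg_ms = stall_ms / total_io
            flag = "  <-- investigate" if avg_ms > 20 else ""
            print(f"{db_name:20s} {file_name:50s} {avg_ms:6.1f} ms/IO{flag}")

if __name__ == "__main__":
    report_io_stalls()
```

Files showing tens of milliseconds of average stall per operation are a sign that the storage path, not the query plan, is where the time is going.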

Blockchain Not a Cure for EHR Interoperability

By James D’Arezzo, CEO, Condusiv Technologies.

Much has been written about the prospect of using blockchain technology as a key component of achieving EHR interoperability. It has been widely reported that 55 percent of surveyed hospitals indicated a desire to initiate some sort of blockchain program within the next two years.

However, as with many game-changing approaches, the devil is in the details. Blockchain technology presents a huge challenge in terms of its impact on the data center, whether that data center is on premises, in the cloud or in a hybrid cloud configuration. The tsunami of blockchain data added to the already overwhelming amount of required information could swamp a healthcare organization. In addition, the resulting performance decline may negate the positive aspects of using blockchain for EHR interoperability.

Let’s first look at the positives. As defined by the Office of the National Coordinator for Health Information Technology (ONC), interoperability in health information technology includes three specific functions. First, it involves the secure exchange of electronic health information without special user effort. Second, it “allows for complete access, exchange, and use of all electronically accessible health information for authorized use under applicable state or federal law.” Third, it prohibits information blocking, the act of “knowingly and unreasonably” interfering with the exchange and use of electronic health information. If implemented properly, blockchain can meet these requirements exceptionally well.

Now for the challenge. Blockchain is slow. In the most recent available study, the Bitcoin network, the largest and most widely tested application of blockchain technology, achieved a maximum throughput nearly 50 times lower than PayPal’s and 14,000 times lower than VisaNet’s. If blockchain-based applications are layered on top of the already staggering data-handling load that healthcare IT carries today, the danger of major system slowdowns, and even system crashes, will increase dramatically. In a healthcare environment, such IT failures could have life-or-death consequences.
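
The scale of that gap is easier to see with simple arithmetic. In the sketch below, the absolute Bitcoin figure (roughly 7 transactions per second) is a commonly cited estimate and is an assumption here; the PayPal and VisaNet numbers are derived purely from the “nearly 50 times” and “14,000 times” ratios quoted above, and the daily record volume is hypothetical:

```python
# Back-of-the-envelope comparison of how long each network would need to clear
# a hypothetical day's worth of EHR transactions. All inputs are assumptions.

BITCOIN_TPS = 7                      # assumed maximum throughput, tx/sec
PAYPAL_TPS = BITCOIN_TPS * 50        # "nearly 50 times" Bitcoin
VISANET_TPS = BITCOIN_TPS * 14_000   # "14,000 times" Bitcoin

DAILY_RECORDS = 1_000_000            # hypothetical EHR transactions per day

for name, tps in [("Bitcoin-style blockchain", BITCOIN_TPS),
                  ("PayPal", PAYPAL_TPS),
                  ("VisaNet", VISANET_TPS)]:
    hours = DAILY_RECORDS / tps / 3600
    print(f"{name:25s} {tps:>8,} tx/s -> {hours:8.2f} hours for 1M records")
```

Under these assumptions, a Bitcoin-style chain needs well over a day to process what conventional payment networks clear in under an hour, which is the performance cliff a blockchain-backed EHR layer would have to avoid.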
