
Classic Textbook on Critical Care, The ICU Book by Paul Marino, M.D., Ph.D., Recommends Direct Measurement of Blood Volume in Evaluating Critically Ill Patients

NEW YORK, NY -- (Marketwired) -- 01/07/14 -- Daxor Corporation (NYSE MKT: DXR)

The most widely read textbook in critical care medicine is The ICU Book (ICU: intensive care unit) by Paul Marino, M.D. The fourth edition of this book has just been published by Lippincott Williams & Wilkins and contains important information about blood volume measurement. Dr. Marino, who is on the staff of Cornell Medical School, is an internationally recognized authority on critical care medicine.

The previous third edition was published in 2007. The fourth edition contains important new information about the use of blood volume measurement. In a chapter on hypovolemia (low blood volume), Dr. Marino writes: "Blood volume measurements have traditionally required too much time to perform to be clinically useful in an ICU setting, but this has changed with the introduction of a semi-automated blood volume analyzer (Daxor Corporation, New York, NY) that provides blood volume measurements in less than an hour. Blinded measurements of blood, red cell, and plasma volumes were performed in patients with circulatory shock who were managed with pulmonary artery catheters, and the results show that blood and plasma volumes were considerably higher than normal. When blood volume measurements were made available for patient care, 53% of the measurements led to a change in fluid management, and this was associated with a significant decrease in mortality rate (from 24% to 8%). These results will require corroboration, but they highlight the limitations of the clinical assessment of blood volume, and the potential for improved outcomes when blood volume measurements are utilized for fluid management."

Dr. Marino's book cited a study by Dr. Mihae Yu and included a graph of her research, which was published in Shock (A Prospective Randomized Trial Using Blood Volume Analysis in Addition to Pulmonary Artery Catheter, Compared with Pulmonary Artery Catheter Alone, to Guide Shock Resuscitation in Critically Ill Surgical Patients; Shock, Vol. 35, No. 3, pp. 220-228, 2011). This landmark study by Dr. Yu enrolled 100 critically ill patients in the ICU: 50 were treated on the basis of a blood volume measurement plus a pulmonary artery catheter (PAC), and 50 were treated on the basis of the PAC alone, without knowledge of the blood volume measurement. The patients treated on the basis of a blood volume measurement had an 8% mortality rate vs. a 24% mortality rate in the patients treated without knowledge of the blood volume measurement.

The most common laboratory tests used to evaluate a patient's blood volume are the hematocrit and hemoglobin tests. These tests measure only the concentration of red blood cells, not the volume of the patient's blood. Dr. Marino's book contains the following statement: "The use of the hematocrit (and hemoglobin concentration) to evaluate the presence and severity of acute blood loss is both common and inappropriate. Changes in hematocrit show a poor correlation with blood volume deficits and erythrocyte deficits in acute hemorrhage. Acute blood loss involves the loss of whole blood, which results in proportional decreases in the volume of plasma and erythrocytes. As a result, acute blood loss results in a decrease in blood volume but not a decrease in hematocrit. (There is a small dilutional effect from transcapillary refill in acute blood loss, but this is usually not enough to cause a significant decrease in hematocrit.) In the absence of volume resuscitation, the hematocrit will eventually decrease because hypovolemia activates the renin-angiotensin-aldosterone system, and the renal retention of sodium and water that follows will have a dilutional effect on the hematocrit. This process begins 8 to 12 hours after acute blood loss, and can take a few days to become fully established." That statement is based on original research by Drs. S. Oohashi and H. Endoh, who compared physicians' assessments of patients' blood volumes with actual blood volume measurements and found them to be widely disparate. Studies from Columbia Presbyterian Medical Center have likewise demonstrated that physicians treating heart failure patients were correct only 51% of the time when evaluating a patient's blood volume status using the usual clinical parameters and laboratory tests.

Dr. Marino's chapter focused particularly on the detection of hypovolemia (low blood volume). He cited studies that used invasive procedures such as pulmonary artery catheterization and central venous pressure measurement to assess a patient's blood volume. Previous studies have demonstrated, in situations where blood volume measurements were actually made, that such assessments are frequently wrong.

Dr. Feldschuh, president of Daxor and a board-certified cardiologist, noted that it is truly tragic that the overwhelming majority of patients treated in intensive care units are not treated on the basis of actual blood volume measurements. Instead, they are treated on the basis of tests such as the hematocrit and hemoglobin, and of invasive procedures such as pulmonary artery catheterization, which multiple studies have shown to be inaccurate for evaluating a patient's blood volume.

Dr. Feldschuh stated that it is unfortunate that thousands of patients die every year because inaccurate assessments of blood volume lead to incorrect treatment. Hematocrit and hemoglobin tests measure only the concentration of red blood cells in a patient; they do not measure a patient's total blood volume. The BVA-100 can measure blood volume to an accuracy of 98%.

Dr. Marino closed this particular chapter with the following comments: "The clinical evaluation of intravascular volume, including the use of central venous pressure (CVP) measurements, is so flawed it has been called a 'comedy of errors'" and "Direct measurements of blood volume are clinically feasible, but are underutilized."

In the past week, extensive publicity was given to the case of a 13-year-old girl in California who had a tonsillectomy and was said to have bled extensively. Unfortunately, her blood loss was not properly replaced and she was pronounced brain dead. This type of case occurs by the thousands every year. In addition to patients who die from inadequate treatment, thousands more suffer heart attacks and strokes, and many patients deprived of brain oxygen ultimately develop dementia. Lack of oxygen to the brain is well known to irreversibly damage and destroy brain cells.

The knowledge of how to perform accurate blood volume measurement has existed for more than 70 years, and an automated blood volume measurement instrument has been available for more than 10 years. The excuse for not performing blood volume measurement has always been that the test is too difficult to perform accurately and takes 4 to 8 hours. The BVA-100 has automated most of the procedure and enables a blood volume measurement to be completed in under an hour. Inability to obtain rapid blood volume measurements should no longer be an excuse for treating critically ill patients with blood volume derangements on the basis of tests that are known to be inaccurate and may result in irreversible harm and even death.

Dr. Feldschuh will be attending the annual meeting on critical care medicine in San Francisco, January 8-13, 2014. This is the main annual meeting of physicians specializing in intensive care unit medicine.

