Classic Textbook on Critical Care, The ICU Book by Paul Marino, M.D., Ph.D., Recommends Direct Measurement of Blood Volume in Evaluating Critically Ill Patients

NEW YORK, NY -- (Marketwired) -- 01/07/14 -- Daxor Corporation (NYSE MKT: DXR)

The most widely read textbook in critical care medicine is The ICU Book (ICU: intensive care unit) by Paul Marino, M.D. The fourth edition of this book has just been published by Lippincott Williams & Wilkins and contains important information about blood volume measurement. Dr. Marino, who is on the staff of Cornell Medical School, is an internationally recognized authority on critical care medicine.

The previous third edition was published in 2007. The fourth edition contains important new information about the use of blood volume measurement. In a chapter on hypovolemia (low blood volume), Dr. Marino writes: "Blood volume measurements have traditionally required too much time to perform to be clinically useful in an ICU setting, but this has changed with the introduction of a semi-automated blood volume analyzer (Daxor Corporation, New York, NY) that provides blood volume measurements in less than an hour. Blinded measurements of blood, red cell, and plasma volumes were performed in patients with circulatory shock who were managed with pulmonary artery catheters, and the results show that blood and plasma volumes were considerably higher than normal. When blood volume measurements were made available for patient care, 53% of the measurements led to a change in fluid management, and this was associated with a significant decrease in mortality rate (from 24% to 8%). These results will require corroboration, but they highlight the limitations of the clinical assessment of blood volume, and the potential for improved outcomes when blood volume measurements are utilized for fluid management."

Dr. Marino's book cited a study by Dr. Mihae Yu and included a graph of her research, which was published in Shock (A Prospective Randomized Trial Using Blood Volume Analysis in Addition to Pulmonary Artery Catheter, Compared with Pulmonary Artery Catheter Alone, to Guide Shock Resuscitation in Critically Ill Surgical Patients; Shock, Vol. 35, No. 3, pp. 220-228, 2011). This landmark study by Dr. Yu enrolled 100 critically ill patients in the ICU: 50 were treated on the basis of a blood volume measurement plus a pulmonary artery catheter (PAC), and 50 were treated on the basis of the PAC alone, without knowledge of the blood volume measurement. The patients treated on the basis of a blood volume measurement had an 8% mortality rate, versus a 24% mortality rate in the patients treated without knowledge of the blood volume measurement.

The most common laboratory tests used to evaluate a patient's blood volume are the hematocrit and hemoglobin tests. These tests only measure the concentration of red blood cells, not the volume of the patient's blood. Dr. Marino's book contains the following statement: "The use of the hematocrit (and hemoglobin concentration) to evaluate the presence and severity of acute blood loss is both common and inappropriate. Changes in hematocrit show a poor correlation with blood volume deficits and erythrocyte deficits in acute hemorrhage. Acute blood loss involves the loss of whole blood, which results in proportional decreases in the volume of plasma and erythrocytes. As a result, acute blood loss results in a decrease in blood volume but not a decrease in hematocrit. (There is a small dilutional effect from transcapillary refill in acute blood loss, but this is usually not enough to cause a significant decrease in hematocrit.) In the absence of volume resuscitation, the hematocrit will eventually decrease because hypovolemia activates the renin-angiotensin-aldosterone system, and the renal retention of sodium and water that follows will have a dilutional effect on the hematocrit. This process begins 8 to 12 hours after acute blood loss, and can take a few days to become fully established." That statement is based on original research by Drs. S. Oohashi and H. Endoh, who compared physicians' assessments of patients' blood volumes with actual blood volume measurements and found them to be highly discrepant. Earlier studies from Columbia Presbyterian Medical Center similarly demonstrated that physicians treating heart failure patients, using the usual clinical parameters and laboratory tests, correctly assessed a patient's blood volume status only 51% of the time.
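The arithmetic behind Dr. Marino's point can be sketched in a few lines. The numbers below are illustrative round figures (a hypothetical 5.0 L blood volume at 45% hematocrit), not patient data: losing whole blood removes red cells and plasma in proportion, so the hematocrit (a ratio) is unchanged even though total volume has fallen, and only later dilution by retained fluid lowers it.

```python
def hematocrit(red_cell_volume_l, total_blood_volume_l):
    """Hematocrit = red cell volume as a fraction of total blood volume."""
    return red_cell_volume_l / total_blood_volume_l

# Baseline (illustrative): 5.0 L total blood volume, 45% hematocrit.
total = 5.0
red_cells = 0.45 * total           # 2.25 L of red cells
plasma = total - red_cells         # 2.75 L of plasma

# Acute hemorrhage of 1.0 L of whole blood removes red cells and plasma
# in the same 45/55 proportion, so the ratio is unchanged.
loss = 1.0
red_cells_after = red_cells - loss * hematocrit(red_cells, total)  # 1.80 L
total_after = total - loss                                         # 4.00 L

print(hematocrit(red_cells, total))              # 0.45 before the bleed
print(hematocrit(red_cells_after, total_after))  # still 0.45 after losing 20% of blood volume

# Only when renal sodium/water retention later adds plasma (dilution)
# does the hematocrit fall -- e.g., 0.8 L of retained fluid:
retained = 0.8
print(round(hematocrit(red_cells_after, total_after + retained), 3))  # 0.375
```

The key point the sketch makes concrete: a patient can lose a fifth of their blood volume while the hematocrit reads exactly normal, which is why a concentration test cannot substitute for a direct volume measurement.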

Dr. Marino's chapter focused particularly on the detection of hypovolemia (low blood volume). He cited studies which used invasive procedures, such as pulmonary artery catheterization (PAC) and central venous pressure (CVP) measurement, to assess a patient's blood volume. Previous studies have demonstrated that, in situations where blood volume measurements were actually made, such assessments were frequently wrong.

Dr. Feldschuh, the president of Daxor and a board-certified cardiologist, noted that it is truly tragic that the overwhelming majority of patients treated in intensive care units are not treated on the basis of actual blood volume measurements. Instead, they are treated on the basis of inaccurate tests such as hematocrits and hemoglobins, and invasive procedures such as pulmonary artery catheterization, which multiple studies have shown to be inaccurate for evaluating a patient's blood volume.

Dr. Feldschuh stated that it is unfortunate that thousands of patients die every year because they are treated incorrectly due to inaccurate assessments of blood volume. Hematocrit and hemoglobin tests only measure the concentration of red blood cells in a patient; they do not measure a patient's total blood volume. The BVA-100 can measure blood volume to an accuracy of 98%.

Dr. Marino closed this particular chapter with the following comments: "The clinical evaluation of intravascular volume, including the use of central venous pressure (CVP) measurements, is so flawed it has been called a 'comedy of errors'" and "Direct measurements of blood volume are clinically feasible, but are underutilized."

This past week, extensive publicity was given to the case of a 13-year-old girl in California who had a tonsillectomy and was said to have bled extensively. Her blood loss was not properly replaced, and she was pronounced brain dead. Cases of this type occur by the thousands every year. In addition to patients who die from inadequate treatment, thousands more suffer heart attacks and strokes, and many patients deprived of adequate brain oxygen ultimately develop dementia; lack of oxygen to the brain is well known to irreversibly damage and destroy brain cells.

The technique for accurate blood volume measurement has been known for more than 70 years, and an automated blood volume measurement instrument has been available for more than 10 years. The excuse for not performing blood volume measurement has always been that the test is too difficult to perform accurately and takes 4 to 8 hours. The BVA-100 has automated most of the procedure and enables a blood volume measurement to be completed in under an hour. Inability to obtain rapid blood volume measurements should no longer be an excuse for treating critically ill patients with blood volume derangements on the basis of tests which are known to be inaccurate and may result in irreversible damage, and even death, to patients.

Dr. Feldschuh will be attending the annual meeting of the Society of Critical Care Medicine in San Francisco, January 8 - 13, 2014. This is the main annual meeting of physicians specializing in intensive care medicine.

Contact Information:
Daxor Corporation:
Diane Meegan
212-330-8512
(Investor Relations)
[email protected]

Richard Dunn
212-330-8502
(Director of Operations)
[email protected]

