


Heart Failure Study From the Mayo Clinic Utilizing Daxor's BVA-100 Blood Volume Analyzer Published in Journal of the American College of Cardiology - Heart Failure

NEW YORK, NY -- (Marketwired) -- 05/30/14 -- Daxor Corporation (NYSE MKT: DXR) -- The Journal of the American College of Cardiology - Heart Failure has published the first study measuring congestive heart failure patients' blood volumes both at the initiation of treatment and just prior to discharge. The authors were Wayne L. Miller, MD, PhD, and Brian P. Mullan, MD. The Mayo Clinic's cardiology department is ranked #2 in U.S. News & World Report's annual survey of U.S. hospitals for 2013-2014.

This study involved 26 Class III/IV cardiac patients. Such patients are usually treated, without blood volume measurements, on the basis of clinical evaluations and tests such as hematocrit/hemoglobin, which measure the proportion of red cells in a patient's blood but not its actual volume. Retention of sodium and water and the resulting expansion of blood volume are the most fundamental derangements in congestive heart failure, yet treatment is usually guided by clinical evaluations and other tests that do not measure the patient's blood volume.
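To make the ratio-versus-volume distinction concrete, here is a minimal sketch of the indicator-dilution principle that underlies tracer-based blood volume analyzers such as the BVA-100. The tracer dose, concentrations, and patient values are invented for illustration; this is not Daxor's published protocol:

```python
# Illustrative only: why hematocrit (a ratio) cannot reveal total blood
# volume, and how indicator dilution recovers it. All numbers are invented.

def total_blood_volume_ml(tracer_dose_ug: float,
                          plasma_conc_ug_per_ml: float,
                          hematocrit: float) -> float:
    """Indicator-dilution estimate: a tracer confined to the plasma is
    diluted in proportion to plasma volume (volume = dose / concentration);
    dividing by the plasma fraction (1 - hematocrit) scales to whole blood."""
    plasma_volume_ml = tracer_dose_ug / plasma_conc_ug_per_ml
    return plasma_volume_ml / (1.0 - hematocrit)

# Two hypothetical patients with the SAME hematocrit of 0.40:
patient_a = total_blood_volume_ml(100.0, 0.0250, 0.40)  # tracer less diluted
patient_b = total_blood_volume_ml(100.0, 0.0125, 0.40)  # tracer more diluted

print(f"Patient A: {patient_a:,.0f} ml")  # ~6,667 ml (roughly normal)
print(f"Patient B: {patient_b:,.0f} ml")  # ~13,333 ml (severely hypervolemic)
# Identical hematocrits, a twofold difference in volume: the ratio alone
# cannot distinguish a normovolemic patient from a volume-overloaded one.
```

With identical hematocrits, the two hypothetical patients differ by a factor of two in total blood volume, which is precisely the information the ratio alone cannot supply.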

The study was designed to quantitate total blood volume in patients hospitalized for decompensated heart failure and to determine the extent of volume overload and the magnitude and distribution of blood volume and body-water changes during diuretic therapy. Total blood volume analysis demonstrated a wide range in the extent of volume overload.

Twenty-four of the 26 patients were hypervolemic, ranging from 9.5% to 107% above normal volume. By measuring blood volume again at discharge, it was possible to compute how much of the fluid removed came from the blood volume itself and how much from the interstitial fluid, the body fluid surrounding the cells. On average, 85% of the fluid removed came from the interstitial space, though there was wide variability across the patient population. The authors also noted that at discharge, despite vigorous therapy, most patients still had abnormally expanded blood volumes.
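As a back-of-the-envelope illustration of the computation described above, the following sketch uses invented numbers (not the study's data) to show how admission and discharge volume measurements, combined with net fluid loss, split the removed fluid between the intravascular and interstitial compartments:

```python
# Invented numbers, not the study's data: splitting the fluid removed during
# diuretic therapy between the blood volume and the interstitial space.

ideal_bv_ml = 5600.0          # patient-specific normal blood volume (hypothetical)
bv_admission_ml = 7500.0      # measured total blood volume at admission (hypothetical)
bv_discharge_ml = 7050.0      # measured total blood volume at discharge (hypothetical)
net_fluid_removed_ml = 3000.0 # net fluid loss over the hospital stay (hypothetical)

overload_pct = 100.0 * (bv_admission_ml - ideal_bv_ml) / ideal_bv_ml
print(f"Hypervolemia at admission: +{overload_pct:.1f}% above normal")  # +33.9%

intravascular_ml = bv_admission_ml - bv_discharge_ml        # from the blood itself
interstitial_ml = net_fluid_removed_ml - intravascular_ml   # refilled from tissue fluid

print(f"From blood volume:       {intravascular_ml:.0f} ml "
      f"({100 * intravascular_ml / net_fluid_removed_ml:.0f}%)")   # 450 ml (15%)
print(f"From interstitial space: {interstitial_ml:.0f} ml "
      f"({100 * interstitial_ml / net_fluid_removed_ml:.0f}%)")    # 2550 ml (85%)
# With these made-up numbers the split matches the study's reported 85%
# average; the paper's point is that individual patients varied widely.
```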

The authors noted, "The extent, composition, and distribution of volume overload are highly variable in decompensated congestive heart failure, and this variability needs to be taken into account in the approach to individualized therapy." They further noted that "utilizing current methods, the accurate assessment and management of volume overload in patients with decompensated heart failure remains problematic."

Congestive heart failure patients have a 30% to 40% death rate within one year of being admitted to a hospital for heart failure. These patients are routinely treated with powerful diuretic drugs and with vasodilator medications that relax the blood vessels, and kidney failure is a frequent complication in this group. This Mayo Clinic study was the first to document the wide variability among individual patients in the derangement of their total blood volume as well as their red cell volume.

The journal article was accompanied by an editorial by Dr. Stuart Katz, Helen L. and Martin S. Kimmel Professor of Advanced Cardiac Therapeutics and Director of the Heart Failure Program at New York University Langone Medical Center. Dr. Katz was a senior author of one of the first papers, from Columbia Presbyterian Medical Center, to document that treatment bringing a patient to normal blood volume (euvolemia) markedly improved heart failure patients' chance of survival. In a detailed editorial, Dr. Katz commented on the variability and heterogeneity of blood volume derangements in these patients and concluded, "Meanwhile, clinicians must recognize the limitations of physical assessment for the diagnosis of volume overload in heart failure patients, and should consider use of direct measurements of intravascular volume and/or intravascular pressures for better estimation of euvolemia as part of a therapeutic strategy to reduce the risk of adverse outcomes."

Heart failure patients constitute the greatest medical expense among hospitalized Medicare patients, and between 15% and 30% of them are readmitted within 30 days. Medicare compensates hospitals on the basis of diagnosis-related groups (DRGs), meaning a hospital receives a fixed payment for a given condition such as heart failure whether the patient is in the hospital for 3 days or 15 days, creating a significant incentive to discharge patients as soon as possible. In response to the high readmission rate among heart failure patients, Medicare in 2013 instituted new rules that penalize hospitals significantly for each patient readmitted within 30 days. Dr. Joseph Feldschuh, a cardiologist and the President of Daxor, noted that measuring blood volume during hospitalization, and afterward on an outpatient basis, may reduce the need for repeat hospitalization by enabling more appropriate individualized therapy.
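A toy model may help illustrate the payment incentive described above. The dollar amounts and the penalty rule below are illustrative assumptions, not actual Medicare (CMS) figures:

```python
# Toy model of the DRG incentive described above. Payment, cost, and penalty
# figures are illustrative assumptions, not actual Medicare (CMS) numbers.

DRG_PAYMENT = 10_000.0     # fixed payment per heart-failure admission (hypothetical)
DAILY_COST = 1_500.0       # hospital cost per inpatient day (hypothetical)
READMIT_PENALTY = 2_500.0  # penalty for a readmission within 30 days (hypothetical)

def hospital_margin(days_in_hospital: int, readmitted_within_30: bool) -> float:
    """Fixed DRG payment minus per-day costs, minus any readmission penalty."""
    margin = DRG_PAYMENT - DAILY_COST * days_in_hospital
    if readmitted_within_30:
        margin -= READMIT_PENALTY
    return margin

print(hospital_margin(3, False))   # 5500.0: early discharge looks attractive
print(hospital_margin(3, True))    # 3000.0: unless the patient is readmitted
print(hospital_margin(6, False))   # 1000.0: a longer, better-managed stay
# The penalty narrows the gap between a quick discharge that fails and a
# longer stay that achieves euvolemia, which is the incentive shift the
# readmission rules create.
```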

The article and the editorial were published online and will be available in the next hard copy issue of the journal.

Contact Information:
Daxor Corporation:
Richard Dunn
Director of Operations
Email Contact

Diane Meegan
Investor Relations
Email Contact


