
Heart Failure Study From the Mayo Clinic Utilizing Daxor's BVA-100 Blood Volume Analyzer Published in Journal of the American College of Cardiology - Heart Failure

NEW YORK, NY -- (Marketwired) -- 05/30/14 -- Daxor Corporation (NYSE MKT: DXR) -- The Journal of the American College of Cardiology - Heart Failure has published the first study to measure congestive heart failure patients' blood volumes both at the initiation of treatment and just prior to discharge. The authors were Wayne L. Miller, MD, PhD, and Brian P. Mullan, MD, of the Mayo Clinic, whose cardiology department is ranked #2 for 2013-2014 in U.S. News & World Report's annual survey of U.S. hospitals.

This study involved 26 Class III/IV cardiac patients. Such patients are usually treated without blood volume measurements, on the basis of clinical evaluations and tests such as hematocrit/hemoglobin, which measure the concentration of red cells in a blood sample but not the actual volume of a patient's blood. Retention of sodium and water and the resulting expansion of blood volume are the most fundamental derangements in congestive heart failure, yet treatment is usually guided by evaluations and tests that do not measure the patient's blood volume.

The study was designed to quantify total blood volume in patients hospitalized for decompensated heart failure and to determine the extent of volume overload and the magnitude and distribution of blood volume and water changes during diuretic therapy. Total blood volume analysis demonstrated a wide range in the extent of volume overload.

Twenty-four of the 26 patients were hypervolemic, ranging from +9.5% to +107% above normal volume. By also measuring blood volume at discharge, it was possible to compute how much of the removed fluid came from the blood volume itself and how much came from the interstitial fluid, the body fluid surrounding the cells. On average, 85% of the removed fluid came from the interstitial space, though there was wide variability within the patient population. The authors also noted that at discharge, despite vigorous therapy, most of the patients still had abnormally expanded blood volumes.
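The arithmetic behind that partition is straightforward: whatever portion of the removed fluid is not accounted for by the drop in measured blood volume must have come from the interstitial space. The sketch below illustrates the idea with hypothetical numbers chosen for illustration only; it is not taken from the study's data or methods.

```python
# Illustrative sketch (hypothetical values, not from the Mayo Clinic study):
# partitioning fluid removed by diuresis between the blood volume itself
# and the interstitial space.

def fluid_partition(bv_admit_l, bv_discharge_l, total_removed_l):
    """Return (blood-volume loss, interstitial loss, interstitial fraction), all in liters."""
    bv_loss = bv_admit_l - bv_discharge_l            # fluid lost from the blood volume itself
    interstitial_loss = total_removed_l - bv_loss    # the remainder came from the interstitial space
    return bv_loss, interstitial_loss, interstitial_loss / total_removed_l

# Hypothetical patient: 7.0 L blood volume at admission, 6.4 L at discharge,
# 4.0 L of total fluid removed during hospitalization.
bv_loss, isf_loss, frac = fluid_partition(7.0, 6.4, 4.0)
print(f"{frac:.0%} of the removed fluid came from the interstitial space")  # prints "85% ..."
```

With these example numbers, 0.6 L came from the blood volume and 3.4 L (85%) from the interstitial space, mirroring the average fraction reported in the study.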

The authors noted, "The extent, composition, and distribution of volume overload are highly variable in decompensated congestive heart failure, and this variability needs to be taken into account in the approach to individualized therapy." They further noted that "utilizing current methods, the accurate assessment and management of volume overload in patients with decompensated heart failure remains problematic."

Congestive heart failure patients have a 30% to 40% death rate within one year of being admitted to a hospital for heart failure. These patients are routinely treated with powerful diuretic drugs and with vasodilator medications to relax the blood vessels; kidney failure is a frequent complication in this group. This Mayo Clinic study was the first to document the wide patient-to-patient variability in the derangement of total blood volume as well as red cell volume.

The journal article was accompanied by an editorial by Dr. Stuart Katz, Helen L. and Martin S. Kimmel Professor of Advanced Cardiac Therapeutics and Director of the Heart Failure Program at NYU Langone Medical Center. Dr. Katz was one of the senior authors of one of the first papers from Columbia Presbyterian Medical Center documenting that treatment resulting in a normal blood volume (euvolemia) markedly improved heart failure patients' chances of survival. In a detailed editorial, Dr. Katz commented on the variability and heterogeneity of blood volume derangements in these patients, concluding, "Meanwhile, clinicians must recognize the limitations of physical assessment for the diagnosis of volume overload in heart failure patients, and should consider use of direct measurements of intravascular volume and/or intravascular pressures for better estimation of euvolemia as part of a therapeutic strategy to reduce the risk of adverse outcomes."

Heart failure patients constitute the greatest medical expense among hospitalized Medicare patients, and between 15% and 30% of them are readmitted to the hospital within 30 days. Medicare compensates hospitals on the basis of diagnosis-related groups (DRGs), meaning hospitals receive a fixed payment for a specific condition such as heart failure whether the patient is hospitalized for 3 days or 15 days, so there is a significant incentive to discharge patients as soon as possible. In response to the high readmission rate of heart failure patients, Medicare in 2013 instituted new guidelines that significantly penalize hospitals for each patient readmitted within 30 days. Dr. Joseph Feldschuh, a cardiologist and the President of Daxor, noted that utilizing blood volume measurement during hospitalization, and on an outpatient basis afterward, may reduce the need for repeat hospitalization by enabling more appropriate individualized therapy.

The article and the editorial were published online and will be available in the next hard copy issue of the journal.

Contact Information:
Daxor Corporation:
Richard Dunn
212-330-8502
Director of Operations
Email Contact

Diane Meegan
212-330-8512
Investor Relations
Email Contact

