
Heart Failure Study From the Mayo Clinic Utilizing Daxor's BVA-100 Blood Volume Analyzer Published in Journal of the American College of Cardiology - Heart Failure

NEW YORK, NY -- (Marketwired) -- 05/30/14 -- Daxor Corporation (NYSE MKT: DXR) -- The Journal of the American College of Cardiology - Heart Failure has published the first study measuring congestive heart failure patients' blood volumes both at the initiation of treatment and just prior to discharge. The authors are Wayne L. Miller, MD, PhD, and Brian P. Mullan, MD. The Mayo Clinic's cardiology department is ranked #2 for 2013-2014 in U.S. News & World Report's annual survey of U.S. hospitals.

This study involved 26 Class III/IV cardiac patients. Such patients are usually treated without blood volume measurements, on the basis of clinical evaluations and tests such as hematocrit/hemoglobin measurements, which report the proportion of red cells in a blood sample but not the actual volume of a patient's blood. Retention of sodium and water, with the resulting expansion of the patient's blood volume, is the most fundamental derangement in congestive heart failure, yet treatment is usually guided by clinical evaluations and tests that do not measure blood volume.

The study was designed to quantitate total blood volume in patients hospitalized for decompensated heart failure and to determine the extent of volume overload and the magnitude and distribution of blood volume and body-water changes during diuretic therapy. Total blood volume analysis demonstrated a wide range in the extent of volume overload.

Twenty-four of the 26 patients were hypervolemic, with volumes ranging from +9.5% to +107% above normal. By measuring blood volume again at discharge, it was possible to compute whether the fluid removed came from the blood volume itself or from the interstitial fluid, the body fluid surrounding the cells. On average, 85% of the fluid removed came from the interstitial space, though there was wide variability across the patient population. The authors also noted that at discharge, despite vigorous therapy, most of the patients still had abnormally expanded blood volumes.
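The arithmetic behind these figures is straightforward: overload is the excess over the patient's ideal blood volume, and the interstitial contribution is whatever portion of the removed fluid is not accounted for by the drop in measured blood volume. A minimal sketch, using entirely hypothetical numbers (the study's per-patient data are not reproduced here):

```python
# Sketch of the volume-overload and fluid-partition arithmetic described
# above. All values are illustrative, not patient data from the study.

def percent_overload(measured_ml: float, ideal_ml: float) -> float:
    """Excess blood volume as a percent of the patient's ideal volume."""
    return (measured_ml - ideal_ml) / ideal_ml * 100


def fluid_partition(admit_bv_ml: float, discharge_bv_ml: float,
                    total_removed_ml: float) -> tuple[float, float]:
    """Split diuresed fluid into the part drawn from the blood volume
    itself and the remainder mobilized from the interstitial space."""
    from_blood = admit_bv_ml - discharge_bv_ml
    from_interstitium = total_removed_ml - from_blood
    return from_blood, from_interstitium


# Hypothetical patient: ideal volume 5.0 L, measured 7.0 L on admission.
print(percent_overload(7000, 5000))   # prints 40.0 (+40% hypervolemic)

# 4.0 L net fluid removed, but blood volume falls only 0.6 L:
blood, interstitial = fluid_partition(7000, 6400, 4000)
print(blood, interstitial)            # prints 600 3400
print(interstitial / 4000 * 100)      # prints 85.0 (% from interstitium)
```

In this hypothetical case, 85% of the removed fluid came from the interstitial space, matching the study's reported average; the paper's point is that this fraction varied widely from patient to patient.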

The authors noted that "the extent, composition, and distribution of volume overload are highly variable in decompensated congestive heart failure and this variability needs to be taken into account in the approach to individualized therapy." They further noted that "utilizing current methods, the accurate assessment and management of volume overload in patients with decompensated heart failure remains problematic."

Congestive heart failure patients have a 30% to 40% death rate within one year of being admitted to a hospital for heart failure. These patients are routinely treated with powerful diuretic drugs and with vasodilator medications to relax the blood vessels; kidney failure is a frequent complication in this group. This Mayo Clinic study was the first to document the wide range of variability among individual patients in the derangement of their total blood volume as well as their red cell volume.

The journal article was accompanied by an editorial by Dr. Stuart Katz, Helen L. and Martin S. Kimmel Professor of Advanced Cardiac Therapeutics and Director of the Heart Failure Program at New York University Medical Center. Dr. Katz was a senior author of one of the first papers from Columbia Presbyterian Medical Center documenting that treatment which brought a patient to a normal blood volume (euvolemia) markedly improved the chance of survival of heart failure patients. In a detailed editorial, Dr. Katz commented on the variability and heterogeneity of blood volume derangements in these patients and concluded: "Meanwhile, clinicians must recognize the limitations of physical assessment for the diagnosis of volume overload in heart failure patients, and should consider use of direct measurements of intravascular volume and/or intravascular pressures for better estimation of euvolemia as part of a therapeutic strategy to reduce the risk of adverse outcomes."

Heart failure patients constitute the greatest medical expense for hospitalized Medicare patients, and between 15% and 30% of them are readmitted to the hospital within 30 days. Medicare compensates hospitals on the basis of diagnosis-related groups (DRGs), meaning hospitals receive a fixed payment for a specific condition such as heart failure whether the patient is in the hospital for 3 days or 15 days, so there is a significant incentive to discharge patients as soon as possible. In response to the high readmission rate among heart failure patients, Medicare in 2013 instituted new guidelines that significantly penalize hospitals for each patient readmitted within 30 days. Dr. Joseph Feldschuh, a cardiologist and the President of Daxor, noted that utilizing blood volume measurement during hospitalization, and on an outpatient basis afterward, may reduce the need for repeat hospitalization by enabling more appropriate individualized therapy.

The article and the editorial were published online and will be available in the next hard copy issue of the journal.

Contact Information:
Daxor Corporation:
Richard Dunn
212-330-8502
Director of Operations
Email Contact

Diane Meegan
212-330-8512
Investor Relations
Email Contact
