


Daxor to Explore Possible Partnership or Partial Sale of the Company

NEW YORK, NY -- (Marketwired) -- 06/19/14 -- Daxor Corporation (NYSE MKT: DXR) -- The Board of Directors of Daxor Corporation, at its June 18, 2014 meeting, renewed its annual stock buy-back program of up to 250,000 shares.

The Board of Directors also voted to explore options for partnering with a much larger company to facilitate acceptance of its unique medical services and technology. The focus will be on a large company with experience in distribution and marketing. The company intends to engage an investment banker to expedite this process. As part of this process, the Board of Directors is also willing to consider a partial sale of the company.

The company remains committed to its fundamental goal of making its blood volume analyzer a standard of care in medical and surgical conditions. Daxor's BVA-100 is the only semi-automated instrument approved by the FDA for direct blood volume measurement.
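For context, direct blood volume measurement is generally based on the indicator-dilution principle: a known amount of tracer is injected, its concentration is measured after it has mixed with the blood, and the volume it diluted into is computed from the ratio. The sketch below illustrates that arithmetic only; it is a simplified, hypothetical example and does not describe the BVA-100's actual tracer, sampling protocol, or corrections.

```python
# Simplified, hypothetical sketch of the indicator-dilution arithmetic that
# underlies direct blood volume measurement. It does not reproduce the
# BVA-100's actual protocol (tracer, timed samples, back-extrapolation).

def estimated_blood_volume_ml(tracer_dose_counts: float,
                              plasma_counts_per_ml: float,
                              hematocrit: float) -> float:
    """Estimate total blood volume from a plasma tracer dilution.

    tracer_dose_counts   -- total activity of the injected tracer
    plasma_counts_per_ml -- tracer activity per mL of plasma after mixing
    hematocrit           -- fraction of blood volume made up of red cells
    """
    plasma_volume_ml = tracer_dose_counts / plasma_counts_per_ml
    # Plasma is only the non-cellular fraction of blood, so scale up by
    # (1 - hematocrit) to approximate total blood volume.
    return plasma_volume_ml / (1.0 - hematocrit)

# Hypothetical numbers: 900,000 counts injected, 300 counts/mL measured,
# hematocrit 0.40 -> 3,000 mL plasma, ~5,000 mL total blood volume.
print(round(estimated_blood_volume_ml(900_000, 300.0, 0.40)))
```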

Accurate knowledge of a patient's blood volume is essential to determining who should receive a transfusion and who should not. For example, the President of Daxor, Dr. Joseph Feldschuh, fractured his hip on February 4, 2014 and underwent emergency surgery. He lost approximately 10 times as much blood as was estimated by the surgeon and anesthesiologist; the actual blood loss was documented by a subsequent blood volume measurement, which demonstrated the true extent of the severe loss. Fortunately, Dr. Feldschuh also had available a unit of his own frozen blood, collected four-and-a-half years earlier, which was administered shortly after the surgery when he developed symptoms due to the blood loss.

Blood loss is just one example of a situation in which precise knowledge of a patient's blood volume is essential yet unavailable in most hospitals in the United States. Congestive heart failure, the #1 expense for Medicare patients admitted to hospitals in the United States, is primarily a blood volume derangement. A Mayo Clinic study published in the Journal of the American College of Cardiology: Heart Failure documented that congestive heart failure patients require significantly different modes of treatment depending on the underlying blood volume derangement. At present, however, patients are not routinely treated on the basis of blood volume measurements. The death rate for congestive heart failure patients is 30 to 40% within one year of hospitalization for heart failure.

Renal dialysis, hypertension, and syncope (fainting) are other common conditions in which blood volume derangements are essential to understanding the patient's pathology. Patients are routinely treated for these conditions on the basis of surrogate tests that have been shown to be inaccurate.

A fundamental goal of the company is to make blood volume measurement a standard of care so that patients are optimally treated, particularly in conditions with high death rates and permanent, severe complications.

The company has also pioneered the use of frozen autologous (self-stored) blood. Such blood, stored at super-low temperatures, can be used for up to 10 years after collection. The American Medical Association has stated that "the only safe blood is one's own blood." Despite the obvious benefits of such a program, this service has minimal utilization.

Additional information and an analyst evaluation of Daxor are available on our website.


