The Overtreatment Conundrum: Definiens Identifies Three Steps for Advancing Diagnostics to Reduce Unnecessary Cancer Treatment and Improve Patient Health

Definiens, the global leader in Tissue Phenomics™ for diagnostics development in oncology, today outlined three steps for developing diagnostics that help physicians make more informed decisions, reduce unnecessary treatment and improve overall patient health. As the global cost of cancer care continues to rise, tests based on data generated from the analysis of tissue samples can play a critical role in improving patient stratification and delivering effective, personalized care.

The overtreatment of cancer represents a significant and complex issue from both a health economics and a patient care perspective. Studies published in the New England Journal of Medicine report that “the annual direct costs of cancer care are projected to rise from $104 billion in 2006 to over $173 billion in 2020.” The problem is that the current clinical and pathological parameters for diagnosing many cancer types fail to accurately segment individual patients into specific risk groups. Patients with low- to intermediate-stage cancers often receive the same level of aggressive treatment administered to patients with advanced, late-stage cancers. The result? Much of this money is spent on unnecessary treatments, leaving many patients needlessly exposed to potentially harmful side effects and long-term health consequences.

“The pervasive problem of cancer overtreatment highlights the urgent need for tests that can improve stratification, accurately identifying the patient groups that require aggressive treatment while allowing clinicians to separate out low-risk patients who will benefit from active surveillance,” said Dr. Ralf Huss, Chief Medical Officer of Definiens. “Definiens Tissue Phenomics™ solutions provide researchers with the tools to extract, analyze and correlate all relevant data from heterogeneous tissue samples in order to develop innovative diagnostic tests that can inform better treatment decisions, lower healthcare costs and meet the needs of individual patients.”

Definiens’ three steps for the development of diagnostics that can reduce unnecessary cancer treatment and improve patient health include:

1. Recognize the Importance of Analyzing Intact Tissue

Over the past decade, advances in next-generation sequencing have proven essential for the molecular subgrouping of patients on the basis of genomic data. However, because gene mutations and gene expression correlate only partially with actual tumor behavior, genomic information alone provides only a first approximation of how a cancer will develop. By contrast, focusing on the data that resides in intact tissue samples, including the number, shape, size and morphology of different structures, and the cells and proteins active in and around the tumor, allows researchers to derive a more precise understanding of tumor biology and, as a result, to identify more effective biomarkers.
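Tissue-level readouts of the kind described above can be computed from a cell-segmentation mask. The sketch below, in Python with scikit-image, derives per-cell size and shape descriptors from a labeled mask; the mask, the specific descriptors and the function name are illustrative assumptions, not Definiens' actual feature set.

```python
# A minimal sketch: quantify per-cell morphology from a labeled
# segmentation mask (0 = background, each cell a unique positive label).
# The descriptors chosen here are illustrative, not a vendor feature set.
import numpy as np
from skimage import measure

def morphology_features(label_mask):
    """Return simple size/shape descriptors for every segmented cell."""
    features = []
    for region in measure.regionprops(label_mask):
        area = region.area
        perimeter = region.perimeter
        # Circularity is 1.0 for a perfect disc, lower for irregular shapes.
        circularity = 4.0 * np.pi * area / perimeter ** 2 if perimeter else 0.0
        features.append({
            "label": region.label,
            "area_px": area,
            "eccentricity": region.eccentricity,
            "circularity": circularity,
        })
    return features
```

Descriptors like these, aggregated per slide or per region, are the raw material that downstream correlation and stratification steps build on.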

2. Focus on the Entire Tumor Microenvironment

Among researchers, a focus on genomics has often led to a preoccupation with the tumor itself. However, recent advances in immunotherapy indicate that developing effective tests able to predict the likelihood of tumor recurrence and long-term treatment success requires image analysis of the entire tumor microenvironment, including an understanding of how cancer cells change and grow in relation to the surrounding cellular tissue. Looking at the larger context in which the tumor exists can better inform treatment decisions by providing invaluable insights into the probability of cancer progression.
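One concrete microenvironment readout is the spatial proximity of immune cells to tumor cells. The sketch below uses SciPy's cKDTree to compute nearest-tumor-cell distances for hypothetical cell centroids; the coordinates, the 50-pixel cutoff and the "inflamed vs. excluded" interpretation are illustrative assumptions, not a validated threshold.

```python
# A minimal sketch of one microenvironment metric: how close immune cells
# sit to tumor cells. Centroids would come from a segmentation step; the
# values and cutoff below are hypothetical.
import numpy as np
from scipy.spatial import cKDTree

def immune_tumor_distances(immune_xy, tumor_xy):
    """For each immune cell, distance (in pixels) to the nearest tumor cell."""
    tree = cKDTree(tumor_xy)            # spatial index over tumor centroids
    distances, _ = tree.query(immune_xy)
    return distances

# Toy example: a higher fraction of immune cells near tumor cells suggests
# an "inflamed" rather than "excluded" microenvironment.
rng = np.random.default_rng(0)
immune = rng.uniform(0, 1000, size=(200, 2))   # hypothetical centroids
tumor = rng.uniform(400, 600, size=(300, 2))
d = immune_tumor_distances(immune, tumor)
print(f"immune cells within 50 px of a tumor cell: {(d < 50).mean():.1%}")
```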

3. Take a Big Data Approach to Tissue Analysis

To be sure, the data that resides in tissue samples is both vast and heterogeneous; it is far too complex and voluminous for human analysis alone. Automated image analysis and data mining that can fully quantify and extract this information are critical for the development of tests with real prognostic value. In this regard, a big data approach that lets researchers correlate information from multiple sources, including complex tissue signatures and genomic profiles, can provide the comprehensive picture needed for superior personalized and targeted treatment decisions.
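As a rough illustration of that correlation step, the sketch below joins per-patient tissue features with genomic profiles and outcome labels, then cross-validates a simple risk classifier with scikit-learn. The file names, column names and model choice are illustrative assumptions, not Definiens' actual workflow.

```python
# A minimal sketch of correlating tissue-image features with genomic
# profiles for risk stratification. Inputs and columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

tissue = pd.read_csv("tissue_features.csv")    # e.g., cell counts, densities
genomic = pd.read_csv("genomic_profiles.csv")  # e.g., expression signatures
labels = pd.read_csv("outcomes.csv")           # e.g., recurrence within 5 yrs

# Join the heterogeneous sources on a shared patient identifier.
df = tissue.merge(genomic, on="patient_id").merge(labels, on="patient_id")
X = df.drop(columns=["patient_id", "recurrence"])
y = df["recurrence"]

# Cross-validated AUC is a first sanity check that the combined tissue +
# genomic features carry prognostic signal beyond chance.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"mean cross-validated AUC: {auc.mean():.2f}")
```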

About Definiens

Definiens is the global leader in Tissue Phenomics™ for discovery and diagnostics development in oncology and provides image analysis solutions for the life sciences. Definiens' technology generates detailed tissue biomarker readouts from slide images and enables the correlation of these readouts with other key clinical or genomic information, an approach known as Tissue Phenomics™. Definiens helps pharmaceutical and biotechnology companies, research institutions, clinical service organizations and pathologists generate new knowledge and support better decisions in research, diagnostics and therapy.

Definiens’ vision is to open new fields of research, to contribute to the development of personalized medicine, and to significantly improve the quality of patients' lives. Definiens is headquartered in Munich, Germany, and has its North American headquarters in Carlsbad, CA. Further information is available at www.definiens.com.
