
Risk-based Monitoring: Industry Guidance on Adoption, Use, and Outsourcing

DUBLIN, January 31, 2014 /PRNewswire/ --

Research and Markets (http://www.researchandmarkets.com/research/248gqk/riskbased) has announced the addition of the "Risk-based Monitoring: Industry Guidance on Adoption, Use, and Outsourcing" report to its offering.



ISR believes the rise of risk-based monitoring (RBM) is the result of several forces acting on the pharmaceutical industry. First, electronic data capture (EDC) technology is now the de facto standard for site/patient data capture. Second, the patent cliff and declining R&D productivity rates have conspired to force pharmaceutical companies to look for ways to cut costs while increasing efficiency. Third, regulators have begun to issue guidance documents centered on alternative drug development models and processes (risk-based monitoring, adaptive trials, electronic data as source data), making it less risky for sponsors to employ these methodologies and strategies.

This report offers the pharmaceutical and biotech industry, and its service providers, peer-based guidance on the adoption, use, and outsourcing of risk-based monitoring.

The report was developed by surveying industry professionals with direct risk-based monitoring (RBM) experience, and provides the pharmaceutical industry and its service providers with peer-based guidance to help address the regulatory and expertise challenges the industry currently faces.

"While we believe several industry factors are ultimately responsible for the continued momentum we've seen with risk-based approaches to clinical monitoring, respondents were very clear about the main driver for using RBM: reduced trial costs," explained Andrew Schafer, President of ISR. "In light of regulatory uncertainty and a general lack of expertise, we anticipate the use of risk-based monitoring will rise in the future as sponsors look around the industry for best practices and experts to assist them."

The report, which surveyed 78 industry professionals from sponsor organizations, aims to address that uncertainty with operational and strategic best practices. It offers recommendations, based on real-world lessons learned, for anticipating both internal and external regulatory and operational hurdles, and identifies the studies, study characteristics, and service providers seen as "best fit" for RBM activities.

Given the significant cost savings RBM is believed to offer, it is no surprise that awareness of and interest in risk-based monitoring is high. Sponsors are, or should be, doing their due diligence on the topic to progress down the RBM path. Successful RBM implementation requires internal subject matter experts across many different disciplines within an organization, and many sponsors are looking for CROs with experience not only in operationally executing RBM studies but also in contributing to the design and strategies surrounding them.
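In practice, risk-based monitoring typically means scoring trial sites on a handful of risk indicators and concentrating on-site monitoring effort where risk is highest. The sketch below is purely illustrative and is not drawn from the ISR report: the indicator names and weights are invented assumptions, chosen only to show the shape of the approach.

```python
# Hypothetical sketch of the core RBM idea: rank clinical trial sites by a
# weighted risk score so monitoring visits can be prioritized. The indicators
# (query_rate, protocol_deviations, data_entry_lag) and their weights are
# invented for illustration; real programs derive these from their own
# risk assessments.

# Assumed illustrative weights; each indicator is pre-normalized to [0, 1].
WEIGHTS = {"query_rate": 0.40, "protocol_deviations": 0.35, "data_entry_lag": 0.25}

def site_risk_score(indicators, weights=WEIGHTS):
    """Return the weighted sum of a site's normalized risk indicators."""
    return sum(weights[name] * indicators[name] for name in weights)

# Two hypothetical sites with made-up indicator values.
sites = {
    "Site A": {"query_rate": 0.10, "protocol_deviations": 0.05, "data_entry_lag": 0.20},
    "Site B": {"query_rate": 0.60, "protocol_deviations": 0.40, "data_entry_lag": 0.70},
}

# Rank sites from highest to lowest risk to focus monitoring effort.
ranked = sorted(sites, key=lambda s: site_risk_score(sites[s]), reverse=True)
```

A real implementation would also define thresholds that trigger targeted (rather than 100% on-site) source data verification, which is where the cost savings the report describes are expected to come from.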


For more information visit http://www.researchandmarkets.com/research/248gqk/riskbased


Media Contact: Laura Wood, +353-1-481-1716, [email protected]

SOURCE Research and Markets

