21st Century National Pain Registry Could Change Culture and Practice of Pain Management

PHOENIX, March 11, 2014 /PRNewswire-USNewswire/ -- Based on early testing, the future looks bright for the creation of a technologically advanced national registry to collect data on the experience of pain sufferers and their responses to treatment, according to Stanford scientists who presented data at the 30th Annual Meeting of the American Academy of Pain Medicine (AAPM).

In a scientific session on Friday, Sean Mackey, MD, PhD, president of AAPM, explained how the Health Electronic Registry of Outcomes (HERO) system is designed to capture detailed, longitudinal patient-reported outcomes on physical, psychological and social health. The HERO system uses sophisticated algorithms to quickly assess a patient's condition and assign values to the individual contributors of pain. The system is built on the knowledge that an accurate pain assessment depends on many factors, including pain intensity, how much pain interferes with activities, physical function, fatigue, sleep and mood. The value to the pain practice and the pain patient is dramatic, said Dr. Mackey, who is chief of the Pain Medicine Division at Stanford University School of Medicine in Palo Alto, Calif. (http://snapl.stanford.edu).
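
To make the multidimensional assessment concrete, here is a minimal sketch of how a longitudinal, multi-domain patient-reported outcome record could be modeled. The domain names follow the factors listed above, but the 0-to-10 scales, field names and composite score are illustrative assumptions, not part of the actual HERO system.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: the domains follow the factors described above, but the
# 0-10 scales, field names and composite score are assumptions, not HERO's actual model.
@dataclass
class PainAssessment:
    patient_id: str
    assessed_on: date
    pain_intensity: float      # 0 (no pain) to 10 (worst imaginable)
    pain_interference: float   # how much pain interferes with daily activities, 0-10
    physical_function: float   # 0 (fully limited) to 10 (fully functional)
    fatigue: float             # 0-10
    sleep_disturbance: float   # 0-10
    mood: float                # 0 (very poor) to 10 (very good)

    def composite_burden(self) -> float:
        """Toy composite: mean of the burden domains, with function and mood inverted."""
        burden = [
            self.pain_intensity,
            self.pain_interference,
            self.fatigue,
            self.sleep_disturbance,
            10 - self.physical_function,
            10 - self.mood,
        ]
        return sum(burden) / len(burden)

# A longitudinal record is simply a time-ordered series of assessments per patient.
history = [
    PainAssessment("patient-001", date(2013, 1, 15), 7, 6, 3, 5, 6, 4),
    PainAssessment("patient-001", date(2013, 4, 20), 5, 4, 5, 4, 4, 6),
]
print([round(a.composite_burden(), 2) for a in history])  # [6.17, 4.33]
```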

"We expect this work will ultimately help clinicians target tailored treatments to a specific patient," Dr. Mackey said.  "And we hope this work will help persons suffering from pain get the right treatment that is safe and effective for them to ultimately improve their quality of life."  

The work on HERO began with both philanthropic funding (Redlich Pain Research Endowment) to Dr. Mackey and a grant from the National Institutes of Health (NIH) Pain Consortium to Dr. Mackey and the Stanford Center for Clinical Informatics (SCCI). The goal has been to develop an open-source health registry available on a national scale. "This is a perfect example of a public-private partnership that is working," Dr. Mackey said. "We are using both NIH resources as well as our own to develop a flexible system that will be freely available and usable for multiple pain and other medical conditions." In late 2012, the Stanford Pain Management Center rolled out HERO, which now holds data on approximately 3,200 unique patients with over 8,000 longitudinal assessments. Ming-Chih Kao, MD, PhD, assistant professor in the Stanford Pain Medicine Division, demonstrated several examples of how HERO has helped improve the quality of life of patients suffering from chronic pain and served as a platform for answering important pain research questions.

Improving patient outcome registries is one of the goals set by the Institute of Medicine (IOM) in its 2011 report Relieving Pain in America: A Blueprint for Transforming Prevention, Care, Education, and Research. The IOM report documented that more than 100 million Americans suffer chronic pain, at a cost in medical expenses and lost productivity of up to $635 billion a year. To address the IOM's requirements, Dr. Mackey said, HERO is able to:

  • Support assessment of patients and treatment decision making at the point of care
  • Provide for the aggregation of large numbers of patients
  • Enable the assessment of the safety and effectiveness of therapies
  • Create "learning systems" that would provide clinicians with ongoing information about treatment success or failure

The idea is to remove the technological barriers to collecting assessment and treatment data locally so that it can be rapidly aggregated and harmonized nationally, Dr. Mackey explained.
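
As a rough sketch of what "aggregated and harmonized nationally" could mean in practice, the example below maps differently named local fields from two hypothetical sites onto a shared set of outcome domains before pooling them. The site names, field mappings and domain names are assumptions for illustration, not the HERO data model.

```python
from statistics import mean

# Hypothetical sketch: two sites use different local field names; harmonization maps
# them onto shared domain names so records can be pooled nationally. All names are made up.
SITE_FIELD_MAPS = {
    "site_a": {"pain_nrs": "pain_intensity", "interference_score": "pain_interference"},
    "site_b": {"worst_pain_0_10": "pain_intensity", "activity_limit": "pain_interference"},
}

def harmonize(site: str, record: dict) -> dict:
    """Rename one site's local fields to the shared domain names, dropping unmapped fields."""
    mapping = SITE_FIELD_MAPS[site]
    return {shared: record[local] for local, shared in mapping.items() if local in record}

def aggregate(records: list) -> dict:
    """Pool harmonized records and report the mean value per shared domain."""
    domains = sorted({key for r in records for key in r})
    return {d: round(mean(r[d] for r in records if d in r), 2) for d in domains}

pooled = [
    harmonize("site_a", {"pain_nrs": 7, "interference_score": 6}),
    harmonize("site_b", {"worst_pain_0_10": 5, "activity_limit": 4}),
]
print(aggregate(pooled))  # {'pain_intensity': 6, 'pain_interference': 5}
```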

A frequent question is: why not use existing electronic medical records (EMRs)? The answer, Dr. Mackey said, is that current EMRs cannot handle the complexity and demands of modern patient-reported outcomes. "Limiting our infrastructure to static or traditional forms would be like building a word processor that uses only the Courier New font," he said.

A key component of HERO is the NIH-funded PROMIS system, which stands for Patient-Reported Outcomes Measurement Information System. The NIH has invested more than $100 million in developing this dynamic system of health-related measures since the project's inception in 2003, said Karon F. Cook, PhD, of the Feinberg School of Medicine, Northwestern University, Chicago, Ill., who also spoke to AAPM attendees.
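
For readers unfamiliar with PROMIS scoring: PROMIS measures are typically reported on a T-score metric, where 50 represents the mean of a reference population and 10 is one standard deviation. The snippet below shows that conventional conversion from an item-response-theory trait estimate (theta) to a T-score; the theta values are made up for illustration and are not drawn from the registry.

```python
# PROMIS-style T-score: 50 is the reference-population mean, 10 is one standard deviation.
# The theta values below are made-up examples, not real patient data.

def theta_to_tscore(theta: float) -> float:
    """Convert an IRT trait estimate (theta, in SD units) to the T-score metric."""
    return 50 + 10 * theta

for theta in (-1.0, 0.0, 0.75):
    print(f"theta={theta:+.2f} -> T-score={theta_to_tscore(theta):.1f}")
# theta=-1.00 -> T-score=40.0 (one SD below the reference mean)
# theta=+0.00 -> T-score=50.0 (at the reference mean)
# theta=+0.75 -> T-score=57.5 (three-quarters of an SD above)
```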

In addition to Stanford and the NIH, the project's other collaborators include the University of Florida, where testing is under way to integrate the new database with the current system of electronic medical records.

About AAPM
The American Academy of Pain Medicine is the premier medical association for pain physicians and their treatment teams, with over 2,500 members. Now in its 31st year of service, the Academy's mission is to optimize the health of patients in pain and eliminate pain as a major public health problem by advancing the practice and specialty of pain medicine through education, training, advocacy and research. Information is available on the Academy's website at www.painmed.org.

SOURCE American Academy of Pain Medicine
