
21st Century National Pain Registry Could Change Culture and Practice of Pain Management

PHOENIX, March 11, 2014 /PRNewswire-USNewswire/ -- Based on early testing, the future looks bright for the creation of a technologically advanced national registry to collect data on the experience of pain sufferers and their responses to treatment, according to Stanford scientists who presented data at the 30th Annual Meeting of the American Academy of Pain Medicine (AAPM).

In a scientific session on Friday, Sean Mackey, MD, PhD, president of AAPM, explained how the Health Electronic Registry of Outcomes (HERO) system is designed to capture detailed, longitudinal patient-reported outcomes on physical, psychological and social health. The HERO system uses sophisticated algorithms to quickly assess a patient's condition and assign values to the individual contributors of pain. The system is designed around the knowledge that many factors contribute to an accurate pain assessment, including pain intensity, how much pain interferes with activities, physical function, fatigue, sleep and mood. The value to the pain practice and pain patient is dramatic, said Dr. Mackey, who is chief of the Pain Medicine Division at Stanford University School of Medicine in Palo Alto, Calif. (http://snapl.stanford.edu).
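The multi-domain assessment described above can be pictured with a small sketch. The field names, 0-100 scale, and naive composite score below are invented for illustration; HERO's actual schema and algorithms are not described in this release:

```python
from dataclasses import dataclass, field
from datetime import date
from statistics import mean

@dataclass
class PainAssessment:
    """One longitudinal patient-reported outcome snapshot (hypothetical schema)."""
    patient_id: str
    assessed_on: date
    # Domain scores on an assumed common 0-100 scale (higher = worse),
    # covering the factors named in the release: intensity, interference,
    # function, fatigue, sleep, mood.
    domains: dict = field(default_factory=dict)

    def composite(self) -> float:
        """Naive summary: unweighted mean of domain scores (illustrative only)."""
        return mean(self.domains.values())

snapshot = PainAssessment(
    patient_id="P-0001",
    assessed_on=date(2014, 3, 11),
    domains={"intensity": 60, "interference": 55, "function": 40,
             "fatigue": 70, "sleep": 65, "mood": 50},
)
print(round(snapshot.composite(), 1))  # prints 56.7, the mean of the six scores
```

Storing each snapshot with a date is what makes the data longitudinal: repeated snapshots for one patient form a trajectory that can show response to treatment over time.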

"We expect this work will ultimately help clinicians target tailored treatments to a specific patient," Dr. Mackey said.  "And we hope this work will help persons suffering from pain get the right treatment that is safe and effective for them to ultimately improve their quality of life."  

The work on HERO began with both philanthropic funding (Redlich Pain Research Endowment) to Dr. Mackey and a grant from the National Institutes of Health (NIH) Pain Consortium to Dr. Mackey and the Stanford Center for Clinical Informatics (SCCI). The goal has been to develop an open-source health registry available on a national scale. "This is a perfect example of a public-private partnership that is working," Dr. Mackey said. "We are using both NIH resources as well as our own to develop a flexible system that will be freely available and usable for multiple pain and other medical conditions." In late 2012, the Stanford Pain Management Center rolled out HERO and now has data on approximately 3,200 unique patients with over 8,000 longitudinal assessments. Ming-Chih Kao, MD, PhD, assistant professor in the Stanford Pain Medicine Division, demonstrated several examples of how HERO has helped improve the quality of life for patients suffering from chronic pain and served as a platform for answering important pain research questions.

The need to improve patient outcome registries is one of the goals set by the Institute of Medicine (IOM) in its 2011 report Relieving Pain in America: A Blueprint for Transforming Prevention, Care, Education, and Research. The IOM report documented that more than 100 million Americans suffer chronic pain, at a cost in medical expenses and lost productivity of up to $635 billion a year. To address the IOM's requirements, Dr. Mackey said, HERO is able to:

  • Support assessment of patients and treatment decision making at the point of care
  • Provide for the aggregation of large numbers of patients
  • Enable the assessment of the safety and effectiveness of therapies
  • Create "learning systems" that would provide clinicians with ongoing information about treatment success or failure

The idea is to remove technological barriers to collecting assessment and treatment data locally that can be rapidly aggregated and harmonized nationally, Dr. Mackey explained.
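One way to picture the aggregation-and-harmonization step is mapping heterogeneous site records onto one shared schema. The site names, field names, and mappings below are invented for illustration; they are not HERO's actual data model:

```python
# Hypothetical sketch: harmonizing locally collected records onto a
# common schema so they can be aggregated nationally.

COMMON_FIELDS = ("patient_id", "pain_intensity", "sleep_quality")

# Each site may store the same concepts under different local keys.
SITE_FIELD_MAPS = {
    "site_a": {"pid": "patient_id", "nrs": "pain_intensity",
               "sleep": "sleep_quality"},
    "site_b": {"patient": "patient_id", "pain0to10": "pain_intensity",
               "slp": "sleep_quality"},
}

def harmonize(site: str, record: dict) -> dict:
    """Rename a site's local fields to the common schema, dropping extras."""
    mapping = SITE_FIELD_MAPS[site]
    renamed = {mapping[k]: v for k, v in record.items() if k in mapping}
    # Fill missing fields with None so every harmonized record has one shape.
    return {f: renamed.get(f) for f in COMMON_FIELDS}

rows = [
    harmonize("site_a", {"pid": "A-17", "nrs": 7, "sleep": 4}),
    harmonize("site_b", {"patient": "B-03", "pain0to10": 5, "slp": 6, "extra": 1}),
]
print(rows[1]["pain_intensity"])  # prints 5; site_b's "pain0to10" mapped over
```

Once every site's records share one shape, pooling them for national analysis becomes a simple concatenation rather than a per-site integration project.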

A frequent question is: why not use existing electronic medical records (EMRs)? The answer, Dr. Mackey said, is that current EMRs cannot handle the complexity and demands of modern patient-reported outcomes. "Limiting our infrastructure to static or traditional forms would be like building a word processor that uses only the Courier New font," he said.

A key component of HERO is the NIH-funded PROMIS system, which stands for Patient Reported Outcome Measurement Information System. The NIH has invested more than $100 million in developing the dynamic system of health-related measures since the project's inception in 2003, said Karon F. Cook, PhD, of the Feinberg School of Medicine, Northwestern University, Chicago, Ill., who also spoke to AAPM attendees.
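PROMIS measures can be administered via computerized adaptive testing, where each question is chosen based on the answers so far. A much-simplified sketch of that idea, using a two-parameter logistic item response model, is shown below; the item names and parameters are invented, not actual PROMIS items:

```python
import math

# Simplified computerized adaptive testing (CAT) sketch: choose the item
# most informative at the current trait estimate theta.
# Item parameters (discrimination a, difficulty b) are hypothetical.
ITEMS = {
    "sleep_q1": (1.2, -1.0),
    "sleep_q2": (1.8, 0.0),
    "sleep_q3": (0.9, 1.5),
}

def probability(theta: float, a: float, b: float) -> float:
    """2PL model: probability of endorsing the item at trait level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta: float, a: float, b: float) -> float:
    """Fisher information of a 2PL item at theta: a^2 * p * (1 - p)."""
    p = probability(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta: float, asked: set) -> str:
    """Pick the unasked item with maximum information at the current theta."""
    candidates = {k: v for k, v in ITEMS.items() if k not in asked}
    return max(candidates, key=lambda k: information(theta, *candidates[k]))

print(next_item(0.0, set()))  # prints sleep_q2: highest discrimination at theta=0
```

Because each question targets the respondent's current estimated level, adaptive administration can reach a precise score with far fewer items than a fixed-length questionnaire.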

In addition to Stanford and the NIH, the project's other collaborators include the University of Florida, where testing is under way to integrate the new database with the current system of electronic medical records.

About AAPM
The American Academy of Pain Medicine is the premier medical association for pain physicians and their treatment teams, with over 2,500 members. Now in its 31st year of service, the Academy's mission is to optimize the health of patients in pain and eliminate pain as a major public health problem by advancing the practice and specialty of pain medicine through education, training, advocacy and research. Information is available on the Academy's website at www.painmed.org.

SOURCE American Academy of Pain Medicine

