New Business Coalition For Implementation & Reform Of The Death Master File Announced

WASHINGTON, March 5, 2014 /PRNewswire-USNewswire/ -- The creation of a new coalition to address the implementation of a new law governing access to and use of the Social Security Death Master File (DMF) was announced yesterday. The Coalition for Implementation and Reform of the Death Master File will be composed primarily of financial services companies, including insurance companies, annuity companies, pension funds, credit rating agencies, charitable organizations and others that use the DMF in the course of their business and in providing consumer financial services, as well as the third-party vendors that access the DMF on behalf of these legitimate users.

The Bipartisan Budget Act of 2013 (Pub. L. No. 113-67) directed the U.S. Department of Commerce to implement the new law governing access to and use of the DMF, which has existed since the 1930s and provides basic information about the deaths of American citizens, including Social Security number, name and dates of birth and death.

Responding to abuses of the DMF that resulted in identity theft and fraud, Congress established a new program to limit such abuses by restricting the general public's immediate access to the DMF. Recognizing, however, that the DMF is used lawfully and extensively by the business, financial and charitable communities, and that it drives or protects trillions of dollars in the U.S. economy, the new law permits immediate access to the DMF for legitimate users, who must be certified under a program established by the U.S. Department of Commerce.

The Department of Commerce has announced a public meeting on Tuesday, March 4, 2014, for interested parties to submit comments on the implementation and parameters of the certification program. The Department's notice includes a comprehensive list of suggested questions for interested parties to address in order to help the Department implement the new law.

The Coalition was established to efficiently and effectively communicate and advocate the common interests of legitimate users of the DMF in order to ensure their continued access to quality information from the DMF. Recognizing the benefits of speaking in a unified and coordinated voice, the Coalition has released its draft Statement of Principles as follows:

The Coalition seeks to work effectively with Congress and the U.S. Department of Commerce to effectuate the new law governing access to the Death Master File (DMF), Section 203 of Title II of the Bipartisan Budget Act of 2013, in order to:

  • ensure continuity of access to the DMF to certified users during the process of developing and adopting regulations;
  • specifically define certified users in the regulations to include all those who meet the requirements for such designation, including third-party vendors servicing legitimate users;
  • ensure protection of DMF data against identity theft and fraud;
  • ensure reasonable operational criteria for certified users; and
  • ensure the highest quality of DMF data for certified users.

The Coalition will be managed by the American Continental Group (ACG), a leading bipartisan public policy advocacy firm in Washington, D.C., that has helped businesses, trade associations, public institutions and nonprofit organizations navigate the federal legislative and regulatory processes for 20 years. ACG has been engaged for the past two and a half years in Congress' consideration of legislation to restrict access to the Death Master File, including assisting in the development of needed legislative history when the DMF provisions were prematurely added to the Bipartisan Budget Act of 2013.

For more information regarding ACG, please visit its website at http://www.acg-consultants.com/. To join the Coalition or to obtain more information regarding the Coalition's activities, please contact Brian Fitzgerald at ACG at [email protected] or (202) 327-8107.

SOURCE The Coalition for Implementation and Reform of the Death Master File
