
Consumer Watchdog Tells White House Team People Have Right To Control Data

Joins Other Public Interest Groups In Spelling Out Six Requirements For Big Data Policy

SANTA MONICA, Calif., April 1, 2014 /PRNewswire-USNewswire/ -- Consumer Watchdog has told the White House Team studying the Obama Administration's policy towards "Big Data" that "people must be able to know what information is gathered about them, how long it is kept and for what the information will be used."

Consumer Watchdog also joined 21 other public interest groups in a statement to the White House Big Data study group, headed by John Podesta, Senior Counselor to the President, and Nicole Wong, Deputy Chief Technology Officer, spelling out six key requirements for good Big Data policy.

"In the murky world of data brokers there is virtually no transparency," wrote John M. Simpson, Consumer Watchdog's Privacy Project director. "People don't know what digital dossiers have been assembled about them, what the data is used for or what decisions are being made about them without their knowledge."

Read Consumer Watchdog's comments here: http://www.consumerwatchdog.org/resources/whitehousebigdata033114.pdf

"We call on the Administration to introduce baseline privacy legislation and to implement the Consumer Privacy Bill of Rights," wrote Simpson. "You must protect a person's right to control whether data about him or her is collected and how it is used."

The comments from the 22-member coalition said that while Big Data can support commercial growth, government programs, and opportunities for innovation, it "creates new problems including pervasive surveillance; the collection, use, and retention of vast amounts of personal data; profiling and discrimination; and the very real risk that over time more decision-making about individuals will be automated, opaque, and unaccountable."

Here are the six requirements the 22 groups said must be included in the White House's final report on Big Data and the Future of Privacy:

TRANSPARENCY: Entities that collect personal information should be transparent about what information they collect, how they collect it, who will have access to it, and how it is intended to be used. Furthermore, the algorithms employed in Big Data should be made available to the public.

OVERSIGHT: Independent mechanisms should be put in place to assure the integrity of the data and the algorithms that analyze the data. These mechanisms should help ensure the accuracy and the fairness of the decision-making.

ACCOUNTABILITY: Entities that improperly use data or algorithms for profiling or discrimination should be held accountable. Individuals should have clear recourse to remedies to address unfair decisions about them using their data. They should be able to easily access and correct inaccurate information collected about them.

ROBUST PRIVACY TECHNIQUES: Techniques that help obtain the advantages of big data while minimizing privacy risks should be encouraged. But these techniques must be robust, scalable, provable, and practical. And solutions that may be many years into the future provide no practical benefit today.

MEANINGFUL EVALUATION: Entities that use big data should evaluate its usefulness on an ongoing basis and refrain from collecting and retaining data that is not necessary for its intended purpose. We have learned that the massive metadata program created by the NSA has played virtually no role in any significant terrorism investigation. We suspect this is true also for many other "Big Data" programs.

CONTROL: Individuals should be able to exercise control over the data they create or that is associated with them, and decide whether the data should be collected and how it should be used if collected.

Read the public interest groups' joint comments here: http://privacycoalition.org/Big.Data.Coalition.Ltr.pdf

The 22 groups that signed the joint statement are: Advocacy for Principled Action in Government, American Association of Law Libraries, American Library Association, Association of Research Libraries, Bill of Rights Defense Committee, Center for Digital Democracy, Center for Effective Government, Center for Media Justice, Consumer Action, Consumer Federation of America, Consumer Task Force for Automotive Issues, Consumer Watchdog, Council for Responsible Genetics, Electronic Privacy Information Center (EPIC), Foolproof Initiative, OpenTheGovernment.org, National Center for Transgender Equality, Patient Privacy Rights, PEN American Center, Privacy Journal, Privacy Rights Clearinghouse, Privacy Times, and Public Citizen, Inc.

Visit Consumer Watchdog's website at www.consumerwatchdog.org

SOURCE Consumer Watchdog

Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.