By PR Newswire
April 1, 2014 03:03 PM EDT
SANTA MONICA, Calif., April 1, 2014 /PRNewswire-USNewswire/ -- Consumer Watchdog has told the White House Team studying the Obama Administration's policy towards "Big Data" that "people must be able to know what information is gathered about them, how long it is kept and for what the information will be used."
Consumer Watchdog also joined 21 other public interest groups in a statement to the White House Big Data study group, headed by John Podesta, Senior Counselor to the President, and Nicole Wong, Deputy Chief Technology Officer, spelling out six key requirements for good Big Data policy.
"In the murky world of data brokers there is virtually no transparency," wrote John M. Simpson, Consumer Watchdog's Privacy Project director. "People don't know what digital dossiers have been assembled about them, what the data is used for or what decisions are being made about them without their knowledge."
Read Consumer Watchdog's comments here: http://www.consumerwatchdog.org/resources/whitehousebigdata033114.pdf
"We call on the Administration to introduce baseline privacy legislation and to implement the Consumer Privacy Bill of Rights," wrote Simpson. "You must protect a person's right to control whether data about him or her is collected and how it is used."
The comments from the 22-member coalition said that while Big Data can support commercial growth, government programs, and opportunities for innovation, it "creates new problems including pervasive surveillance; the collection, use, and retention of vast amounts of personal data; profiling and discrimination; and the very real risk that over time more decision-making about individuals will be automated, opaque, and unaccountable."
Here are the six requirements the 22 groups said must be included in the White House's final report on Big Data and the Future of Privacy:
TRANSPARENCY: Entities that collect personal information should be transparent about what information they collect, how they collect it, who will have access to it, and how it is intended to be used. Furthermore, the algorithms employed in Big Data should be made available to the public.
OVERSIGHT: Independent mechanisms should be put in place to assure the integrity of the data and the algorithms that analyze the data. These mechanisms should help ensure the accuracy and the fairness of the decision-making.
ACCOUNTABILITY: Entities that improperly use data or algorithms for profiling or discrimination should be held accountable. Individuals should have clear recourse to remedies to address unfair decisions about them using their data. They should be able to easily access and correct inaccurate information collected about them.
ROBUST PRIVACY TECHNIQUES: Techniques that help obtain the advantages of Big Data while minimizing privacy risks should be encouraged. But these techniques must be robust, scalable, provable, and practical. Solutions that may be many years in the future provide no practical benefit today.
MEANINGFUL EVALUATION: Entities that use big data should evaluate its usefulness on an ongoing basis and refrain from collecting and retaining data that is not necessary for its intended purpose. We have learned that the massive metadata program created by the NSA has played virtually no role in any significant terrorism investigation. We suspect this is true also for many other "Big Data" programs.
CONTROL: Individuals should be able to exercise control over the data they create or that is associated with them, and decide whether the data should be collected and, if collected, how it should be used.
Read the public interest groups' joint comments here: http://privacycoalition.org/Big.Data.Coalition.Ltr.pdf
The 22 groups who signed the joint statement are: Advocacy for Principled Action in Government, American Association of Law Libraries, American Library Association, Association of Research Libraries, Bill of Rights Defense Committee, Center for Digital Democracy, Center for Effective Government, Center for Media Justice, Consumer Action, Consumer Federation of America, Consumer Task Force for Automotive Issues, Consumer Watchdog, Council for Responsible Genetics, Electronic Privacy Information Center (EPIC), Foolproof Initiative, OpenTheGovernment.org, National Center for Transgender Equality, Patient Privacy Rights, PEN American Center, Privacy Journal, Privacy Rights Clearinghouse, Privacy Times, and Public Citizen, Inc.
Visit Consumer Watchdog's website at www.consumerwatchdog.org
SOURCE Consumer Watchdog