McGeveran’s Law of Friction

William McGeveran [twitter:BillMcGev] has written an article for the University of Minnesota Law School that suggests how to make “frictionless sharing” well-behaved. He defines frictionless sharing as “disclosing individuals’ activities automatically, rather than waiting for them to authorize a particular disclosure.” For example:

…mainstream news websites, including the Washington Post, offer “social reading” applications (“apps”) in Facebook. After a one-time authorization, these apps send routine messages through Facebook to users’ friends identifying articles the users view.

Bill’s article considers the pros and cons:

Social media confers considerable advantages on individuals, their friends, and, of course, intermediaries like Spotify and Facebook. But many implementations of frictionless architecture have gone too far, potentially invading privacy and drowning useful information in a tide of meaningless spam.

Bill is not trying to build walls. “The key to online disclosures … turns out to be the correct amount of friction, not its elimination.” To assess what constitutes “the correct amount,” he offers a heuristic, which I am happy to call McGeveran’s Law of Friction: “It should not be easier to ‘share’ an action online than to do it.” (Bill does not suggest naming the law after him! He is a modest fellow.)

One of the problems with the unintentional sharing of information is “misclosures,” a term he attributes to Kelly Caine.

Frictionless sharing makes misclosures more likely because it removes practical obscurity on which people have implicitly relied when assessing the likely audience that would find out about their activities. In other words, frictionless sharing can wrench individuals’ actions from one context to another, undermining their privacy expectations in the process.

Not only does this reveal, say, that you’ve been watching Yoga for Health: Depression and Gastrointestinal Problems (to use an example from Sen. Franken that Bill cites), it reveals that fact to your most intimate friends and family. (In my case, the relevant example would be The Amazing Race, by far the worst TV I watch, but I only do it when I’m looking for background noise while doing something else. I swear!) Worse, says Bill, “preference falsification” — our desire to have our known preferences support our social image — can alter our tastes, leading to more conformity and less diversity in our media diets.

Bill points to other problems with making social sharing frictionless, including reducing the quality of information that scrolls past us, turning what could be a useful set of recommendations from friends into little more than spam: “…friends who choose to look at an article because I glanced at it for 15 seconds probably do not discover hidden gems as a result.”

Bill’s aim is to protect the value of intentionally shared information; he is not a hoarder. McGeveran’s Law thus tries to add enough friction that sharing is intentional, but not so much that it gets in the way of that intention. For example, he asks us to imagine Netflix presenting the user with two buttons: “Play” and “Play and Share.” Sharing would then require exactly as much work as playing, satisfying McGeveran’s Law. But having only a “Play” button that then automatically shares the fact that you just watched Dumb and Dumberer distinctly fails the Law because it does not “secure genuine consent.” As Bill points out, his Law of Friction is tied to the technology in use, and thus is flexible enough to be useful even as the technology and its user interfaces change.
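To make the contrast concrete, here is a minimal sketch of the two designs in TypeScript. Every name in it (WatchEvent, SharingChannel, onPlayClicked, and so on) is hypothetical; this is not Netflix’s or Facebook’s actual API, only an illustration that in the first design disclosure costs the viewer exactly one deliberate click, the same effort as playing, while in the second it happens with no per-action consent at all.

    // Hypothetical types for illustration only.
    type WatchEvent = { title: string; viewerId: string };

    interface SharingChannel {
      post(event: WatchEvent): void;
    }

    function startPlayback(event: WatchEvent): void {
      console.log(`Now playing: ${event.title}`);
    }

    // Design that satisfies the Law of Friction: sharing takes exactly one action,
    // the same amount of work as playing, and only happens when the user chooses it.
    function onPlayClicked(event: WatchEvent): void {
      startPlayback(event);
    }

    function onPlayAndShareClicked(event: WatchEvent, channel: SharingChannel): void {
      startPlayback(event);
      channel.post(event); // disclosure is deliberate: the user picked this button
    }

    // Design that fails the Law: a lone "Play" button that also discloses, relying on a
    // one-time authorization rather than consent for this particular viewing.
    function onPlayWithAutoShare(event: WatchEvent, channel: SharingChannel): void {
      startPlayback(event);
      channel.post(event); // the viewer never chose to share this title
    }

    // Example use of the consent-preserving design:
    const friendsFeed: SharingChannel = {
      post: (e) => console.log(`Shared with friends: ${e.title}`),
    };
    onPlayAndShareClicked({ title: "Dumb and Dumberer", viewerId: "dw" }, friendsFeed);

The point of the sketch is only that the two button handlers differ by a single line: the one extra call is the friction that makes the disclosure intentional.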

I like it.
