Author granted license

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International

Document Type

Article

Publication Date

2013

Publisher

School of Law, Stanford University

Language

en-US

Abstract

"Big data" can be defined as a problem-solving philosophy that leverages massive data-sets and algorithmic analysis to extract "hidden information and surprising correlations." Not only does big data pose a threat to traditional notions of privacy, but it also compromises socially shared information. This point remains under appreciated because our so-called public disclosures are not nearly as public as courts and policymakers have argued — at least, not yet. That is subject to change once big data becomes user friendly.

Most social disclosures and details of our everyday lives are meant to be known only to a select group of people. Until now, technological constraints have favored that norm, limiting the circle of communication by imposing transaction costs — which can range from effort to money — onto prying eyes. Unfortunately, big data threatens to erode these structural protections, and the common law, which is the traditional legal regime for helping individuals seek redress for privacy harms, has some catching up to do.

To make our case that the legal community is under-theorizing the effect big data will have on an individual’s socialization and day-to-day activities, we will proceed in four steps. First, we explain why big data presents a bigger threat to social relationships than privacy advocates acknowledge, and construct a vivid hypothetical case that illustrates how democratized big data can turn seemingly harmless disclosures into potent privacy problems. Second, we argue that the harm democratized big data can inflict is exacerbated by decreasing privacy protections of a special kind — ever-diminishing "obscurity." Third, we show how central common law concepts might be threatened by eroding obscurity and the resulting difficulty individuals have gauging whether social disclosures in a big data context will sow the seeds of forthcoming injury. Finally, we suggest that one way to stop big data from causing big, un-redressed privacy problems is to update the common law with obscurity-sensitive considerations.
