Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents

Berkeley Technology Law Journal (2013)

Abstract
Regulators here and abroad have embraced "privacy by design" as a critical element of their ongoing revision of current privacy laws. The underlying idea is to "build in" privacy—in the form of Fair Information Practices ("FIPs")—when creating software products and services. But FIPs are not self-executing. Rather, privacy by design requires the translation of FIPs into engineering and usability principles and practices. The best way to ensure that software reflects the broad goals of privacy as described in the FIPs and any related corporate privacy guidelines is to include them in the definition of software "requirements." A main component of making a specification or requirement for software design is to make it concrete, specific, and preferably associated with a metric. Equally important is developing software interfaces and other visual elements that are focused on end-user goals, needs, wants, and constraints. This Article offers the first comprehensive analysis of engineering and usability principles specifically relevant to privacy. Based on a review of the technical literature, it derives a small number of relevant principles and illustrates them by reference to ten recent privacy incidents involving Google and Facebook. Part I of this Article analyzes the prerequisites for undertaking a counterfactual analysis of these ten incidents. Part II presents a general review of the design principles relevant to privacy. Part III turns to ten case studies of Google and Facebook privacy incidents, relying on the principles identified in Part II to discover what went wrong and what the two companies might have done differently to avoid privacy violations and consumer harms. Part IV of the Article concludes by arguing that all ten privacy incidents might have been avoided by the application of the privacy engineering and usability principles identified herein. Further, we suggest that the main challenge to effective privacy by design is not the lack of design guidelines. Rather, it is that business concerns often compete with and overshadow privacy concerns. Hence the solution lies in providing firms with much clearer guidance about applicable design principles and how best to incorporate them into their software development processes. Regulators should provide greater guidance on how to balance privacy with business interests, along with appropriate oversight mechanisms.

© 2013 Ira S. Rubinstein & Nathaniel Good. † Adjunct Professor of Law and Senior Fellow, Information Law Institute, New York University School of Law. †† Principal, Good Research LLC. This Article was presented at the NYU Privacy Research Group and at the 2012 Privacy Law Scholars Conference, and we are grateful for the comments of workshop participants. Ron Lee, Paul Schwartz, and Tal Zarsky provided valuable suggestions on an earlier draft. Thanks are also due to Jeramie Scott and Mangesh Kulkarni for excellent research assistance and to Tim Huang for his help with citations. A grant from The Privacy Projects supported this work.
Keywords
usability, privacy, fair information practices, privacy by design