
Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms

Boston College Law Review 55, no. 1 (2014): 93


Abstract

Big Data analytics have been widely publicized in recent years, with many in the business and science worlds focusing on how large datasets can offer new insights into previously intractable problems.[1] At the same time, Big Data poses new challenges for privacy advocates. Unlike previous computational models that exploited known sources of personally identifiable information (“PII”) directly, such as behavioral targeting,[2] Big Data has radically expanded the range of data that can be personally identifying.[3]


Introduction
  • Big Data analytics have been widely publicized in recent years, with many in the business and science worlds focusing on how large datasets can offer new insights into previously intractable problems.[1]
  • [54] Id. [55] Narges Bani Asadi, The Personalized Medicine Revolution Is Almost Here, VENTUREBEAT (Jan. 27, 2013, 12:23 PM), http://venturebeat.com/2013/01/27/the-personalized-medicine-revolutionis-almost-here/, archived at http://perma.cc/LJP4-ZL75; Press Release, Dep’t for Bus., Innovation, & Skills and Prime Minister’s Office, £30 Million Investment in Health Research Centre to Tackle Major Diseases (May 3, 2013), https://www.gov.uk/government/news/30-million-investment-inhealth-research-centre-to-tackle-major-diseases; see Terry, supra note 19, at 394 (“It will not be long until patient level information is combined with large existing data sets [that] will generate far more accurate predictive modeling, personalization of care, assessment of quality and value for many more conditions, and help providers better manage population health and risk-based reimbursement approaches.” (quoting Robert Kocher & Bryan Roberts, Meaningful Use of Health IT Stage 2: The Broader Meaning, HEALTH AFF.)).
Highlights
  • Big Data analytics have been widely publicized in recent years, with many in the business and science worlds focusing on how large datasets can offer new insights into previously intractable problems.
  • Unlike previous computational models that exploited known sources of personally identifiable information (“PII”) directly, such as behavioral targeting,[2] Big Data has radically expanded the range of data that can be personally identifying.[3]
  • As one noted health scholar emphasizes, the Health Insurance Portability and Accountability Act (HIPAA)/HITECH security and privacy standards for electronic health records apply to covered entities: health plans, health care clearinghouses, and health care providers; in contrast, it is unclear whether these regulations will apply to organizations that are not so characterized but that still receive personal health information from individuals or generate it through Big Data.[57]
  • Big Data presents many challenges for privacy, to which this Article posits a model of procedural data due process as a response
  • See Loi 2004-801 du 6 août 2004 relative à la protection des personnes physiques à l'égard des traitements de données à caractère personnel et modifiant la loi 78-17 du 6 janvier 1978 relative à l'informatique, aux fichiers et aux libertés (1) [Law 2004-801 of August 6, 2004 regarding the Protection of Individuals Regarding their Personal Data and modifying Law 78-17 relating to Data Processing, Files, and Freedoms], JOURNAL OFFICIEL DE LA RÉPUBLIQUE FRANÇAISE [J.O.] [OFFICIAL GAZETTE OF FRANCE], Aug. 6, 2004, p. 14063
Results
  • As one noted health scholar emphasizes, HIPAA/HITECH’s security and privacy standards for electronic health records apply to covered entities: health plans, health care clearinghouses, and health care providers; in contrast, it is unclear whether these regulations will apply to organizations that are not so characterized but that still receive personal health information from individuals or generate it through Big Data.[57] Even health information, one of the most highly protected types of personal information, will be increasingly vulnerable in the context of Big Data and predictive analytics.
  • Big Data’s ability to analyze large amounts of data may lead to predictive privacy harms for individuals targeted by law enforcement.
  • Courts of law and administrative proceedings may be well-suited for an analogous system regulating private use of Big Data to mitigate predictive privacy harms.
  • Property and other interests are implicated, especially as Big Data analytics are integrated into decisions concerning housing opportunities, employment, and credit provisioning.[100] Predictive privacy harms seem well-suited for due process protection in terms of the type of subject matter covered.
Conclusion
  • In addition to Supreme Court precedent, another valuable source for identifying elements of procedural due process is the seminal 1975 article, Some Kind of Hearing, by Judge Henry Friendly.[120] Similar to the balancing test in Mathews, Judge Friendly emphasizes that there is no specific checklist of required procedures.[121] Rather, the appropriate process should consider and select a set of potentially useful procedures based on the characteristics of the particular matter, such as the severity of the deprivation and the government interest at stake.[122] He notes that civil procedural due process has moved beyond regulatory areas such as disability and welfare.[123] For example, in 1959, in Greene v. McElroy, …
Study subjects and analysis
airline travelers: 1500
In this regard, the required level of procedural safeguards varies directly with the importance of the affected private interest, the need for that particular safeguard in the given circumstances, and its utility.[131] Furthermore, it varies inversely with the administrative burden and any other adverse consequences. As one commentator observes, “[e]very week, approximately 1,500 airline travelers reportedly are mislabeled as terrorists due to errors in the data-matching program known as the ‘No Fly’ list.” Citron, supra note 126, at 1256.

Reference
  • 2 See Elspeth A. Brotherton, Big Brother Gets a Makeover: Behavioral Targeting and the Third-Party Doctrine, 61 EMORY L.J. 555, 558 (2012) (describing “behavioral targeting” as “an online advertising technique designed to deliver specific, targeted advertisements to Internet users based on their perceived interests,” and observing that companies are able to do this “by using sophisticated technology that tracks and gathers information about users’ online activity”).
  • 3 See Ira S. Rubinstein, Big Data: The End of Privacy or a New Beginning?, 3 INT’L DATA PRIVACY L. 74, 75–76 (2013).
  • 4 See id. at 76–77, 82–83; Omer Tene & Jules Polonetsky, Privacy in the Age of Big Data: A Time for Big Decisions, 64 STAN. L. REV. ONLINE 63, 65–66 (2012), http://www.stanfordlaw review.org/online/privacy-paradox/big-data, archived at http://perma.cc/U6ZQ-PSK6.
  • 5 Charles Duhigg, Psst, You in Aisle 5, N.Y. TIMES, Feb. 19, 2012, § 6 (Magazine), at 30.
  • 14 See Omer Tene & Jules Polonetsky, Big Data for All: Privacy and User Control in the Age of Analytics, 11 NW. J. TECH. & INTELL. PROP. 239, 240 (2013) (defining Big Data to include personal data generated from a variety of sources).
  • 15 See danah boyd & Kate Crawford, Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon, 15 INFO. COMM. & SOC’Y 662, 663 (2012). Big Data raises numerous critical questions about all three of these uses. Id.
  • 21 See Jay Stanley, Eight Problems with “Big Data,” FREE FUTURE (Apr. 25, 2012, 3:06 PM), http://www.aclu.org/blog/technology-and-liberty/eight-problems-big-data, archived at http://perma.cc/RF4U-VF8A.
  • 22 See infra notes 23–40 and accompanying text (expanded scope); infra notes 41–74 and accompanying text (potential harms).
  • 23 See Sharona Hoffman & Andy Podgurski, In Sickness, Health, and Cyberspace: Protecting the Security of Electronic Private Health Information, 48 B.C. L. REV. 331, 347–50 (2007) (discussing various means used to compile personal health information, including hackers, foreign data processes, public records, and consumer purchase information); see also Nicolas P. Terry, Big Data Proxies and Health Privacy Exceptionalism 19–21 (Ind. Univ. Robert H. McKinney Sch. of Law Research Paper, Paper No. 2013-36, 2013), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2320088, archived at http://perma.cc/7LZT-CF2P (noting that this expansion in the health care industry has focused on making this information available to the individual, rather than to health care providers).
  • 24 See Report Finds Correlation Between Health Data Breaches, Fraud Cases, IHEALTHBEAT (Apr. 30, 2012), http://www.ihealthbeat.org/articles/2013/4/30/report-finds-correlation-betweenhealth-data-breaches-fraud-cases.aspx, archived at http://perma.cc/S4AM-ZAKH.
  • 29 See Terry, supra note 23, at 19–21. “Quantified Self” is a program that allows patients to track their activity and other health inputs such as heart rate and oxygen levels to improve their lifestyles. See Quantified Self: Self Knowledge Through Numbers, QUANTIFIED SELF, http://www.quantifiedself.com, archived at http://perma.cc/VUZ2-XBJN (last visited Jan. 15, 2014). Similarly, “PatientsLikeMe” allows those living with health conditions such as cancer and diabetes to share information about their treatment and symptoms in order to aggregate information and provide suggestions for possible steps. See About Us, PATIENTSLIKEME, http://www.patientslikeme.com/about, archived at http://perma.cc/E35S-8AYZ (last visited Jan. 15, 2014).
  • 30 Counting Every Moment, ECONOMIST (May 3, 2012), http://www.economist.com/node/21548493, archived at http://perma.cc/YW74-Z44P.
  • 31 See generally Paul Schwartz & Dan Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. REV. 1814 (2011) (arguing for a more flexible approach to PII that tracks the “risk of identification” along a spectrum and attributing the shortcomings of the current PII approach to the increase of PII generated by Big Data).
  • 37 Terry, supra note 19, at 391–92. Big Data’s ability to synthesize previously available information from various datasets similarly threatens privacy by enabling the “re-identification” of personal information or identities that have been stripped away. See generally Arvind Narayanan & Vitaly Shmatikov, Robust De-Anonymization of Large Sparse Datasets, 2008 IEEE SYMP. ON SEC. & PRIVACY 111 (demonstrating via an anonymous Netflix user database that users may be identified with as few as five personal attributes); Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. REV. 1701 (2010) (highlighting the ease of re-identifying anonymous datasets, discussing how this harms current privacy law, and suggesting solutions).
  • 38 Kamalika Chaudhuri & Daniel Hsu, Sample Complexity Bounds for Differentially Private Learning, 19 JMLR: WORKSHOP & CONF. PROC. 155, 155–56 (2011).
  • 41 See Fair Housing Act of 1968, 42 U.S.C. § 3604(c) (2006). The Fair Housing Act of 1968 prohibits the making, printing, or publication of any “notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Id. Further-
  • 43 See generally Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (noting that a roommate search service’s requirement for users to disclose sex, sexual orientation, and family status may be discriminatory and violative of the Fair Housing Act); Chi. Lawyers’ Comm. for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666 (7th Cir. 2008) (noting that the publication of housing advertisements prohibiting minorities and children may be discriminatory and violative of the Fair Housing Act).
  • 44 See generally, e.g., Hous. Opportunities Made Equal, Inc. v. Cincinnati Enquirer, Inc., 943 F.2d 644 (6th Cir. 1991) (considering evidence of advertisements that featured predominantly white models as discriminatory and liable under § 3604(c)); Ragin v. N.Y. Times Co., 923 F.2d 995 (2d Cir. 1991) (same); United States v. Hunter, 459 F.2d 205 (4th Cir. 1972) (considering evidence of newspaper advertisements that described rental as a “white home” as discriminatory and liable under § 3604(c)).
  • 45 See generally JOSEPH TUROW, THE DAILY YOU: HOW THE NEW ADVERTISING INDUSTRY IS DEFINING YOUR IDENTITY AND YOUR WORTH (2012) (discussing how media, including advertisements and entertainment, uses Big Data to acquire and create individual profiles for consumers).
  • 46 See generally Woodrow Hartzog & Evan Selinger, Big Data in Small Hands, 66 STAN. L. REV. ONLINE 81 (2013), http://www.stanfordlawreview.org/online/privacy-and-big-data/big-datasmall-hands, archived at http://perma.cc/6B9P-VGTL (highlighting Big Data’s power to generate highly detailed individual profiles with little social media information); Michael Kosinski et al., Private Traits and Attributes Are Predictable from Digital Records of Human Behavior, 110 PROC. NAT’L ACAD. SCI. 5802 (2013) (finding that “highly sensitive personal attributes” could be predicted with high degrees of success from “Facebook Likes”).
  • 47 See Equal Credit Opportunity Act (ECOA), 15 U.S.C. §§ 1691–1691f (2012).
  • 48 Michael Fertik, The Rich See a Different Internet Than the Poor, SCI. AM., (Feb. 18, 2013), http://www.scientificamerican.com/article.cfm?id=rich-see-different-internet-than-the-poor, archived at http://perma.cc/3PCW-GBDB (stating that if Big Data analysis indicates a poor credit record for the user, “you won’t even see a credit offer from leading lending institutions, and you won’t realize that loans are available to help you with your current personal or professional priorities”).
  • 51 See Schwartz & Solove, supra note 31, at 1841–45 (explaining how Big Data can transfer previously anonymous data into PII through re-identification).
  • 52 See Cynthia Dwork & Deirdre Mulligan, It’s Not Privacy, and It’s Not Fair, 66 STAN. L. REV. ONLINE 35, 36–38 (2013), http://www.stanfordlawreview.org/online/privacy-and-big-data/its-not-privacy-and-its-not-fair, archived at http://perma.cc/35X-E9XS; Ian Kerr & Jessica Earle, Prediction, Preemption, Presumption: How Big Data Threatens Big Picture Privacy, 66 STAN. L. REV. ONLINE 65, 69 (2013), http://www.stanfordlawreview.org/online/privacy-and-big-data/pre diction-preemption-presumption, archived at http://perma.cc/CXU5-54V2.
  • 57 Terry, supra note 19, at 386 (questioning the value of HIPAA/HITECH protections, which are “designed to keep unauthorized data aggregators out of our medical records,” when Big Data “allows the creation of surrogate profiles of our medical selves”); see Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules, 78 Fed. Reg. 5566 (Jan. 25, 2013) (to be codified at 45 C.F.R. pts. 160, 164). In addition to circumventing protected domains entirely, Big Data may also benefit from one of a number of carve-outs to traditionally protected HIPAA/HITECH domains. Terry, supra note 19, at 408. For example, the Big Data task of “running data analytics against a hospital’s [Electronic Medical Records] data” in order to “look[] for disease predictors” may be categorized as a “quality improvement under ‘health care operations,’” and therefore be exempt from regulation. Id.
  • 58 See Zach Friend, Predictive Policing: Using Technology to Reduce Crime, FBI L. ENFORCEMENT BULL. (Apr. 9, 2013), http://www.fbi.gov/stats-services/publications/law-enforce ment-bulletin/2013/April/predictive-policing-using-technology-to-reduce-crime, archived at http://perma.cc/D8GC-2EDC.
  • 62 See id.
  • 63 See id.
  • 64 David Gray & Danielle Citron, The Right to Quantitative Privacy, 98 MINN. L. REV. 62, 67 (2013).
  • 67 United States v. Jones, 132 S. Ct. 945, 956 (2012) (Sotomayor, J., concurring).
  • 69 Jordan Robertson, How Big Data Could Help Identify the Next Felon—or Blame the Wrong Guy, BLOOMBERG (Aug. 15, 2013, 12:01 AM), http://www.bloomberg.com/news/2013-08-14/how-big-data-could-help-identify-the-next-felon-or-blame-the-wrong-guy.html, archived at http://perma.cc/HEQ8-5PNT.
  • 71 See Second Chance Act of 2007, 42 U.S.C. § 17501 (Supp. V 2011) (seeking to reduce recidivism rates by providing employment, housing, and other assistance to non-violent criminal offenders).
  • 72 See Lahny R. Silva, Clean Slate: Expanding Expungements and Pardons for Non-Violent Federal Offenders, 79 U. CIN. L. REV. 155, 164–74, 198–99 (2011) (stating that criminal convictions bars individuals from various opportunities and that legislation designed to expunge their records may decrease recidivism).
  • 73 See, e.g., Jones, 132 S. Ct. at 956; United States v. Katzin, 732 F.3d 187, 193–94 (3d Cir. 2013) (considering whether GPS data may be included as evidence if authorities obtained the GPS without a warrant).
  • 75 For example, the Electronic Communications Privacy Act of 1986 (ECPA) prohibits the unauthorized collection of communications content;[76] the Fair Credit Reporting Act prohibits the use of financial records for certain purposes;[77] and the Video Privacy Protection Act of 1988 prohibits the disclosure of video rental records.[78]
  • 75 Daniel J. Solove, A Taxonomy of Privacy, 154 U. PA. L. REV. 477, 484–552 (2006) (detailing these three categories).
  • 76 See 18 U.S.C. §§ 2510–2522, 2701–2711, 3121–3127 (2012).
  • 77 15 U.S.C. §§ 1681–1681t (2012).
  • 78 18 U.S.C. § 2710 (2012), amended by Video Privacy Protection Act Amendments Act of 2012, Pub. L. No. 112-258, 126
  • 79 See Terry, supra note 14, at 257–63 (describing the difficulties in applying privacy law to Big Data); Schwartz & Solove, supra note 31, at 1845–47 (describing the difficulties of characterizing Big Data results as PII).
  • 80 Schwartz & Solove, supra note 31, at 1845–47.
  • 82 See ROBERT GELLMAN, FAIR INFORMATION PRACTICES: A BASIC HISTORY 1 (2013), http://bobgellman.com/rg-docs/rg-FIPShistory.pdf, archived at http://perma.cc/ZBY-5LDP.
  • 83 FED. TRADE COMM’N, PRIVACY ONLINE: A REPORT TO CONGRESS 7–11 (1998), available at http://www.ftc.gov/sites/default/files/documents/public_events/exploring-privacy-roundtable-series/priv-23a.pdf, archived at http://perma.cc/PC3T-XMQQ; see FED. TRADE COMM’N, PROTECTING CONSUMER PRIVACY IN AN ERA OF RAPID CHANGE 11 (2012), available at http://www.ftc.gov/os/2012/03/120326privacyreport.pdf, archived at http://perma.cc/DE92-6R73.
  • 87 See Dwork & Mulligan, supra note 52, at 36–38; Natasha Singer, Acxiom Lets Consumers See Data It Collects, N.Y. TIMES, Sept. 5, 2013, at B6. For example, one notorious data broker, Acxiom, now lets customers see and change the data it collects about them individually. Singer, supra. Acxiom, however, does not allow them to change the analytics it uses to assess the data for sale to marketers. Id. This is a sign that transparency and regulation of individual data collection is not likely to serve as an effective gatekeeping function for controlling privacy harms. See id.
  • 88 See, e.g., Meg L. Ambrose, It’s About Time: Privacy, Information Life Cycles, and the Right to Be Forgotten, 16 STAN. TECH. L. REV. 369, 385–87 (2013) (highlighting concerns about the right to be forgotten); Jeffrey Rosen, The Right to Be Forgotten, 64 STAN. L. REV. ONLINE 88, 90–92 (2012).
  • 89 Joint Anti-Fascist Refugee Comm. v. McGrath, 341 U.S. 123, 170 (1951) (Frankfurter, J., concurring).
  • 90 See Steve Lohr, Big Data, Trying to Build Better Workers, N.Y. TIMES, Apr. 21, 2013, at BU4 (illustrating how Big Data is used by employers to identify ideal traits in job applicants).
  • 91 Ryan C. Williams, The One and Only Substantive Due Process Clause, 120 YALE L.J. 408, 419–21 (2010).
  • 92 See Fredric M. Bloom, Information Lost and Found, 100 CALIF. L. REV. 636, 636 (2012).
  • 101 See Hamdi v. Rumsfeld, 542 U.S. 507, 533 (2004); Cleveland Bd. of Educ. v. Loudermill, 470 U.S. 532, 542 (1985) (“An essential principle of due process is that a deprivation of life, liberty, or property ‘be preceded by notice and opportunity for hearing appropriate to the nature of the case.’” (quoting Mullane v. Cent. Hanover Bank & Trust Co., 339 U.S. 306, 313 (1950))).
  • 102 Mathews v. Eldridge, 424 U.S. 319, 323–26, 333–35 (1976).
  • 120 Henry J. Friendly, Some Kind of Hearing, 123 U. PA. L. REV. 1267 passim (1975).
  • 126 Compare 360 U.S. at 507–08 (holding that governmental labeling of an organization as Communist and subversive without an opportunity to be heard was a due process violation), with Danielle K. Citron, Technological Due Process, 85 WASH. U. L. REV. 1249, 1256 (2008) (noting harms addressed by due process in Greene). The errors and mistaken assumptions that have been revealed about the Transportation Security Administration’s “No Fly” list, another product of Big Data, further support the need for data due process.[127]
  • 127 Citron, supra note 126, at 1256. Although some due process cases have held that reputational harms are more appropriate for the province of tort law, due process can apply when reputation leads to deprivation of liberty or property. Compare Paul v. Davis, 424 U.S. 693, 699–701 (1976) (denying due process claim over stigmatic harm related to future employment opportunities stemming from inclusion in a flyer of “active shoplifters”), with Wisconsin v. Constantineau, 400 U.S. 433, 436–37 (1971) (holding that a ban on distributing alcoholic drinks to persons whose names were “posted” as excessive drinkers was a deprivation of liberty because it altered or extinguished a distinct right previously recognized by state law).
  • 130 See Friendly, supra note 120, at 1277 (“A hearing in its very essence demands that he who is entitled to it shall have the right to support his allegations by argument however brief, and, if need be, by proof, however informal.” (quoting Londoner v. Denver, 210 U.S. 373, 386 (1908))). To support his view, Judge Friendly referenced such authority as Justice Felix Frankfurter’s concurring opinion in McGrath, that “even in the case of ‘a person in jeopardy of serious loss,’ that one must be given ‘notice of the case against him and opportunity to meet it.’” Id. (quoting 341 U.S. at 171–72 (Frankfurter, J., concurring)). Judge Friendly further relied upon English common law, which characterized due process as “a fair opportunity... for correcting or contradicting any relevant statement prejudicial to [one’s] view.” Id. (quoting Board of Educ. v. Rice, [1911] A.C. 179, 182).
  • 153 Martin H. Redish & Lawrence C. Marshall, Adjudicatory Independence and the Values of Procedural Due Process, 95 YALE L.J. 455, 474 (1986). Redish and Marshall stress that a procedural due process model should have flexible procedural mechanisms that maintain the due process clause’s long-existing values. Id.
  • 154 See id. at 474–75.
  • 162 See, e.g., Declan Butler, When Google Got Flu Wrong, 494 NATURE 155, 155–56 (2013) (explaining Google Flu Trends’ faulty overestimation when it used Big Data to determine peak flu levels and the effects of mistaken reliance on its analysis); Kate Crawford, The Hidden Biases in Big Data, HARV. BUS. REV. (Apr. 1, 2013, 2:00 PM), http://blogs.hbr.org/cs/2013/04/the_hidden_biases_in_big_data.html, archived at http://perma.cc/B3U8-K67A (discussing Big Data’s signal problems in such scenarios as analysis of GPS data to detect potholes).
  • 163 See generally Kate Crawford & Catherine Lumby, Networks of Governance: Users, Platforms, and the Challenges of Networked Media Regulation, 1 INT’L J. TECH. POL’Y & LAW 270 (2013). By the term “governance,” we are referring primarily to networked or technological governance, which involves both governmental aspects as well as private and individual ones. Id.
  • 164 Robert J. MacCoun, Voice, Control, and Belonging: The Double-Edged Sword of Procedural Justice, 1 ANN. REV. L. & SOC. SCI. 171, 171–73 (2005) (highlighting “the ability to tell …”).
  • 166 See Crawford, supra note 162; Kate Crawford, Think Again: Big Data, FOREIGN POL’Y (May 9, 2013), http://www.foreignpolicy.com/articles/2013/05/09/think_again_big_data, archived at http://perma.cc/67SQ-5BXK.
  • 167 Nathan S. Chapman & Michael W. McConnell, Due Process as Separation of Powers, 121 YALE L.J. 1672, 1782–92 (2012).
  • 170 Fletcher v. Peck, 10 U.S. (6 Cranch) 87, 136 (1810); see Chapman & McConnell, supra note 167, at 1733 (emphasizing the importance of applying the laws equally to all people; suggesting that separation of powers serves as an effective vehicle to accomplish this; and using Chief Justice Marshall’s opinion to support this argument).
  • 171 See, e.g., Citron, supra note 126, at 1301–13; Richard H. Fallon, Jr., Some Confusion About Due Process, Judicial Review, and Constitutional Remedies, 93 COLUM. L. REV. 309, 311, 336–37 (1993); Jerry L. Mashaw, The Management Side of Due Process: Some Theoretical and Litigation Notes on the Assurance of Accuracy, Fairness, and Timeliness in the Adjudication of Social Welfare Claims, 59 CORNELL L. REV. 772, 815–16 (1974).
  • 193 See Friendly, supra note 120; Redish & Marshall, supra note 153; see also Chapman & McConnell, supra note 167, at 1774 (highlighting Supreme Court’s emphasis that “to comply with due process, statutes must either provide for the use of common law procedures or, if they do not, employ alternative procedures that the courts would regard as equivalently fair and appropriate”).
  • 194 See 18 U.S.C. §§ 2510–2522 (2012).
  • 195 See, e.g., Eric Markowitz, Meet a Start-Up With a Big Data Approach to Hiring, INC., http://www.inc.com/eric-markowitz/how-data-can-help-you-recruit-talented-engineers.html, archived at http://perma.cc/4HVV-39TJ (last updated Sept. 19, 2013) (describing the development of a “tech talent-sourcing platform” using Big Data); Joseph Walker, Meet the New Boss: Big Data, WALL ST. J. (Sept. 20, 2012, 11:16 AM), http://online.wsj.com/article/SB10000872396390443890304578006252019616768.html, archived at http://perma.cc/SR7X-UNXT (discussing companies’ use of Big Data for hiring decisions).
  • 196 See, e.g., Online Privacy Protection Act of 2003, CAL. BUS. & PROF. CODE §§ 22575– 22579 (West 2008) (amended 2013) (requiring commercial operators of online services to conspicuously post privacy policies informing users of what PII is collected and how it will be used);
  • 197 See Citron, supra note 126, at 1305–06 (discussing the need for audit trails for the government’s administrative technological systems to facilitate meaningful notice to individuals); Dwork & Mulligan, supra note 52, at 38–39 (suggesting that bias testing and transparency of Big Data analytics may mitigate privacy harms); see also Consultative Comm. of the Convention for the Prot. of Individuals with Regard to Automatic Processing of Pers. Data [ETS No. 108], Propositions of Modernisation, COUNCIL OF EUR. 4–5 (Dec. 18, 2012), http://www.coe.int/t/dghl/standardsetting/dataprotection/TPD_documents/T-PD(2012)04Rev4_E_Convention%20108%20 modernised%20version.pdf, archived at http://perma.cc/N2QZ-QV8F (discussing “Rights of the data subject”).
  • 198 See Press Release, Fed. Trade Comm’n, FTC Names Edward W. Felten as Agency’s Chief Technologist; Eileen Harrington as Executive Director (Nov. 4, 2010), http://www.ftc.gov/opa/2010/11/cted.shtm, archived at http://perma.cc/EEY2-TJHV. This would also address concerns about standing, in which a single plaintiff might not have sufficient evidence to show an individual concrete harm in a particular case without gathering evidence through the litigation discovery process.
  • 200 See Gray & Citron, supra note 64, at 118–19 (noting that the predictive analytic systems of New York City’s DAS and Palantir both provide mechanisms for outside review). On the question of remedies, with procedural safeguards, one can imagine several common remedies for curing a violation. In the judicial system, a specific proceeding or determination might be invalidated, thereby forcing the agency or adjudicator to revisit the determination using proper processes. In the privacy context, there is some precedent for this in France. See Loi 2004-801 du 6 août 2004 relative à la protection des personnes physiques à l'égard des traitements de données à caractère personnel et modifiant la loi 78-17 du 6 janvier 1978 relative à l'informatique, aux fichiers et aux libertés (1) [Law 2004-801 of August 6, 2004 regarding the Protection of Individuals Regarding their Personal Data and modifying Law 78-17 relating to Data Processing, Files, and Freedoms], JOURNAL OFFICIEL DE LA RÉPUBLIQUE FRANÇAISE [J.O.] [OFFICIAL GAZETTE OF FRANCE], Aug. 6, 2004, p. 14063.
Author
Kate Crawford
Jason Schultz