Helen Nissenbaum is professor of Information Science at Cornell Tech. Her research takes an ethical perspective on policy, law, science, and engineering relating to information technology, computing, digital media, and data science. Topics have included privacy, trust, accountability, security, and values in technology design. Her books include "Obfuscation: A User's Guide for Privacy and Protest", with Finn Brunton (MIT Press, 2015), and "Privacy in Context: Technology, Policy, and the Integrity of Social Life" (Stanford, 2010). Grants from the NSF, AFOSR, and the U.S. DHHS-ONC have supported her work. Recipient of the 2014 Barwise Prize of the American Philosophical Association, Nissenbaum has contributed to privacy-enhancing software, including TrackMeNot and AdNauseam. Nissenbaum holds a Ph.D. in philosophy from Stanford University and a B.A. (Hons) in philosophy and mathematics from the University of the Witwatersrand, South Africa.
Talk: "Must Privacy Give Way to Use Regulation?"
Abstract: There is considerable support for deregulating information collection and regulating only its use. Proponents argue that ex ante constraints on collection are not only impossible to enforce but will also stifle the enormous potential of AI and big data. Imposing judicious constraints on use, they assert, is the answer. In this talk, Nissenbaum disputes this popular logic. While there is no denying the unprecedented challenges that big data and AI pose to traditionally conceived privacy regulation, giving up on collection regulation would weaken one of the cornerstones of a free society, with no guarantee that data so released would serve the general good.