University of Michigan Press Ebook Collection


The Future of Digital Surveillance: Why Digital Monitoring Will Never Lose Its Appeal in a World of Algorithm-Driven AI

Yong Jin Park
Overview

Are humans hard-wired to make good decisions about managing their privacy in an increasingly public world? Or are we helpless victims of surveillance through our use of invasive digital media? Exploring the chasm between the tyranny of surveillance and the ideal of privacy, this book traces the origins of personal data collection in digital technologies, including artificial intelligence (AI) embedded in social network sites, search engines, mobile apps, the web, and email. The Future of Digital Surveillance argues against a technologically deterministic view: digital technologies do not, by their nature, cause surveillance. Instead, the shaping of surveillance technologies is embedded in a complex interplay of individual psychology, institutional behavior, and policy principles.

Contents
  • Cover
  • Title Page
  • Copyright Page
  • Dedication
  • Contents
  • Preface
  • Acknowledgments
  • Part I: Introduction
    • Chapter 1. Putting Individual and Economic Determinants in Perspective
  • Part II: Fundamentals of Privacy and Surveillance
    • Chapter 2. A Perspective on Institutions
    • Chapter 3. A Perspective on Individuals
    • Chapter 4. A Perspective on Policy Principles and Regulation of Data Flow
  • Part III: Understanding the Future of AI and Its Challenge
    • Chapter 5. Ushering in the Era of Artificial Intelligence
  • Part IV: Conclusion
    • Chapter 6. Alternative Policy Principles, Options, and Recommendations
    • Chapter 7. The Future of Digital Surveillance
  • Appendix A. The Locus of Privacy Protection in the Marketplace
  • Appendix B. Empirical Evidence in Two Strands
  • Notes
  • References
  • Index
Published: 2021
Publisher: University of Michigan Press
ISBN(s)
  • 978-0-472-05484-8 (paper)
  • 978-0-472-12882-2 (ebook)
  • 978-0-472-07484-6 (hardcover)
Subject
  • Psychology
  • Sociology
  • Political Science: Political Economy

Resources

The book is accompanied by 21 figures (images), all created by Yong Jin Park, spanning Chapters 1 through 6. They are listed below with their descriptions.

Conceptual figure showing how the two forces of institutions and users, responding to new surveillance technologies, interact to produce the outcomes of surveillance and privacy.

Forces in Tension

From Chapter 1

Figure 1.1. Interactive Forces in Tension. Source: Modified from Neuman 1991.

How digital transformation creates the contrast between two models: the mass media model, in which the economic value of audiences resides in common tastes, and the broadband model, in which value resides in niche tastes, intensifying audience surveillance techniques.

Contrast between Broadband and Media

From Chapter 2

Figure 2.1. Digital Transformation of Business Model

Differences in audience measurement techniques: (1) mass media measurement carries margins of error among actual, measured, and imagined audiences, and (2) broadband measurement narrows those margins.

Gap among Measured, Actual, and Imagined Audiences

From Chapter 2

Figure 2.2. Digital Transformation of Audience Measurement. Note: Arrow denotes confidence range of audience-user measurement.

Market inefficiency illustrated by the contrast between personalization, which entails higher surveillance needs, and privacy protection, which entails lower personalization.

Market Paradox of Privacy and Surveillance

From Chapter 2

Figure 2.3. Efficiency Gap in Marketplace Performance. Note: Axes denote the degree of interaction between privacy and surveillance and the degree of personalization. Here the cutoff point between low and high is only conceptual, to illustrate the interaction.

Percentages of sampled websites having each privacy protection element under Notice and Choice, respectively.

Status of Individual Privacy Protection Items

From Chapter 2

Figure 2.4. Marketplace Performance of Privacy Protection: Notice and Choice. Note: Percentage of the sampled websites shown.

Illustration of how privacy protection stagnates over time.

Longitudinal Trend of Privacy Protection

From Chapter 2

Figure 2.5. Marketplace Performance of Privacy Protection Over Time. Note: Percentage of the sampled websites in each study shown.

How the demand for privacy and the production of protection fail to match over time.

Contrast between Concern and Protection

From Chapter 2

Figure 2.6. Marketplace Contrast between Privacy Consumption and Production. Note: For privacy protection, percentage of the sampled websites (Park 2008 and LaRose 2002) shown. See Appendix A for various sources for privacy concern.

Percentages of respondents answering each question correctly.

Surveillance Awareness and Policy Understanding

From Chapter 3

Figure 3.1. Distribution of Privacy Knowledge Items

Illustrates the power of knowledge in translating one's concern into privacy action.

Cognitive Power in Translating Concern into Action

From Chapter 3

Figure 3.2. Moderating Effects of Privacy Knowledge. Note: Privacy concern is informational, with the behavior indicating the level of privacy control in technical dimensions (Appendix B). *p < .05. Source: Knowledge Network data.
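In regression terms, a moderating effect of this kind is usually captured by an interaction term. As a generic formulation (not the book's reported model, whose exact specification is in Appendix B):

behavior = b0 + b1*(concern) + b2*(knowledge) + b3*(concern x knowledge) + error

A significant b3 indicates that the level of privacy knowledge changes how strongly concern translates into protective behavior, which is the moderation the figure depicts.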

A primary example of the privacy paradox: most Internet users settle into convenient inaction on protection while trading off personal data.

Convenient Inaction of Privacy Protection

From Chapter 3

Figure 3.3. Direct Effect of Willingness for Privacy Trade-Off. Note: Willingness is the measure of the extent to which an individual is willing to trade personal data for rewards, with the behavior denoting privacy control in the technical dimension (Appendix B).

How a person's social background influences their cognitive development related to privacy.

Socialization of Privacy Knowledge Acquisition

From Chapter 3

Figure 3.4. Acquisition of Privacy Knowledge according to Sociodemographic Variables. Note: For the KN sample, entries are odds ratios. Odds larger than 1 indicate a greater likelihood of correct responses. Covariates (yearly experience; number of online accesses; daily use) are not shown in the logistic regression. Policy understanding is the item asking about appropriation; surveillance awareness is the one asking about transfer. Solid lines are for surveillance awareness; dotted ones are for policy understanding. For the mobile sample, covariates (mobile familiarity; mobile access) are not shown in the multivariate regression. Entries are standardized coefficients. * p < .05; ** p < .01; ns = nonsignificant.
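For readers unfamiliar with the metric, a hypothetical illustration (the numbers are mine, not the book's): an odds ratio of 2.0 for education would mean each additional level of education doubles the odds of a correct response, where odds = p / (1 - p); for example, moving from p = 0.33 (odds of 0.5) to p = 0.5 (odds of 1.0).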

Comparative illustration of how the EU privacy policy orientation differs from the US approach.

Different Privacy Policy Regimes

From Chapter 4

Figure 4.1. Privacy Policy Orientation. Note: X = emphasis on well-being, as opposed to business interest, dwelling between business and citizens. Y = policy preference between market-driven and privacy-driven. Source: Modified from Dutton and Peltu 1996.

Mapping of how data-driven AI digital industries are vertically and horizontally concentrated in a small number of companies.

Mapping Data Concentration in Digital AI Transition

From Chapter 5

Figure 5.1. Personal Data Ecosystem in Vertical and Horizontal Concentration

Situational contexts in which AI and a person never achieve power equality in terms of data submission.

Prisoner's Dilemma of AI Data Submission

From Chapter 5

Figure 5.2. Data Submission (Person) and Use (AI) as a Prisoner’s Dilemma. Note: A = value for the digital platform, such as advertising, targeted ads, microcustomization, and marketing. B = value for a person, such as access and customized, automatic suggestions. CA = cost for the AI, i.e., the opportunity to microtarget a person. CB = cost for the person, i.e., AI-generated automation.
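A minimal sketch of the payoff logic, reconstructed from the note's symbols (my reading, not the book's figure): under mutual exchange,

U_person = B - CB        U_AI = A - CA

Because no submission amounts to no access (B is forfeited entirely), submitting remains the person's dominant strategy, and full data use remains the AI's, so the interaction settles on maximal submission and use even where mutual restraint would leave both sides better off, which is what makes it a prisoner's dilemma.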

Illustration of how physical biometric data, translated into data input, are then converted into positive or negative AI-based automated marketing.

AI Processing of Emotional Data in Automated Suggestions

From Chapter 5

Figure 5.3. Emotional Microtargeting in Automated Shopping

How political ad targets and waste, based on individual scores, are created.

AI Scoring of Political Data into Target and Waste

From Chapter 5

Figure 5.4. Political Microtargeting in Facebook–Cambridge Analytica

A contrast between the AI-based personalized targeting model and the mass media model in their respective metrics for measuring advertising effectiveness.

AI Political Effect vs. Mass Media Effect Model

From Chapter 5

Figure 5.5. AI-Based Facebook Effect Metrics versus Mass-Media Effect Model. Note: A dotted line indicates a potential connection. For instance, media buyers often engage in microtargeting as well as mass-media campaigns.

Illustrative sequence in which AI based on personal data can take two different routes of error: false positive and false negative. In the context of data marketing, AI is less concerned about false positives and thus churns out outputs that can turn out to be inaccurate. This is one of the technical reasons why privacy in this context serves as noise to the functioning of AI.

AI False Positive vs. False Negative Assessment

From Chapter 5

Figure 5.6. Sequence of AI Decision Error: Types I and II
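As an illustrative sketch of this asymmetry (my own toy example, not from the book; the scores, labels, and thresholds are invented), a scoring model shows how lowering the decision threshold trades false negatives for false positives:

# Toy targeting model: decide whom to show an ad, given model scores.
# Lowering the threshold cuts false negatives (missed buyers) but
# inflates false positives (non-buyers targeted). Since a wasted ad is
# cheap, marketing systems tolerate the false positives the figure
# describes.
scores = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]  # hypothetical model scores
truth  = [0,   0,   1,   0,   1,   1  ]  # 1 = user would actually buy

def error_counts(threshold):
    fp = sum(1 for s, t in zip(scores, truth) if s >= threshold and t == 0)
    fn = sum(1 for s, t in zip(scores, truth) if s < threshold and t == 1)
    return fp, fn

for th in (0.2, 0.5, 0.8):
    fp, fn = error_counts(th)
    print(f"threshold={th}: false positives={fp}, false negatives={fn}")

Run as written, the false-positive count falls (2, 1, 0) and the false-negative count rises (0, 1, 2) as the threshold climbs: the two error types move in opposition, and a marketer optimizing for reach naturally sits at the false-positive-heavy end.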

This shows how data submission, especially at the point of first entry, rewards a user. As no submission simply amounts to no access, the result is a steep binary pattern in which the second data submission accelerates the reward.

AI Reward Return Pattern in Data Submission

From Chapter 5

Figure 5.7. Logistic Growth Pattern in AI Return of Data
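The "steep binary pattern" described here matches a standard logistic curve. As an illustrative formulation (my notation, not the book's):

R(d) = R_max / (1 + exp(-k * (d - d0)))

where d is the cumulative data submitted, R(d) the reward returned, d0 the inflection point near the first entries, and k the steepness. R(0) is near zero, capturing "no submission, no access"; the curve rises sharply across the earliest submissions; and it saturates at R_max as further data yield diminishing additional reward.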

This illustrates the regulatory solution that this book proposes: instead of relying on one mode of action (i.e., government) over the other (the market), the book envisions a possibility in which different types of regulatory solutions emerge and complement the strengths and weaknesses of each measure, so that pragmatic rather than ideological solutions can be reached.

Regulatory Codes for Privacy over Surveillance

From Chapter 6

Figure 6.1. Regulatory Codes for Privacy over Surveillance. Source: Modified from Lessig 2009.
