Information Ecosystems

Information, Power, and Consequences


How Should We Handle Personal Data, Privacy, and Leisure Time in the Information Age?

By: Jane Rohrer
On: October 25, 2019
In: Mario Khreiche
Tagged: Amazon, artificial intelligence, burnout, mechanization, Uber

On October 25th, the Seminar was led through a discussion of automation, AI, and the future of work by a fellow participant: Mario Khreiche, a recent Virginia Tech graduate and Information Ecosystems Sawyer Seminar postdoctoral fellow. Mario discussed his recent publication in Fast Capitalism, “The Twilight of Automation,” in which he theorizes about “the scope and rate whereby human labor will be replaced by machines” (117).

Throughout both this conversation on the 25th and his public talk the day before, Khreiche clarified that his approach is not a Luddite one. He was quick to point out, first, that AI and automation are far from recent concerns (historical perspective can do much to quiet our contemporary moments of panic) and, second, that reducing AI and automation to their flaws would be, well, reductionist. Anyone who has spent time formatting citations on a laptop, for example, can imagine how much slower and more painful the whole ordeal would be on a manual typewriter. Khreiche has done an excellent job, then, of illuminating necessary critiques of automation without ignoring its many perks.

Khreiche spent much of his time examining the “gig economy,” or “gigconomy,” in which temporary, part-time jobs are increasingly replacing the availability of lifelong careers. He specifically mentioned a part-time earner’s potential amalgamation of Uber, TaskRabbit, Amazon delivery, and Airbnb, a combination of gigs I have actually met several millennials juggling at the same time. For those of you asking what the big deal is: as Khreiche himself points out, “automation unfolds unevenly across socioeconomic domains,” increasing the precarity of part-time employees while also “stealthily increasing wealth and power for service providers” (117). In other words, an already-vulnerable part-time employee is unlikely to be doing anything beyond living paycheck to paycheck, all with the knowledge that their employer could happily drop them at any time; the service provider, on the other hand, is likely to keep getting richer and more powerful regardless of how they treat employees.

One of Khreiche’s most salient points in this vein is, I think, the often invisible “work” being done by those in the Information Age, and the sneaky ways this work is being “rebranded” as not-work. Again, this is in some ways nothing new within the scope of human history; there have almost always been those who benefit from the silent labor of society’s most vulnerable. But, as Khreiche pointed out both in his essay and during his talks, our current moment is uniquely positioned to blur the line between work and play, between leisure and non-leisure time, and it often threatens the basic subjectivity and free will of employees. He mentions that Uber, for example, might suggest to a driver, “you’re $10 away from making $330 in net earnings. Are you sure you want to go offline?” (118). When a driver is not contracted to work a set series of hours, what are the ethics of pushing them to, say, blow off a sports game with friends or arrive late to a daycare pickup in order to reach Uber’s suggested quota?

Furthermore, Khreiche suggests, there is an even less visible form of labor happening: Uber, for one, “rigorously collects data” on its drivers in order to optimize its own performance (118). There are other examples of this type of data collection: Google has been criticized for taking users’ data “whether they like it or not,” sparking debates about the ostensibly positive results of automation (restaurant recommendations, flight-delay notifications, and so on) versus the “deep digging” into your data required to produce them. And so it is worth saying, on the topic of invisible labor, that Google and others make massive amounts of money every day on the back of user data. This, along with the increasing frequency of crowdsourced advertisements, marketing campaigns, and even potato chip flavors, should perhaps prompt us to ask: shouldn’t I be getting a cut of the profits?

During Khreiche’s talks, I was reminded of Sarah T. Roberts’ recently published Behind the Screen: Content Moderation in the Shadows of Social Media. In it, Roberts describes the very human and very often upsetting, exploitative lives of social media’s moderators. When a Facebook user reports a video for explicit content or someone on Twitter flags a tweet for harassment, a human (a contracted worker who is underpaid and without healthcare) is pinged to evaluate and remove it, one by one. Roberts’ book illuminated, for me, that automation renders invisible not only itself, sneaking data away from us behind “Terms & Conditions” agreements, but also the human labor that is still, for now, filling in the gaps in tasks which AI cannot yet perform. This is another startling example of tech giants benefiting enormously from the largely unseen, free or cheap labor of other humans, with the embodied experiences of individual employees used as collateral in the pursuit of progress narratives. But there are, Khreiche reminds us, important positives and potentialities to keep in mind.

Near the end of our time last Friday, the Sawyer Seminar group turned to matters of preserving and restoring equity and privacy within this “Twilight” of automation. It was a good reminder that, as I’ve already stated, AI is not without its potentiality and benefits. There are, for example, global alternatives to Lyft and Uber whose smaller scales make attending to local needs and anomalies easier and exploiting drivers more difficult. We also discussed the practice (in the humanities, at least) of listing one’s “email hours” (times of day in which you can expect a response) in one’s email signature. I have recently put my “email hours” at the top of my syllabus, letting my students know that I won’t be answering that 2:00 am Sunday email until Monday during business hours. These small forms of resistance against the otherwise often insidious ways technology and automation can chip away at our free time, and our “free” labor, pose examples of how we can configure AI to work with us and for us. Our histories are full of examples of this very thing, from early AI methods preventing cholera outbreaks to massive improvements in radiology. Mario Khreiche’s talks were an excellent reminder that our futures, then, are not doomed or destined to fail simply because of automation; it is a matter, rather, of keeping a critical, equitable perspective.

