An EPIC Talk with MADELEINE CLARE ELISH, Data & Society
Approximately 50 minutes
Breathless rhetoric about AI has promised safer, more accurate systems that would take the “human out of the loop.” With more nuanced visions of AI, not to mention some high-profile catastrophes, the prevailing rhetoric now promises that keeping a “human in the loop” at key places will ensure effective oversight. But neither model accurately captures agency, that is, the complex dynamics of cooperation and control in human-AI systems, or who should be held accountable when something goes wrong.
In this talk Madeleine will outline the distributed nature of agency in sociotechnical systems and present the concept of a “moral crumple zone” to describe how responsibility for an action may be misattributed to a human actor who actually had limited control over the behavior of the system. Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system...
MADELEINE CLARE ELISH
Data & Society Research Institute
The widespread deployment of machine learning tools within healthcare is on the horizon. However, the hype around “AI” tends to divert attention toward the spectacular, and away from the more mundane and ground-level aspects of new technologies that shape technological adoption and integration. This paper examines the development of a machine learning-driven sepsis risk detection tool in a hospital Emergency Department in order to interrogate the contingent and deeply contextual ways in which AI technologies are likely to be adopted in healthcare. In particular, the paper brings into focus the epistemological implications of introducing a machine learning-driven tool into a clinical setting by analyzing shifting categories of trust, evidence, and authority. The paper further explores the conditions of certainty in the disciplinary contexts of data science and ethnography, and offers a potential reframing of the work of doing data science and machine learning as “computational...