Post-human ways of seeing security: the semiotics of automatic visual security management

Tampere University

New weapons and surveillance systems increasingly rely on machine vision to automatically detect threat and enmity. Yet, even though these technologies stir concerns in civil society over how automation affects the ethics and understanding of security and insecurity, machine vision has not been investigated in security studies. This project maps controversies over the automation of security among researchers, policymakers, and security professionals, and in civil society.

Background

The project develops "ways of seeing security" as a theoretically innovative concept, describing how the visual identification of security relies on surprising conceptualisations of both security and seeing. The introduction of automated visual security management (AVSM) systems aims to create objective data on an activity that was previously always subjective: seeing. When combined with the existential questions of security, machine vision provokes contestation in civil society and among researchers and policymakers. The drive to automate seeing can create new insecurities and raise ethical questions about the inhumanity of computer vision and about computers' decision-making in life-and-death situations.

Goal

While the development of smart visual security management capabilities is a hotly contested societal issue with far-reaching implications, security studies have yet to contribute to this debate by examining how machine vision identifies and conceptualises security, what the ethical stakes of automating such identification are, and how the deployment of AVSM reconfigures what security means for citizens. We therefore urgently need to explore how security is understood in automated visual security management systems. This project is the first systematic study of the ethics and conceptions of security materialised in AVSM systems.

Contact persons

Rune Saugmann Andersen

rune.saugmann [at] tuni.fi