Adaptive Perspectives on Human-Technology Interaction: Methods and Models for Cognitive Engineering and Human-Computer Interaction

Alex Kirlik (Editor)

Research output: Book/Report/Conference proceeding › Book

Abstract

In everyday life, and particularly in the modern workplace, information technology and automation increasingly mediate, augment, and sometimes even interfere with how humans interact with their environment. Understanding and supporting cognition in human-technology interaction is a problem of both practical and social relevance. The chapters in this volume frame this problem in adaptive terms: How are behavior and cognition adapted, or perhaps ill-adapted, to the demands and opportunities of an environment where interaction is mediated by tools and technology? The text draws heavily on the work of Egon Brunswik, a pioneer in ecological and cognitive psychology, as well as on modern refinements and extensions of Brunswikian ideas, including Hammond's Social Judgment Theory, Gigerenzer's Ecological Rationality, and Anderson's Rational Analysis. Inspired by Brunswik's view of cognition as "coming to terms" with the "causal texture" of the external world, the chapters here provide quantitative and computational models and measures for studying how people come to terms with an increasingly technological ecology, and offer insights for supporting cognition and performance through design, training, and other interventions.

Original language: English (US)
Publisher: Oxford University Press
Number of pages: 336
ISBN (Electronic): 9780199847693
ISBN (Print): 9780195374827
DOIs
State: Published - May 2006

Keywords

  • Egon Brunswik
  • Gigerenzer
  • Hammond
  • Human-technology interaction
  • Information technology
  • Technological ecology

ASJC Scopus subject areas

  • General Psychology
