Toward a process model of explanation with implications for the type-token problem

John E. Hummel, David H. Landy, Derek Devnich

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


The ability to generate explanations plays a central role in human cognition. Generating explanations requires a deep conceptual understanding of the domain in question and tremendous flexibility in the way concepts are accessed and used. Together, these requirements constitute challenging design requirements for a model of explanation. We describe our progress toward providing such a model, based on the LISA model of analogical inference (Hummel & Holyoak, 1997, 2003). We augment LISA with a novel representation of causal relations, and with an ability to flexibly combine knowledge from multiple sources in LTM without falling victim to the type-token problem. We demonstrate how the resulting model can serve as a starting point for an explicit process model of explanation.

Original language: English (US)
Title of host publication: Naturally-Inspired Artificial Intelligence - Papers from the AAAI Fall Symposium, Technical Report
Number of pages: 8
State: Published - Dec 1 2008
Event: 2008 AAAI Fall Symposium - Arlington, VA, United States
Duration: Nov 7 2008 – Nov 9 2008

Publication series

Name: AAAI Fall Symposium - Technical Report


Other: 2008 AAAI Fall Symposium
Country/Territory: United States
City: Arlington, VA


Keywords

  • Analogy
  • Explanation
  • LISA

ASJC Scopus subject areas

  • Engineering (all)
