A parallel agent-based model of land use opinions

Wenwu Tang, David A. Bennett, Shaowen Wang

Research output: Contribution to journal › Article › peer-review

Abstract

In this article we present a parallel agent-based model (ABM) to support large-scale simulations of land use change. ABMs are a commonly used simulation approach for the investigation of land use systems. The computationally intense nature of these models, however, often prohibits the development of models that fully capture the complex dynamics of land use systems when using typical desktop computing environments. The search for scientific understanding and solutions to real-world problems is, therefore, often limited by an inability to explore a wide range of scales or the impact of complex interactions. Parallel computing provides a potential solution to this limitation. Our ABM is designed using parallel computing to simulate the formation of large-scale land use opinions within spatially explicit environments. Agents, environments, and interactions among agents are distributed among processors through parallel computing strategies, including spatial domain decomposition, ghost zones, and synchronization. We examine the computational performance of the model within a supercomputing environment. We demonstrate that, by leveraging increasingly available high-performance parallel computing resources, large-scale ABMs of land use systems can be developed and, ultimately, the underlying processes that drive these systems better understood.
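The decomposition strategies named in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation (which targets MPI on a supercomputer); it simulates the scheme serially, with a hypothetical majority-rule opinion update standing in for the model's actual agent behaviour. The grid of agents is split into row blocks (spatial domain decomposition), each block is padded with copies of its neighbours' boundary rows (ghost zones), and the padding is refreshed before every step (synchronization), so each block can update independently.

```python
import numpy as np

def decompose(grid, n_parts):
    """Spatial domain decomposition: split the agent grid into
    contiguous row blocks, one per (simulated) processor."""
    return np.array_split(grid, n_parts, axis=0)

def add_ghost_zones(blocks):
    """Pad each block with read-only copies of the neighbouring
    blocks' boundary rows; edge blocks replicate their own border."""
    padded = []
    for i, b in enumerate(blocks):
        top = blocks[i - 1][-1:] if i > 0 else b[:1]
        bot = blocks[i + 1][:1] if i < len(blocks) - 1 else b[-1:]
        padded.append(np.vstack([top, b, bot]))
    return padded

def step(block):
    """One (stand-in) opinion update: an agent adopts opinion 1 when
    at least half of its four von Neumann neighbours hold it.
    Periodic left-right boundary; ghost rows supply up/down neighbours."""
    core = block[1:-1]                       # interior rows only
    up, down = block[:-2], block[2:]
    left = np.roll(core, 1, axis=1)
    right = np.roll(core, -1, axis=1)
    return ((up + down + left + right) >= 2).astype(int)

def simulate(grid, n_parts, n_steps):
    blocks = decompose(grid, n_parts)
    for _ in range(n_steps):
        padded = add_ghost_zones(blocks)     # synchronization point
        blocks = [step(p) for p in padded]   # each block updates in isolation
    return np.vstack(blocks)
```

Because the ghost rows hold each neighbour's boundary state from the same time step, the decomposed run is bitwise identical to an undecomposed one; in a real MPI implementation the `add_ghost_zones` step becomes a boundary-row message exchange between adjacent processors.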

Original language: English (US)
Pages (from-to): 121-135
Number of pages: 15
Journal: Journal of Land Use Science
Volume: 6
Issue number: 2-3
DOIs
State: Published - Jun 2011

Keywords

  • Agent-based models
  • Opinion modeling
  • Parallel computing

ASJC Scopus subject areas

  • Geography, Planning and Development
  • Earth-Surface Processes
  • Management, Monitoring, Policy and Law

