Multi-Head Self-Attention Generative Adversarial Networks for Multiphysics Topology Optimization

Corey M. Parrott, Diab W. Abueidda, Kai A. James

Research output: Contribution to journal › Article › peer-review

Abstract

Machine learning surrogates for topology optimization must generalize well to a wide variety of boundary conditions and volume fractions in order to serve as stand-alone models. However, when design performance is analyzed using physics-based analysis, many recently published methods suffer from low reliability, with a high percentage of the generated structures performing poorly. Disconnected regions of solid material between boundary conditions lead to unstable designs, with significant outliers skewing performance on test data. In this work, multi-head self-attention generative adversarial networks are introduced as a novel architecture for multiphysics topology optimization. The network contains multi-head attention mechanisms in high-dimensional feature spaces to learn the global dependencies of the data (i.e., connectivity between boundary conditions). The model is demonstrated on the design of coupled thermoelastic structures, and its performance is evaluated with respect to the physics-based objective function used to generate the training data. The proposed network achieves more than a 36-fold reduction in mean objective function error and an eight-fold reduction in volume fraction error compared to a baseline approach without attention mechanisms.
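
As a rough illustration of the mechanism described in the abstract, the sketch below shows one common way to apply multi-head self-attention to the spatial positions of a convolutional feature map so that distant regions of the design domain (e.g., separated boundary conditions) can exchange information. This is not the authors' implementation; the layer sizes, head count, SAGAN-style learnable residual weight, and class/variable names are all illustrative assumptions, written in PyTorch.

# Minimal sketch (assumptions, not the paper's code): multi-head self-attention
# over the spatial positions of a 2D feature map, as could be inserted into a
# convolutional GAN generator or discriminator.
import torch
import torch.nn as nn

class MultiHeadSelfAttention2d(nn.Module):
    """Attends across all spatial positions of a (B, C, H, W) feature map."""
    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # embed_dim must be divisible by num_heads
        self.attn = nn.MultiheadAttention(embed_dim=channels,
                                          num_heads=num_heads,
                                          batch_first=True)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        seq = x.flatten(2).transpose(1, 2)       # (B, H*W, C): each position is a token
        out, _ = self.attn(seq, seq, seq)        # global dependencies across all positions
        out = out.transpose(1, 2).reshape(b, c, h, w)
        return x + self.gamma * out              # residual connection

# Hypothetical usage at a high-dimensional feature stage of a generator:
feats = torch.randn(8, 128, 16, 16)              # illustrative batch of feature maps
attended = MultiHeadSelfAttention2d(channels=128, num_heads=4)(feats)
print(attended.shape)                            # torch.Size([8, 128, 16, 16])

The residual form with a zero-initialized weight lets the network start from a purely convolutional (local) mapping and gradually learn how much global attention to mix in; this is one plausible design choice, not necessarily the one used in the paper.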

Original language: English (US)
Pages (from-to): 726-738
Number of pages: 13
Journal: AIAA Journal
Volume: 61
Issue number: 2
DOIs
State: Published - Feb 2023

ASJC Scopus subject areas

  • Aerospace Engineering
