Critique style guide: Improving crowdsourced design feedback with a natural language model

Markus Krause, Tom Garncarz, Jiao Jiao Song, Elizabeth M. Gerber, Brian P. Bailey, Steven P. Dow

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Designers are increasingly leveraging online crowds; yet, online contributors may lack the expertise, context, and sensitivity to provide effective critique. Rubrics help feedback providers but require domain experts to write them and may not generalize across design domains. This paper introduces and tests a novel semi-automated method to support feedback providers by analyzing feedback language. In our first study, 52 students from two design courses created design solutions and received feedback from 176 online providers. Instructors, students, and crowd contributors rated the helpfulness of each feedback response. From this data, an algorithm extracted a set of natural language features (e.g., specificity, sentiment) that correlated with the ratings. The features accurately predicted the ratings and remained stable across different raters and design solutions. Based on these features, we produced a critique style guide with feedback examples, automatically selected for each feature, to help providers revise their feedback through self-assessment. In a second study, we tested the validity of the guide through a between-subjects experiment (n=50). Providers wrote feedback on design solutions with or without the guide. Providers using our style-based guidance generated feedback with higher perceived helpfulness.
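The abstract describes a pipeline that maps language features of feedback text to predicted helpfulness ratings. As a rough illustration only, the minimal sketch below is not the authors' actual pipeline: the lexicons, feature proxies, and toy data are all hypothetical stand-ins for the features the paper derived from its 176-provider dataset. It extracts simple sentiment and specificity proxies and fits a linear regression against helpfulness ratings.

```python
# Illustrative sketch only (hypothetical features and data, not the paper's
# actual method): extract simple language features from feedback text and
# fit a linear model that predicts a helpfulness rating.
import re
import numpy as np
from sklearn.linear_model import LinearRegression

# Tiny hypothetical sentiment lexicons.
POSITIVE = {"good", "great", "clear", "nice", "effective"}
NEGATIVE = {"bad", "confusing", "cluttered", "weak", "unclear"}

def features(text: str) -> list[float]:
    words = re.findall(r"[a-z']+", text.lower())
    n = max(len(words), 1)
    # Lexicon-based sentiment: net positive word share.
    sentiment = (sum(w in POSITIVE for w in words)
                 - sum(w in NEGATIVE for w in words)) / n
    # Crude specificity proxy: share of long words plus presence of digits.
    specificity = sum(len(w) > 6 for w in words) / n + bool(re.search(r"\d", text))
    return [float(len(words)), sentiment, specificity]

# Toy training data: (feedback text, helpfulness rating on a 1-7 scale).
corpus = [
    ("Nice work.", 2.0),
    ("The contrast between the header and body text is weak; try a darker shade.", 6.0),
    ("Great poster, very clear layout and effective use of whitespace.", 5.0),
    ("Confusing.", 1.5),
]
X = np.array([features(t) for t, _ in corpus])
y = np.array([r for _, r in corpus])

model = LinearRegression().fit(X, y)
print(model.predict([features("The unclear navigation labels need concrete icons.")]))
```

In the paper itself, such a feature-to-rating model was validated across raters and design solutions before the features were turned into the style guide; this sketch only shows the general shape of that mapping.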

Original language: English (US)
Title of host publication: CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Explore, Innovate, Inspire
Publisher: Association for Computing Machinery
Pages: 4627-4639
Number of pages: 13
ISBN (Electronic): 9781450346559
DOIs
State: Published - May 2, 2017
Event: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States
Duration: May 6, 2017 - May 11, 2017

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2017-May

Other

Other: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
Country/Territory: United States
City: Denver
Period: 5/6/17 - 5/11/17

Keywords

  • Critique
  • Crowdsourcing
  • Design
  • Expertise
  • Feedback
  • Rubrics

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
