From in the class or in the wild? Peers provide better design feedback than external crowds

Helen Wauck, Yu Chun Yen, Wai-Tat Fu, Elizabeth Gerber, Steven P. Dow, Brian P Bailey

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

As demand for design education increases, instructors are struggling to provide timely, personalized feedback on student projects. Gathering feedback from classroom peers and external crowds offers scalable approaches, but there is little evidence of how they compare. We report on a study in which students (n=127) created early- and late-stage prototypes as part of nine-week projects. At each stage, students received feedback from peers and from external crowds: their own social networks, online communities, and a task market. We measured the quality, quantity, and valence of the feedback and the actions taken on it, and categorized its content using a taxonomy of critique discourse. The study found that peer feedback was perceived as higher quality, acted upon more often, and longer than crowd feedback. However, crowd feedback was a viable supplement to peer feedback, and students preferred it for projects targeting specialized audiences. Feedback from all sources spanned only a subset of the critique categories. Instructors may fill this gap by further scaffolding feedback generation. The study contributes insights into how to best utilize different feedback sources in project-based courses.

Original language: English (US)
Title of host publication: CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Explore, Innovate, Inspire
Publisher: Association for Computing Machinery
Pages: 5580-5591
Number of pages: 12
ISBN (Electronic): 9781450346559
DOIs: 10.1145/3025453.3025477
State: Published - May 2, 2017
Event: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States
Duration: May 6, 2017 – May 11, 2017

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2017-May

Other

Other: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
Country: United States
City: Denver
Period: 5/6/17 – 5/11/17

Keywords

  • Crowdsourcing
  • Design methods
  • Feedback
  • Learning

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design

Cite this

Wauck, H., Yen, Y. C., Fu, W-T., Gerber, E., Dow, S. P., & Bailey, B. P. (2017). From in the class or in the wild? Peers provide better design feedback than external crowds. In CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems: Explore, Innovate, Inspire (pp. 5580-5591). (Conference on Human Factors in Computing Systems - Proceedings; Vol. 2017-May). Association for Computing Machinery. https://doi.org/10.1145/3025453.3025477

From in the class or in the wild? Peers provide better design feedback than external crowds. / Wauck, Helen; Yen, Yu Chun; Fu, Wai-Tat; Gerber, Elizabeth; Dow, Steven P.; Bailey, Brian P.

CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems: Explore, Innovate, Inspire. Association for Computing Machinery, 2017. p. 5580-5591 (Conference on Human Factors in Computing Systems - Proceedings; Vol. 2017-May).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Wauck, H, Yen, YC, Fu, W-T, Gerber, E, Dow, SP & Bailey, BP 2017, From in the class or in the wild? Peers provide better design feedback than external crowds. in CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems: Explore, Innovate, Inspire. Conference on Human Factors in Computing Systems - Proceedings, vol. 2017-May, Association for Computing Machinery, pp. 5580-5591, 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017, Denver, United States, 5/6/17. https://doi.org/10.1145/3025453.3025477
Wauck H, Yen YC, Fu W-T, Gerber E, Dow SP, Bailey BP. From in the class or in the wild? Peers provide better design feedback than external crowds. In CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems: Explore, Innovate, Inspire. Association for Computing Machinery. 2017. p. 5580-5591. (Conference on Human Factors in Computing Systems - Proceedings). https://doi.org/10.1145/3025453.3025477
Wauck, Helen ; Yen, Yu Chun ; Fu, Wai-Tat ; Gerber, Elizabeth ; Dow, Steven P. ; Bailey, Brian P. / From in the class or in the wild? Peers provide better design feedback than external crowds. CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems: Explore, Innovate, Inspire. Association for Computing Machinery, 2017. pp. 5580-5591 (Conference on Human Factors in Computing Systems - Proceedings).
@inproceedings{83461bdb203b4643be114f47a7c02135,
title = "From in the class or in the wild? Peers provide better design feedback than external crowds",
abstract = "As demand for design education increases, instructors are struggling to provide timely, personalized feedback on student projects. Gathering feedback from classroom peers and external crowds offers scalable approaches, but there is little evidence of how they compare. We report on a study in which students (n=127) created early- and late-stage prototypes as part of nine-week projects. At each stage, students received feedback from peers and from external crowds: their own social networks, online communities, and a task market. We measured the quality, quantity, and valence of the feedback and the actions taken on it, and categorized its content using a taxonomy of critique discourse. The study found that peer feedback was perceived as higher quality, acted upon more often, and longer than crowd feedback. However, crowd feedback was a viable supplement to peer feedback, and students preferred it for projects targeting specialized audiences. Feedback from all sources spanned only a subset of the critique categories. Instructors may fill this gap by further scaffolding feedback generation. The study contributes insights into how to best utilize different feedback sources in project-based courses.",
keywords = "Crowdsourcing, Design methods, Feedback, Learning",
author = "Helen Wauck and Yen, {Yu Chun} and Wai-Tat Fu and Elizabeth Gerber and Dow, {Steven P.} and Bailey, {Brian P}",
year = "2017",
month = "5",
day = "2",
doi = "10.1145/3025453.3025477",
language = "English (US)",
series = "Conference on Human Factors in Computing Systems - Proceedings",
publisher = "Association for Computing Machinery",
pages = "5580--5591",
booktitle = "CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems",

}

TY - GEN

T1 - From in the class or in the wild? Peers provide better design feedback than external crowds

AU - Wauck, Helen

AU - Yen, Yu Chun

AU - Fu, Wai-Tat

AU - Gerber, Elizabeth

AU - Dow, Steven P.

AU - Bailey, Brian P

PY - 2017/5/2

Y1 - 2017/5/2

N2 - As demand for design education increases, instructors are struggling to provide timely, personalized feedback on student projects. Gathering feedback from classroom peers and external crowds offers scalable approaches, but there is little evidence of how they compare. We report on a study in which students (n=127) created early- and late-stage prototypes as part of nine-week projects. At each stage, students received feedback from peers and from external crowds: their own social networks, online communities, and a task market. We measured the quality, quantity, and valence of the feedback and the actions taken on it, and categorized its content using a taxonomy of critique discourse. The study found that peer feedback was perceived as higher quality, acted upon more often, and longer than crowd feedback. However, crowd feedback was a viable supplement to peer feedback, and students preferred it for projects targeting specialized audiences. Feedback from all sources spanned only a subset of the critique categories. Instructors may fill this gap by further scaffolding feedback generation. The study contributes insights into how to best utilize different feedback sources in project-based courses.

AB - As demand for design education increases, instructors are struggling to provide timely, personalized feedback on student projects. Gathering feedback from classroom peers and external crowds offers scalable approaches, but there is little evidence of how they compare. We report on a study in which students (n=127) created early- and late-stage prototypes as part of nine-week projects. At each stage, students received feedback from peers and from external crowds: their own social networks, online communities, and a task market. We measured the quality, quantity, and valence of the feedback and the actions taken on it, and categorized its content using a taxonomy of critique discourse. The study found that peer feedback was perceived as higher quality, acted upon more often, and longer than crowd feedback. However, crowd feedback was a viable supplement to peer feedback, and students preferred it for projects targeting specialized audiences. Feedback from all sources spanned only a subset of the critique categories. Instructors may fill this gap by further scaffolding feedback generation. The study contributes insights into how to best utilize different feedback sources in project-based courses.

KW - Crowdsourcing

KW - Design methods

KW - Feedback

KW - Learning

UR - http://www.scopus.com/inward/record.url?scp=85044845401&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85044845401&partnerID=8YFLogxK

U2 - 10.1145/3025453.3025477

DO - 10.1145/3025453.3025477

M3 - Conference contribution

AN - SCOPUS:85044845401

T3 - Conference on Human Factors in Computing Systems - Proceedings

SP - 5580

EP - 5591

BT - CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems

PB - Association for Computing Machinery

ER -