Abstract
Peer evaluations are a well-established tool for assessing individual and team performance in collaborative contexts, but they are susceptible to social and cognitive biases. Existing peer evaluation tools have also yet to take advantage of the unique opportunities that online collaborative technologies offer for mitigating these biases. In this work, we explore the potential of one such opportunity for peer evaluations: data traces automatically generated by collaborative tools, which we refer to as "activity traces". We conduct a between-subjects experiment with 101 students and MTurk workers, investigating the effects of reviewing activity traces on peer evaluations of team members in an online collaborative task. Our findings show that reviewing activity traces led participants to make more numerous and larger revisions to their evaluations than participants in a control condition. These revisions also increased the consistency of the evaluations and recipients' perceived accuracy of the evaluations they received. Our findings demonstrate the value of activity traces as an approach to more reliable and objective peer evaluations of teamwork. Based on our findings, as well as a qualitative analysis of free-form responses in our study, we identify and discuss key considerations and design recommendations for incorporating activity traces into real-world peer evaluation systems.
| Original language | English (US) |
| --- | --- |
| Article number | 151 |
| Journal | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 7 |
| Issue number | CSCW1 |
| DOIs | |
| State | Published - Apr 16 2023 |
Keywords
- activity traces
- online collaboration
- peer evaluation
- team assessment
ASJC Scopus subject areas
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications