Abstract
Peer evaluations are critical for assessing teams, but they are susceptible to bias and other factors that undermine their reliability. At the same time, the collaborative tools that teams commonly use to perform their work are increasingly capable of logging activity that can signal useful information about individual contributions and teamwork. To investigate current and potential uses for activity traces in peer evaluation tools, we interviewed (N=11) and surveyed (N=242) students and interviewed (N=10) instructors at a single university. We found that nearly all of the students surveyed considered specific contributions to team outcomes when evaluating their teammates, but they also reported relying on memory and subjective experiences to make these assessments. Instructors desired objective sources of data to address challenges with administering and interpreting peer evaluations, and had already begun incorporating activity traces from collaborative tools into their evaluations of teams. However, both students and instructors expressed concern about using activity traces, due to the diverse ecosystem of tools and platforms used by teams and the limited view into the context of the contributions. Based on our findings, we contribute recommendations and a speculative design for a data-centric peer evaluation tool.
| Original language | English (US) |
| --- | --- |
| Article number | 432 |
| Journal | Proceedings of the ACM on Human-Computer Interaction |
| Volume | 5 |
| Issue number | CSCW2 |
| Early online date | Oct 18 2021 |
| DOIs | |
| State | Published - Oct 18 2021 |
Keywords
- activity traces
- education
- peer evaluation
- team assessment
- teams
- teamwork
ASJC Scopus subject areas
- Social Sciences (miscellaneous)
- Human-Computer Interaction
- Computer Networks and Communications