An empirical evaluation and comparison of manual and automated test selection

Milos Gligoric, Stas Negara, Owolabi Legunsen, Darko Marinov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Regression test selection speeds up regression testing by rerunning only the tests that can be affected by the most recent code changes. Much progress has been made in research on automated test selection over the last three decades, but it has not translated into practical tools that are widely adopted. Therefore, developers either re-run all tests after each change or perform manual test selection. Re-running all tests is expensive, while manual test selection is tedious and error-prone. Despite this significant trade-off, no prior study has assessed how developers perform manual test selection or compared it to automated test selection. This paper reports on our study of manual test selection in practice and our comparison of manual and automated test selection. We are the first to conduct a study that (1) analyzes data from manual test selection, collected in real time from 14 developers during a three-month study, and (2) compares manual test selection with an automated state-of-the-research test-selection tool for 450 test sessions. Almost all developers in our study performed manual test selection, and they did so in mostly ad-hoc ways. Comparing manual and automated test selection, we found that the two approaches selected different tests in every one of the 450 test sessions investigated. Manual selection chose more tests than automated selection 73% of the time (potentially wasting time) and chose fewer tests 27% of the time (potentially missing bugs). These results show the need for better automated test-selection techniques that integrate well with developers' programming environments.
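To make the core idea concrete, the sketch below shows one common flavor of automated test selection: dependency-based selection, where each test is mapped to the code units it depends on (recorded during a prior run) and only tests whose dependencies overlap the changed units are rerun. This is a minimal, hypothetical illustration in Java; the class, test, and method names are invented for this example and it is not the specific tool evaluated in the paper.

```java
import java.util.*;

// Minimal sketch of dependency-based regression test selection
// (hypothetical names; not the tool evaluated in the paper).
public class TestSelector {
    // Map each test to the set of code units (e.g., classes) it depends on,
    // typically recorded while running the full test suite once.
    private final Map<String, Set<String>> testDeps;

    public TestSelector(Map<String, Set<String>> testDeps) {
        this.testDeps = testDeps;
    }

    // Select the tests whose recorded dependencies overlap the changed units;
    // any other test cannot be affected by the change and is safely skipped.
    public Set<String> select(Set<String> changedUnits) {
        Set<String> selected = new TreeSet<>();
        for (Map.Entry<String, Set<String>> entry : testDeps.entrySet()) {
            if (!Collections.disjoint(entry.getValue(), changedUnits)) {
                selected.add(entry.getKey());
            }
        }
        return selected;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> deps = new HashMap<>();
        deps.put("CartTest", Set.of("Cart", "Item"));
        deps.put("AuthTest", Set.of("Auth", "User"));
        // Only CartTest depends on the changed class, so only it is rerun.
        System.out.println(new TestSelector(deps).select(Set.of("Item")));
        // prints: [CartTest]
    }
}
```

Under this scheme the tool never skips a test that could be affected by the change, whereas the abstract's findings suggest manual selection offers no such guarantee: developers sometimes rerun more tests than necessary and sometimes fewer.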

Original language: English (US)
Title of host publication: ASE 2014 - Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering
Publisher: Association for Computing Machinery, Inc
Pages: 361-371
Number of pages: 11
ISBN (Electronic): 9781450330138
DOIs
State: Published - 2014
Event: 29th ACM/IEEE International Conference on Automated Software Engineering, ASE 2014 - Vasteras, Sweden
Duration: Sep 15 2014 - Sep 19 2014

Publication series

Name: ASE 2014 - Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering

Other

Other: 29th ACM/IEEE International Conference on Automated Software Engineering, ASE 2014
Country/Territory: Sweden
City: Vasteras
Period: 9/15/14 - 9/19/14

ASJC Scopus subject areas

  • Software
