MSR 2018
Mon 28 - Tue 29 May 2018 Gothenburg, Sweden
co-located with * ICSE 2018 *
Mon 28 May 2018 11:00 - 11:17 at E4 room - CI and Release Engineering Chair(s): Shane McIntosh

Continuous integration (CI) emphasizes quick feedback to developers. This is at odds with the current practice of performance testing, which predominantly focuses on long-running tests against entire systems in production-like environments. Alternatively, software microbenchmarking attempts to establish a performance baseline for small code fragments in a short time. This paper investigates the quality of microbenchmark suites, focusing on their suitability to deliver quick performance feedback and to be integrated into CI. We study ten open-source libraries written in Java and Go, with benchmark-suite sizes ranging from 16 to 983 tests and runtimes between 11 minutes and 8.75 hours. We show that our study subjects include benchmarks with result variability of 50% or higher, indicating that not all benchmarks are useful for the reliable discovery of slowdowns. We further artificially inject slowdowns into public API methods of the study subjects and test whether the benchmark suites are able to discover them. We introduce a performance-test quality metric called the API benchmarking score (ABS), which represents a benchmark suite's ability to find slowdowns among a set of defined core API methods. The resulting benchmarking scores (i.e., the fraction of discovered slowdowns) vary between 10% and 100% for the study subjects. This paper's methodology and results can be used to (1) assess the quality of existing microbenchmark suites, (2) select a set of tests to be run as part of CI, and (3) suggest or generate benchmarks for currently untested parts of an API.
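The abstract defines ABS as the fraction of injected slowdowns that a benchmark suite discovers among a set of defined core API methods. A minimal sketch of that calculation in Go (one of the two languages studied) might look as follows; the function name and the example method names are illustrative assumptions, not the authors' implementation:

```go
package main

import "fmt"

// apiBenchmarkingScore returns the fraction of core API methods whose
// injected slowdown was discovered by the benchmark suite. Keys are
// core API methods; values record whether the injected slowdown was
// detected. (Illustrative helper, not the paper's tooling.)
func apiBenchmarkingScore(detected map[string]bool) float64 {
	if len(detected) == 0 {
		return 0
	}
	found := 0
	for _, wasDetected := range detected {
		if wasDetected {
			found++
		}
	}
	return float64(found) / float64(len(detected))
}

func main() {
	// Hypothetical detection results for four core API methods.
	results := map[string]bool{
		"List.add":   true,
		"Map.get":    true,
		"Set.remove": false,
		"List.sort":  false,
	}
	fmt.Printf("ABS = %.0f%%\n", 100*apiBenchmarkingScore(results)) // ABS = 50%
}
```

A suite that detects every injected slowdown scores 100%; the paper reports observed scores between 10% and 100% across its study subjects.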

Mon 28 May
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

11:00 - 12:30: CI and Release Engineering (Technical Papers) at E4 room
Chair(s): Shane McIntosh (McGill University)
11:00 - 11:17
Full-paper
An Evaluation of Open-Source Software Microbenchmark Suites for Continuous Performance Assessment
Technical Papers
A: Christoph Laaber (University of Zurich), A: Philipp Leitner (Chalmers | University of Gothenburg)
DOI Pre-print Media Attached
11:17 - 11:34
Full-paper
Studying the Impact of Adopting Continuous Integration on the Delivery Time of Pull Requests
Technical Papers
A: João Helis Bernardo (Federal Institute of Education, Science and Technology of Rio Grande do Norte), A: Daniel Alencar Da Costa (Queen's University, Kingston, Ontario), A: Uirá Kulesza
Pre-print
11:34 - 11:51
Full-paper
What Did Really Change with the new Release of the App?
Technical Papers
A: Paolo Calciati (IMDEA Software Institute), A: Konstantin Kuznetsov (Saarland University, CISPA), A: Xue Bai, A: Alessandra Gorla (IMDEA Software Institute)
11:51 - 12:08
Full-paper
CLEVER: Combining Code Metrics with Clone Detection for Just-In-Time Fault Prevention and Resolution in Large Industrial Projects
Technical Papers
12:08 - 12:15
Short-paper
I'm Leaving You, Travis: A Continuous Integration Breakup Story
Technical Papers
A: David Gray Widder (Carnegie Mellon University), A: Michael Hilton (Carnegie Mellon University, USA), A: Christian Kästner (Carnegie Mellon University), A: Bogdan Vasilescu (Carnegie Mellon University)
DOI Pre-print
12:15 - 12:30
Other
Discussion phase
Technical Papers