A case study on the stability of performance tests for serverless applications

Journal of Systems and Software (JSS), 2022

Abstract

Context: In serverless computing, resource management and operational concerns are largely delegated to the cloud provider, yet ensuring that serverless applications meet their performance requirements remains the developers' responsibility. Performance testing is a commonly used performance assessment practice; however, it traditionally requires visibility of the resource environment.

Objective: In this study, we investigate whether performance tests of serverless applications are stable, that is, whether their results are reproducible, and what implications the serverless paradigm has for performance tests.

Method: We conduct a case study in which we collect two datasets of performance test results: (a) repetitions of performance tests for varying memory sizes and load intensities and (b) three repetitions of the same performance test every day for ten months.

Results: We find that performance tests of serverless applications are comparatively stable if conducted on the same day. However, we also observe short-term performance variations and frequent long-term performance changes.

Conclusion: Performance tests for serverless applications can be stable; however, the serverless model impacts the planning, execution, and analysis of performance tests.
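
To make the notion of test stability concrete, the following is a minimal sketch of how repeated performance test runs could be compared, for example within a day versus across days. It is not the paper's actual analysis pipeline; the data structure, function names, and sample values are illustrative assumptions.

```python
# Illustrative sketch (not the study's actual pipeline): quantify how stable
# repeated performance test runs are, within one day and across days.
from statistics import mean, stdev


def coefficient_of_variation(samples):
    """Relative dispersion of one test run's response-time samples."""
    return stdev(samples) / mean(samples)


def within_day_spread(repetitions):
    """Relative spread of mean response times across repetitions on one day."""
    means = [mean(samples) for samples in repetitions]
    return (max(means) - min(means)) / mean(means)


# Hypothetical data: three test repetitions per day, response times in ms.
# The shift between day-1 and day-2 mimics a long-term performance change.
runs = {
    "day-1": [[105, 98, 110, 102], [101, 99, 107, 104], [103, 100, 108, 101]],
    "day-2": [[131, 125, 140, 128], [129, 127, 135, 133], [130, 126, 138, 129]],
}

for day, repetitions in runs.items():
    cvs = [round(coefficient_of_variation(r), 3) for r in repetitions]
    print(f"{day}: per-run CV = {cvs}, "
          f"within-day spread of means = {within_day_spread(repetitions):.3f}")
```

In this toy example, the within-day spread stays small while the day-to-day means differ noticeably, which mirrors the kind of pattern the abstract describes: stable results when tests are repeated on the same day, but frequent long-term performance changes across days.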
