Marcus Ritter, Alexandru Calotoiu, Thorsten Reimann, Torsten Hoefler, Felix Wolf:

Learning Cost-Effective Sampling Strategies for Empirical Performance Modeling

(presented at the 34th IEEE International Parallel & Distributed Processing Symposium (IPDPS'20), New Orleans, LA, USA, IEEE, May 2020)

Abstract

Performance models are vital to the identification of scalability bottlenecks in parallel applications. They describe key performance metrics, such as execution time, as a function of one or more parameters, such as the number of processes or the input size. Whereas analytical models must be laboriously derived from the source code by reasoning, their empirical siblings can be quickly learned from a set of performance experiments. Obviously, both the quality and the cost of empirical models depend on the design of the underlying experiments. Extra-P, a state-of-the-art modeling tool, requires experiments that cover all combinations of all parameter values. Hence, the number of samples it needs grows exponentially with the number of model parameters. In some situations, this makes empirical performance models impractical to create. In this paper, we propose a novel parameter-value selection heuristic, implemented as a reinforcement learning agent, that needs only a polynomial number of samples and allows a more flexible experiment design. Using synthetic analysis and data from three different case studies, we show that our solution reduces the average modeling cost by up to 98% while retaining 99% of the model accuracy.
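The cost argument in the abstract can be made concrete with a toy calculation. The sketch below (not the paper's algorithm; the sparse design shown is a hypothetical one-parameter-at-a-time scheme chosen only for illustration) contrasts the exponential sample count of a full-factorial experiment design with a polynomial-size alternative:

```python
def full_factorial(n_values: int, n_params: int) -> int:
    """Samples needed when every combination of every parameter
    value must be measured: n_values ** n_params."""
    return n_values ** n_params


def one_at_a_time(n_values: int, n_params: int) -> int:
    """A hypothetical polynomial-size design: measure one base
    configuration, then vary each parameter individually across
    its remaining values. Size: 1 + n_params * (n_values - 1)."""
    return 1 + n_params * (n_values - 1)


# With 5 candidate values per parameter, the gap widens quickly:
for m in range(1, 5):
    print(f"{m} parameter(s): full factorial = {full_factorial(5, m):>4}, "
          f"sparse design = {one_at_a_time(5, m)}")
```

For four parameters this is 625 measurements versus 17, which is the kind of gap that motivates a smarter, learned selection of sampling points.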

Documents

Recorded talk (best effort)

BibTeX

@inproceedings{rl_model,
  author    = {Marcus Ritter and Alexandru Calotoiu and Thorsten Reimann and Torsten Hoefler and Felix Wolf},
  title     = {{Learning Cost-Effective Sampling Strategies for Empirical Performance Modeling}},
  year      = {2020},
  month     = {May},
  location  = {New Orleans, LA, USA},
  publisher = {IEEE},
  note      = {The 34th IEEE International Parallel \& Distributed Processing Symposium (IPDPS'20)},
  source    = {http://www.unixer.de/~htor/publications/},
}