

Using the Raspberry Pi and Docker for Replicable Performance Experiments: Experience Paper (Conference Paper) - Single View



Basic Data

Title: Using the Raspberry Pi and Docker for Replicable Performance Experiments: Experience Paper
Year of publication: 2018
Publisher: ACM
Book title: Proceedings of the 2018 ACM/SPEC International Conference on Performance Engineering
Pages: 305-316
Publication type: Conference paper
Content
Abstract

Replicating software performance experiments is difficult. A common obstacle to replication is that recreating the hardware and software environments is often impractical. As researchers usually run their experiments on the hardware and software that happens to be available to them, recreating the experiments would require obtaining identical hardware, which can lead to high costs. Recreating the software environment is also difficult, as software components such as particular library versions might no longer be available. Cheap, standardized hardware components like the Raspberry Pi and portable software containers like the ones provided by Docker are a potential solution to meet the challenge of replicability. In this paper, we report on experiences from replicating performance experiments on Raspberry Pi devices with and without Docker and show that good replication results can be achieved for microbenchmarks such as JMH. Replication of macrobenchmarks like SPECjEnterprise 2010 proves to be much more difficult, as they are strongly affected by (non-standardized) peripherals. Inspired by previous microbenchmarking experiments on the Pi platform, we furthermore report on a systematic analysis of response time fluctuations, and present lessons learned on dos and don'ts for replicable performance experiments.
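The abstract's point about Docker as a remedy for vanishing library versions amounts to pinning every layer of the software environment in the container image. A minimal, purely illustrative Dockerfile sketch (image name, versions, and file names below are hypothetical, not taken from the paper) might look like:

```dockerfile
# Hypothetical sketch: pin the base image and every dependency to an exact
# version so the experiment container can be rebuilt identically later,
# even if "latest" versions of these components change or disappear.
FROM openjdk:8u181-jdk

# Copy a pre-built, versioned benchmark artifact into the image rather than
# fetching it at build time from a mutable source.
COPY benchmarks-1.0.jar /app/benchmarks.jar

# Fix the JVM invocation so every run uses the same startup configuration.
ENTRYPOINT ["java", "-jar", "/app/benchmarks.jar"]
```

Publishing such an image (or its Dockerfile plus the pinned artifacts) alongside the paper lets others re-run the experiment on their own Raspberry Pi with the same software stack.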


Persons Involved

Knoche, Holger
Eichelberger, Holger, Dr.

Institutions

Software Systems Engineering Department
Institute of Computer Science