  * Optional:
    * Without -xHost
    * Without -fp-model clause
    * Try -fp-model source
    * Explore more processor combinations

  * 1 month, writing every day
  * Use optimization to avoid mpi_allgather use at the northfold

===== 4 February 2016 =====
Javier García-Serrano and Mario Acosta showed some reproducibility results at the 2016 EC-Earth meeting. The community recommended that we finish the reproducibility experiments and publish the results. Some issues should be addressed first.

Different combinations of optimization and floating-point flags have been tested on MareNostrum 3, but bit-for-bit reproducibility has not been achieved for EC-Earth 3.2beta. Moreover, enforcing bit-for-bit reproducibility can degrade performance, so a combination of flags should be found that balances reproducibility, accuracy and performance. The following tasks should be discussed to achieve this:
  * Determine the best method to quantify differences between runs.
    * Propose a reference against which the other experiments can be compared. This reference could also be used in the future to check runs on new platforms, the inclusion of new modules, etc.
    * Use a statistical method to quantify the differences between runs and propose a minimum threshold to meet, instead of bitwise precision, in order to avoid critical restrictions on performance (see the sketch after this list).
    * Propose a method to decide which of two simulations with valid results is the better one. Experiments using different compiler flags will produce similar valid results (perhaps differing by only 1%), and it would be useful to know which of them gives the better simulation quality.
  * Determine a combination of flags (floating-point control and optimization) and additional optimization methods that balances performance against accuracy and reproducibility.
    * Suggest a combination of flags and/or implement specific optimizations that achieve the best possible performance while keeping the differences below X% on a single platform and below Y% across two platforms with a similar architecture (with Y > X).
  * If bit-for-bit reproducibility was achieved with EC-Earth 3.1, study how to obtain it with EC-Earth 3.2beta, at least in a debug mode.
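
To support the first task, here is a minimal sketch of one possible way to quantify the difference between two runs and compare it against a tolerance. The metric (a relative RMSE, optionally area-weighted), the synthetic fields and the placeholder thresholds X and Y are illustrative assumptions, not something agreed at the meeting.

<code python>
# Minimal sketch: quantify the difference between two runs of the same
# experiment and check it against a tolerance.  The metric, the synthetic
# test data and the thresholds X and Y are placeholders.
import numpy as np

def relative_rmse(ref, test, weights=None):
    """Weighted RMSE between two fields, normalised by the weighted
    standard deviation of the reference field."""
    ref = np.asarray(ref, dtype=float)
    test = np.asarray(test, dtype=float)
    if weights is None:
        weights = np.ones_like(ref)
    w = weights / weights.sum()
    rmse = np.sqrt(np.sum(w * (test - ref) ** 2))
    spread = np.sqrt(np.sum(w * (ref - np.sum(w * ref)) ** 2))
    return rmse / spread

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical monthly-mean field from a reference run ...
    ref = rng.normal(15.0, 5.0, size=(180, 360))
    # ... and the same field from a run compiled with different flags.
    test = ref + rng.normal(0.0, 0.01, size=ref.shape)

    # Placeholder tolerances: X for runs on one platform, Y (> X) across platforms.
    X, Y = 0.01, 0.05
    score = relative_rmse(ref, test)
    print(f"relative RMSE = {score:.4f}")
    print("same-platform check :", "PASS" if score < X else "FAIL")
    print("cross-platform check:", "PASS" if score < Y else "FAIL")
</code>

In practice the fields would be read from the model output (e.g. NetCDF files), and the metric could be replaced by whatever statistical test the group agrees on; the structure of the check would stay the same.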

===== 27th of May 2016 =====
See the summary presentations by {{20160526_groupmeeting.pdf | François }} and {{20160526_EC-Earth3.2_MarioAcosta.pdf | Mario }}. A more general set of slides about climate reproducibility is available {{ 20160526_EC-Earth3.1_FrancoisMassonnet.pdf | here }} and was also posted on EC-Earth development portal issue [[https://dev.ec-earth.org/issues/207|207]].

Actions:
  * Mario runs an experiment with **-fpe0** activated, on ECMWF.
  * Mario/Oriol: run tests with the libraries (NetCDF, GRIB, etc.) compiled with the same options and the same version of the code.

===== 10th of November 2017 =====
Martin and François have worked on making the reproducibility-testing scripts more universal. They can now be found in the following GitLab project:

https://earth.bsc.es/gitlab/fmassonnet/reproducibility.git

A draft of the paper has been created:

https://docs.google.com/document/d/1aMsdggygIGmbyiFmmEOEFIl6ZVe-EO7Jcd04B6ZP91A/edit