This project investigates which attributes of a computer system can cause numerical reproducibility to fail and how the resulting uncertainty can be quantified.
The Information Systems group in ITL is developing a program in numerical reproducibility, which has been funded through several projects. Issues with numerical reproducibility are becoming increasingly significant as scientific computations are run across an ever-wider range of libraries, compilers, floating-point precisions, and CPU and GPU architectures.
Initial work was done through a project on Terascale Imaging, which examined the numerical reproducibility of several primitive image-processing algorithms, such as Fourier transforms and linear algebra operations. These algorithms were executed on images using a variety of library versions, floating-point precisions, compiler options, and CPU and GPU architectures. The project cataloged which combinations of these attributes caused breaks in bit-wise numerical reproducibility of the results.
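The effect can be illustrated with a small, hypothetical sketch (not drawn from the project's code): floating-point addition is not associative, so merely re-ordering a summation, which is effectively what different compiler optimizations, libraries, and parallel architectures do, can change the result at the bit level.

import random

# Hypothetical illustration: the same mathematical sum, evaluated in two
# different orders, need not be bit-wise identical because floating-point
# addition is not associative.
random.seed(0)
values = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

sequential = sum(values)          # left-to-right accumulation
reordered = sum(sorted(values))   # same values, different order

print(repr(sequential))
print(repr(reordered))
print("bit-wise identical?", sequential == reordered)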
Recent projects have been funded through internal ITL Building the Future grants and have focused on developing practical methods to quantify the uncertainty associated with floating-point operations in scientific computations. Several methods for quantifying this uncertainty already exist [1, 2, 3], but they are not generally used in practice, mostly because of the performance penalties they impose. These projects aim to lay the groundwork for accelerated implementations, for example through hardware acceleration, by demonstrating the benefits of this kind of uncertainty quantification for reproducibility and by providing examples of how it could be used in practice.
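The general idea behind such methods, for example the stochastic (Monte Carlo) arithmetic used by tools such as CADNA [1], Verrou [2], and Verificarlo [3], can be sketched as follows. This is a simplified illustration under the assumption that perturbing each intermediate result by a random amount on the order of machine epsilon, and observing the spread over repeated runs, approximates the number of decimal digits that remain significant; the actual tools instrument the rounding of individual floating-point operations rather than perturbing values by hand, and the names used here (perturb, cancellation_example) are purely illustrative.

import math
import random

EPS = 2.0 ** -52  # unit roundoff for IEEE binary64 (assumed precision)

def perturb(x):
    # Apply a random relative perturbation of order machine epsilon,
    # mimicking a randomly rounded floating-point result.
    return x * (1.0 + random.uniform(-EPS, EPS))

def cancellation_example():
    # A deliberately ill-conditioned computation with large cancellation.
    big = 1.0e16
    total = perturb(perturb(big) + perturb(1.0))
    return perturb(total - perturb(big))

# Repeat the computation with independent perturbations and use the spread
# of the results to estimate how many decimal digits can be trusted.
random.seed(1)
samples = [cancellation_example() for _ in range(100)]
mean = sum(samples) / len(samples)
std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))

if std == 0.0:
    digits = 15.9  # all runs identical: essentially full double precision
elif mean == 0.0:
    digits = 0.0
else:
    digits = max(0.0, math.log10(abs(mean) / std))

print("mean result:          ", mean)
print("spread (std dev):     ", std)
print("estimated sig. digits:", round(digits, 1))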
[1] CADNA, http://cadna.lip6.fr/index.php
[2] Verrou, https://edf-hpc.github.io/verrou/index.html
[3] Verificarlo, https://github.com/verificarlo/verificarlo