Selected Presentations

M. Croci, Mixed-precision explicit Runge-Kutta methods, IMA Leslie Fox Prize Event, Glasgow (2023). Download.

Motivated by the advent of machine learning, the last few years have seen the return of hardware-supported reduced-precision computing. Computations with fewer digits are faster and more memory- and energy-efficient, but a careful implementation and rounding error analysis are required to ensure that sensible results can still be obtained. In this talk we focus on mixed-precision explicit Runge-Kutta (RK) methods. Mixed-precision algorithms combine low- and high-precision computations in order to benefit from the performance gains of reduced precision while retaining good accuracy. We show that a naive mixed-precision implementation of RK schemes harms convergence and leads to error stagnation, and we present a more accurate alternative. We introduce new RK-Chebyshev schemes that only use $q\in\{1,2\}$ high-precision function evaluations to achieve a limiting convergence order of $O(\Delta t^{q})$, leaving the remaining evaluations in low precision. These methods are essentially as cheap as their fully low-precision equivalents, and they are as accurate and (almost) as stable as their high-precision counterparts.
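
The idea of reserving only q function evaluations for high precision can be illustrated with a small sketch. The Python/NumPy snippet below uses float32 as a stand-in for the low-precision format and a simple two-stage explicit RK (Heun) step with a single high-precision evaluation (q = 1); the function names and the test problem are placeholders for illustration only, and this is not the RK-Chebyshev schemes presented in the talk.

    import numpy as np

    def f(t, y):
        # Example right-hand side (placeholder): linear decay, y' = -y
        return -y

    def mixed_precision_heun_step(f, t, y, dt, low=np.float32):
        # Stage 1: function evaluation in (emulated) low precision
        k1 = np.float64(f(low(t), low(y)))
        # Stage 2: the single high-precision function evaluation (q = 1)
        k2 = f(t + dt, y + dt * k1)
        # Accumulate the update in high precision
        return y + 0.5 * dt * (k1 + k2)

    # Usage: integrate y' = -y on [0, 1] and compare with the exact value exp(-1)
    n, dt = 100, 0.01
    y = np.float64(1.0)
    for i in range(n):
        y = mixed_precision_heun_step(f, i * dt, y, dt)
    print(f"numerical: {y:.8f}   exact: {np.exp(-1.0):.8f}")

Rounding the inputs of the cheap stage to low precision mimics evaluating most stages in reduced-precision arithmetic, while the final evaluation and the accumulation of the update stay in double precision, which is what limits the convergence order rather than causing outright error stagnation.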