In applied computational mathematics, numerical dispersion is a difficulty that arises in computer simulations of continua (such as fluids), wherein the simulated medium exhibits greater dispersion than the true medium. The problem is particularly conspicuous when the true system should not be dispersive at all, for example when a fluid acquires spurious dispersion in a numerical model.
In simulations, time and space are divided into discrete grids, and the continuous differential equations of motion (such as the Navier–Stokes equations) are discretized into finite-difference equations. These discrete equations are in general not identical to the original differential equations, so the simulated system behaves differently from the intended physical system. The amount and character of the difference depend on the system being simulated and the type of discretization used.
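A minimal sketch of how a discretization introduces dispersion, using the 1-D advection equation as an illustrative case (the speed and grid spacing below are hypothetical values, not from the source): the exact equation propagates every Fourier mode at the same speed, but a centered finite difference in space makes the numerical phase speed depend on wavenumber.

```python
import numpy as np

# The 1-D advection equation u_t + c u_x = 0 is non-dispersive:
# every Fourier mode exp(i(k x - w t)) travels at phase speed c.
# Discretizing the spatial derivative with a centered difference,
#   du_j/dt = -c * (u_{j+1} - u_{j-1}) / (2 * dx),
# and substituting the same Fourier mode gives the numerical
# dispersion relation  w(k) = c * sin(k dx) / dx,
# so the numerical phase speed  w/k = c * sin(k dx) / (k dx)
# varies with k: the discrete system is dispersive even though
# the original PDE is not.

c = 1.0    # true advection speed (hypothetical)
dx = 0.1   # grid spacing (hypothetical)

def numerical_phase_speed(k):
    """Phase speed of wavenumber k under the centered-difference scheme."""
    return c * np.sin(k * dx) / (k * dx)

# Well-resolved modes (k*dx << 1) travel at nearly the true speed c;
# poorly resolved modes lag behind, so an initially compact wave
# packet spreads out as it propagates.
for k in [0.01, 1.0, 10.0]:
    print(f"k = {k:6.2f}  phase speed = {numerical_phase_speed(k):.4f}")
```

Because the lagging modes are exactly the marginally resolved ones, refining the grid (reducing dx) pushes the dispersive error to ever-higher wavenumbers, which is one reason grid resolution controls the severity of numerical dispersion.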