There is such a thing as not enough chaos.
In a new study, scientists have discovered that complex calculations performed by computers can be off by as much as 15 percent, due to a "pathological" inability to grasp the true mathematical complexity of chaotic dynamical systems.
"Our work shows that the behaviour of the chaotic dynamical systems is richer than any digital computer can capture," says computational scientist Peter Coveney from UCL in the UK.
"Chaos is more commonplace than many people may realise and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact."
For centuries, theorists have contemplated how very small effects can snowball into very large ones downstream.
In chaos theory, the phenomenon is famously referred to as the 'butterfly effect': metaphorically, the hypothetical notion that the infinitesimal flap of a butterfly's wings in one place could help generate a subsequent tornado in another.
It's a poetic concept, and however whimsical it may seem, mathematical modelling suggests the underlying sensitivity is real and measurable.
The butterfly effect is primarily attributed to American mathematician and meteorologist Edward Norton Lorenz, who in the 1960s, while repeating a weather simulation, took a history-making shortcut: he used slightly simplified numbers for the second experiment (inputting 0.506 instead of 0.506127).
"I went down the hall for a cup of coffee and returned after about an hour, during which time the computer had simulated about two months of weather," Lorenz later recalled.
"The numbers being printed were nothing like the old ones."
The results of Lorenz's fateful rounding off showed how minute changes in initial conditions can produce big changes over time in complex, chaotic systems where lots of variables affect and influence one another.
Weather prediction is one example, but the same snowballing error phenomenon has since been demonstrated in everything from modelling orbital trajectories to turbulence and molecular dynamics.
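You don't need Lorenz's weather model to see this in action. As a rough sketch (using the textbook logistic map rather than his weather equations, and borrowing his two starting numbers purely for illustration), a few lines of Python show how quickly a truncated input parts company with the original:

```python
# Sensitivity to initial conditions, illustrated with the logistic map
# x -> 4x(1 - x), a textbook chaotic system (not Lorenz's weather model).
def logistic(x):
    return 4.0 * x * (1.0 - x)

x_full = 0.506127   # the "full" starting value from Lorenz's story
x_short = 0.506     # the rounded value he typed in for the rerun

for step in range(1, 51):
    x_full = logistic(x_full)
    x_short = logistic(x_short)
    if step % 10 == 0:
        print(f"step {step:2d}: full={x_full:.6f}  rounded={x_short:.6f}  "
              f"gap={abs(x_full - x_short):.6f}")

# After a few dozen steps the two runs bear no resemblance to one another,
# even though the starting values differed by about one part in ten thousand.
```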
The thing is, even though the butterfly effect has been understood for decades, it remains a fundamental problem for the way computers run calculations.
"Extreme sensitivity to initial conditions is a defining feature of chaotic dynamical systems," Coveney and his team explain in their new paper.
"Since the first usage of digital computers for computational science, it has been known that loss of precision due to the discrete approximation of real numbers can dramatically alter the dynamics of chaotic systems after a short amount of simulation time."
This loss of precision isn't something that becomes apparent in simple calculations. The calculator app on your smartphone is likely perfectly sufficient for everything you need it to do in daily life.
But in big calculations with lots of variables and starting conditions, tiny rounding errors at the outset can lead to huge calculation errors by the end of a given simulation.
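Where do those tiny errors come from in the first place? A quick illustration (not code from the study): even a number as simple as 0.1 has no exact binary representation, so the error is baked in before a simulation takes its first step.

```python
# Simple decimal fractions generally have no exact binary representation,
# so the machine silently stores the nearest double-precision value instead.
from decimal import Decimal

total = 0.1 + 0.1 + 0.1
print(total == 0.3)       # False: both sides are approximations of 3/10
print(f"{total:.20f}")    # 0.30000000000000004441
print(Decimal(0.1))       # the exact value actually stored for the literal 0.1

# The discrepancy here is only about 4e-17, harmless for a calculator app,
# but a chaotic simulation can amplify it without limit.
```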
At the heart of the problem, the researchers say, is what's called floating-point arithmetic: the standardised way computers represent real numbers in binary, substituting finite approximations for the true values.
In large and complex systems, those approximations can introduce significant errors, a problem compounded by the uneven way floating-point numbers are spread along the real number line, even in the wider 64-bit format known as double-precision floating-point.
"It has long been believed that the rounding errors are not problematic, especially if we use double-precision floating-point numbers – binary numbers using 64 bits, instead of 32," says mathematician Bruce Boghosian from Tufts University.
"But in our study, we have demonstrated a problem that is due to the uneven distribution of the fractions represented by the floating-point numbers, and that is not likely to disappear merely by increasing the number of bits."
In the research, the team compared the exact, well-understood behaviour of a simple chaotic system called the Bernoulli map with a floating-point simulation of the same system, and uncovered what they say are "systematic distortions" and a "newfound pathology" in the simulation of chaotic dynamical systems.
Indeed, while Lorenz discovered his butterfly effect by dropping a few digits from a calculation, the researchers found their own much subtler equivalent simply by asking a computer to perform the calculation at all.
"For Lorenz, it was a very small change in the last few decimal places in the numbers used to start a simulation that caused his diverging results," Coveney told the Science Museum blog.
"What neither he nor others realised, and is highlighted in our new work, is that any such finite (rational) initial condition describes a behaviour which may be statistically highly unrepresentative."
While the researchers acknowledge that the Bernoulli Map is a simple chaotic system that isn't necessarily representative of more complex dynamic models, they warn that the insidious nature of their floating-point butterfly means no scientist should let their guard down around computers.
"We do not believe that practitioners should draw any comfort from the fact that their models are more complex than this one," the authors write.
"We would suggest that if so simple a system exhibits such egregious pathologies, a more complex system will probably exhibit even more devilish ones."
It's not every day you find out computer modelling may be fundamentally flawed. Until someone figures out a way to fix this, the team says researchers everywhere need to pay close attention to the numbers their computers are spitting out.
The findings are reported in Advanced Theory and Simulations.