Friday, July 13, 2018

Science Series Part I: Synchronisation of Chaotic Networks

I am going to run a three-part series about the science and mathematics underlying my summer research at the University of Maryland TREND program (Training and Research Experience in Nonlinear Dynamics). I am currently working on the experimental side of the field of nonlinear dynamics and chaos. I will explain this as the series progresses. The posts will build on each other, so that when I talk about my work in the third post, I can refer back to the earlier ones. I will talk about the following subjects:

Part 1: Nonlinear and chaotic systems
Part 3: Nonlinear networks at my lab

What are Models?
Physics is somewhat different from other sciences: instead of seeking to categorise and catalogue the natural world, it seeks to reduce natural phenomena to their essence and describe the behaviour of these processes. Physics mainly proceeds with models. These are, in the most general sense, sets of (quantitative) assumptions which, combined with mathematics, can be used to describe these behaviours and make predictions. Often the hardest part is finding the right assumptions for one's model, and then the mathematics falls into place. Other problems have simple assumptions, but are harder to solve on the mathematical side. The hardest problems, in general, are those with non-obvious, complicated effects combined with tricky mathematics.

Differential Equations
Arguably the most important tool in physics is the differential equation: an equation that describes how, based on the assumptions of a model, the quantities involved influence and change one another. A simple example is the following: the rate of cooling of a hot object is proportional to the difference between its own temperature and that of its surroundings. That is to say, when a stone is 200 degrees hotter than its surroundings, it will cool twice as fast as if it were only 100 degrees hotter. This is perhaps obvious. What is not so obvious is that the time for the stone to cool the first 100 of those degrees is much shorter than the time it takes to cool the last 100 degrees: while it is hot, it loses heat faster, so a stone starting 200 degrees above its surroundings needs less than twice the time of one starting 100 degrees above to cool down fully. Differential equations are needed to make this reasoning precise. They can describe almost anything: from falling objects, to the pressure in the atmosphere, to the neurons driving your heart.

An important part of differential equations are the so-called initial conditions. These are the values with which the system starts off: for example, the location and velocity of a skydiver, or the initial temperature of our stone. Setting the initial conditions and evolving the system lets us make predictions about what happens later.
dT/dt = -k (T - T_surroundings): the differential equation describing the cooling down of a hot stone.
On the left is the rate of change of the temperature; on the right, minus the temperature difference with the surroundings.
So a hotter object loses heat more quickly!
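To make this concrete, here is a minimal sketch (my own illustration, not part of the original post) that integrates this cooling equation with a simple Euler step. The rate constant k and the temperatures are illustrative assumptions:

def time_to_cool(T_start, T_stop, T_env=20.0, k=0.05, dt=0.1):
    """Time (in seconds) for the temperature to fall from T_start to T_stop.
    T_stop must lie above the surrounding temperature T_env."""
    T, t = T_start, 0.0
    while T > T_stop:
        T += -k * (T - T_env) * dt   # Euler step of dT/dt = -k (T - T_env)
        t += dt
    return t

# A stone starting 200 degrees above a 20 C environment:
print(time_to_cool(220.0, 120.0))   # the first 100 degrees: roughly 14 s
print(time_to_cool(120.0, 21.0))    # (almost) the last 100 degrees: roughly 90 s

The exact numbers depend on the assumed value of k, but the point carries over: the first hundred degrees go much faster than the last hundred.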


Types of Differential Equations
Differential equations pop up everywhere, and often their shape recurs. The DE (differential equation) describing the cooling down of an object, for example, is the same as the DE that describes the pressure in the atmosphere. The DE that describes falling also describes the braking of a car! These are examples of so-called linear differential equations: they have a property called linearity that makes them relatively easy to solve and handle. All equations outside this category are called nonlinear. The DE describing the firing of your neurons is nonlinear. Physicists like to approximate everything as linear for simplicity, but truth be told, most things are not linear. That is why studying nonlinear systems is so important, and the field I am doing research in does exactly that. Nonlinear systems can show all sorts of interesting behaviour that linear ones cannot: they can blow up, i.e. go to infinity in a finite amount of time; they can have several different equilibria; and they may even exhibit chaos.
The class of linear differential equations is a tiny subset of the whole.
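To illustrate the blow-up behaviour mentioned above (a toy example of my own, not from the post), compare the linear equation dx/dt = x with the nonlinear equation dx/dt = x², whose solution reaches infinity at the finite time t = 1/x0:

import math

x0 = 1.0
for t in [0.0, 0.5, 0.9, 0.99]:
    linear = x0 * math.exp(t)         # solution of the linear DE dx/dt = x
    nonlinear = x0 / (1 - x0 * t)     # solution of the nonlinear DE dx/dt = x**2, valid only for t < 1/x0
    print(f"t = {t:4.2f}: linear {linear:7.2f}, nonlinear {nonlinear:10.2f}")

The linear solution grows, but it never becomes infinite in finite time; the nonlinear one does, which is exactly the kind of behaviour linear equations cannot produce.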

Chaos
Chaos is a phenomenon that pops up in some nonlinear systems. To put it briefly, a system is chaotic when it is extremely touchy. More precisely, chaos boils down to sensitive dependence on initial conditions (Strogatz). A hot soup will take approximately the same amount of time to become lukewarm whether it starts at 90C or 91C; this system is not chaotic. Florida being 27C rather than 26.9C in April can, however, matter very much. It can totally change the weather, perhaps even be the difference between a hurricane and a fine summer's day a month down the line. This is called the butterfly effect: given enough time for the effect to propagate, a butterfly flapping its wings somewhere may induce a storm somewhere else! Weather is very sensitive to initial conditions and so is definitely an example of a chaotic system. A fascinating example of chaos is the double pendulum.
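To see sensitive dependence in action without building a double pendulum, here is a tiny sketch using the logistic map x → r·x·(1 - x) with r = 4, a standard textbook chaotic system (my own choice of example, not one from the post). Two starting values that differ by one part in a billion end up doing completely different things:

r = 4.0
x, y = 0.400000000, 0.400000001   # two initial conditions differing by only 1e-9
for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}, y = {y:.6f}, difference = {abs(x - y):.2e}")

After a few dozen iterations the difference has grown from a billionth to order one: the two nearly identical starting points now behave completely independently.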

Poincaré: Stability of the Solar System
The first time the subject of chaos and sensitive dependence on initial conditions came up was when the Swedish King Oscar II initiated a mathematics competition to prove that the solar system is stable. Poincaré's partial solution won, but on revising his work he discovered a mistake. He realised that, in the long run, tiny deviations in the orbits of the planets will cause two seemingly similar orbits to diverge completely. Since we cannot know the positions of the planets exactly, we will never be able to predict their movements into the far future. To this day, the stability of our solar system has not been mathematically proven! You can read more about this story and the work of other astronomers in "Newton's Clock: Chaos in the Solar System" by Ivars Peterson.

Three-body problem
The simplified case that Poincaré examined is called the three-body problem: here we have three bodies exerting gravity on each other and influencing each other's movements. This is a famous physics problem on which many excellent mathematicians, ranging from Newton to Lagrange, have broken their teeth. Poincaré only examined the case of the so-called restricted three-body problem: here the third mass is a 'test mass' moving in the same plane, meaning that it exerts no pull on the other two bodies and is completely at the mercy of their gravity, like a tiny satellite so to speak! As stated earlier, he found that tiny differences in the initial conditions lead to huge differences down the line.

Restricted Three-Body Problem simulation
I have made a Python program that simulates this problem and exhibits chaos. I published the code here. It gives very interesting and attractive images! In my program, I placed two test masses very near each other. As the simulation progresses, the test masses diverge and the tiny initial distance grows exponentially, as can be seen on the graph. Once the distance between them reaches the top of the graph, they have separated completely and have started to move independently. This demonstrates sensitive dependence on initial conditions.
Here the Lyapunov exponent λ is approximately 10/(25 s) = 0.4 s⁻¹, i.e. roughly 50% divergence per second!
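For readers who want to see the structure of such a simulation, here is a minimal sketch of the planar restricted three-body problem: two heavy bodies move on fixed circular orbits, and two nearly identical test masses are integrated in their combined gravitational field. This is my own illustrative sketch, not the published program; the units, masses and initial conditions are assumptions chosen only to make it run.

import numpy as np

G, m1, m2, a = 1.0, 1.0, 0.3, 1.0        # gravitational constant, the two heavy masses, their separation
M = m1 + m2
omega = np.sqrt(G * M / a**3)            # angular speed of the circular orbits of the heavy bodies

def primaries(t):
    """Positions of the two heavy bodies at time t (circular orbits about their centre of mass)."""
    c, s = np.cos(omega * t), np.sin(omega * t)
    p1 = np.array([-m2 / M * a * c, -m2 / M * a * s])
    p2 = np.array([ m1 / M * a * c,  m1 / M * a * s])
    return p1, p2

def acceleration(pos, t):
    """Gravitational pull of both heavy bodies on a test mass at position pos."""
    p1, p2 = primaries(t)
    r1, r2 = pos - p1, pos - p2
    return -G * m1 * r1 / np.linalg.norm(r1)**3 - G * m2 * r2 / np.linalg.norm(r2)**3

def integrate(pos, vel, t_end, dt=1e-3):
    """Leapfrog (kick-drift-kick) integration of one test mass; returns its trajectory."""
    pos, vel = pos.copy(), vel.copy()
    trajectory = [pos.copy()]
    t, acc = 0.0, acceleration(pos, 0.0)
    while t < t_end:
        vel += 0.5 * dt * acc
        pos += dt * vel
        t += dt
        acc = acceleration(pos, t)
        vel += 0.5 * dt * acc
        trajectory.append(pos.copy())
    return np.array(trajectory)

# Two test masses that start only 1e-6 apart:
traj_a = integrate(np.array([0.5, 0.0]),  np.array([0.0, 1.2]), t_end=20.0)
traj_b = integrate(np.array([0.5, 1e-6]), np.array([0.0, 1.2]), t_end=20.0)
separation = np.linalg.norm(traj_a - traj_b, axis=1)
print("separation at start, middle and end:", separation[0], separation[len(separation) // 2], separation[-1])

Plotting the logarithm of this separation against time gives exactly the kind of divergence graph described above.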
These divergence times may vary, since some orbits are more sensitive to chaos than others. Below I have made a second simulation, with different starting conditions, where you can see that divergence can in fact happen quite a bit more slowly. Certain types of chaotic systems have the property that they diverge at a rate that is, on average, constant. These systems have a so-called Lyapunov exponent λ, which quantifies this rate. It is essentially equal to the steepness of the line on the logarithmic divergence plot. When a system diverges more slowly than exponentially, its Lyapunov exponent is zero (and if its solutions converge, it is negative). This gives an alternative definition: a chaotic system must have a positive Lyapunov exponent.
Here the Lyapunov exponent λ is approximately 10/(100 s) = 0.1 s⁻¹, i.e. roughly 10% divergence per second!
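In practice, the exponent is read off by fitting a straight line to the logarithm of the separation as a function of time. The sketch below does this on synthetic data (exponential growth with some noise and an assumed 'true' exponent of 0.4 s⁻¹), purely to illustrate the procedure:

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 25.0, 250)                    # time in seconds
true_lambda = 0.4                                  # the exponent we pretend the system has
separation = 1e-6 * np.exp(true_lambda * t) * np.exp(0.2 * rng.standard_normal(t.size))

# The slope of log(separation) versus time estimates the Lyapunov exponent.
slope, intercept = np.polyfit(t, np.log(separation), 1)
print(f"estimated Lyapunov exponent: {slope:.2f} per second")

The same fit applied to the separation curve from the three-body sketch above would give an estimate of its Lyapunov exponent.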
Chaos as a buzzword
The term 'chaos' gets thrown around a great deal. It is important to stress that a system being messy or complex does not necessarily mean it meets the definition of chaos. Chaos is about very tiny differences in starting situations being amplified into totally different behaviours in the long run, and this amplification must be exponentially fast. Below you can see the equation giving the average exponential growth of the difference between a system's solutions. If λ is negative, the solutions will converge, meaning that they settle onto the same behaviour; if λ is positive, the solutions will diverge, regardless of how close together they start.

|δ(t)| ≈ |δ(0)| e^(λt): this equation describes the approximate exponential growth of the separation between two initially close solutions, where λ is the Lyapunov exponent.
