The Direction of Time
What determines the direction of time? At first, this seems to be a non-question: of course time flows in the direction it flows. But physicists deal with many idealized systems that run equally well in both directions. The physics of the past looks just like the physics of the future. This is true of colliding billiard balls and very simple planetary systems. If the evolution of a physical system is symmetric in time, what makes time flow one way and not the other?
The common explanation is that when we look at complex (real) systems, the Second Law of thermodynamics, which says that entropy (roughly, disorder) increases with the flow of time, dictates the direction of time. Our habit of seeing this everywhere explains the fascination with a movie of spilled milk being played backward. It is surreal to see the milk flow back into the glass as the glass reassembles from its shattered pieces.
This explanation of time’s direction is usually presented as a logical rather than a causal explanation. Good for all you physics students who listened to your professor’s explanation of time’s direction and the Second Law and said, “huh, what?”
In his book The End of Certainty, Ilya Prigogine argues that the long-standing explanation of the direction of time, based on the Second Law of thermodynamics, needs an update. To illustrate the starting point, he builds a simple model using a deterministic (i.e., it does the same thing given the same starting point) chaotic system.
The argument behind the simulation explained below goes like this. There are structures that emerge and evolve in complex dynamical systems that (a) don’t involve any interactions between the particles and (b) evolve asymmetrically in time. Thus, the Second Law is not adequate to explain the direction of time.
The simulation below uses a very simple chaotic system, the Bernoulli Map. The equation of motion for each of an ensemble of particles is x(n+1) = 2x(n) mod 1. The new position, x(n+1), is calculated from the old position, x(n), by multiplying by two, then removing the integer part so the new position is between 0 and 1.
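The map above is simple enough to implement in a few lines. Here is a minimal sketch (the function and variable names are illustrative, not taken from the author's downloadable code):

```python
# Minimal sketch of the Bernoulli Map: x(n+1) = 2*x(n) mod 1.

def bernoulli_map(x):
    """One step of the Bernoulli (doubling) map on [0, 1)."""
    return (2.0 * x) % 1.0

def trajectory(x0, n_steps):
    """Iterate the map n_steps times, returning the full trajectory."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(bernoulli_map(xs[-1]))
    return xs

# Example: a short trajectory starting at x = 0.3
# (approximately 0.3 -> 0.6 -> 0.2 -> 0.4 -> 0.8, up to floating-point rounding)
print(trajectory(0.3, 4))
```

Doubling a number and dropping the integer part is equivalent to shifting its binary expansion left by one digit, which is why the map is so sensitive to its starting point.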
Chaotic systems are characterized by exponential divergence of trajectories that start out close together. Contrast this with a pendulum. When we have two pendulums of the same length and mass and we start one off at a slightly different point from the other, they just swing back and forth slightly out of synchronization. The gap doesn’t grow over time.
In the Bernoulli Map, trajectories that start out arbitrarily close diverge exponentially. The rate of divergence is measured by the Lyapunov exponent. The plot below shows the distance between two trajectories that start off very close and evolve according to the Bernoulli Map. The plot is on a semi-log scale; the slope of the line is the Lyapunov exponent.
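The divergence measurement behind that plot can be sketched as follows. For the Bernoulli Map the separation roughly doubles each step, so the Lyapunov exponent should come out near ln 2 ≈ 0.693 (the initial conditions and step count here are illustrative choices, not the author's):

```python
# Sketch: measure the exponential divergence of two nearby Bernoulli Map
# trajectories. On a semi-log plot the separation grows along a straight
# line; its slope estimates the Lyapunov exponent (ln 2 for this map).

import math

def bernoulli_map(x):
    return (2.0 * x) % 1.0

def separation_history(x0, delta, n_steps):
    """Distances between trajectories started at x0 and x0 + delta."""
    x, y = x0, x0 + delta
    dists = [abs(y - x)]
    for _ in range(n_steps):
        x, y = bernoulli_map(x), bernoulli_map(y)
        dists.append(abs(y - x))
    return dists

n_steps = 25
d = separation_history(x0=0.1, delta=1e-12, n_steps=n_steps)

# Slope of log(distance) vs. step number ~= Lyapunov exponent
lyapunov = (math.log(d[-1]) - math.log(d[0])) / n_steps
print(f"estimated Lyapunov exponent: {lyapunov:.4f} (ln 2 = {math.log(2):.4f})")
```

The initial separation must be tiny and the run short, because once the gap grows to order one the mod-1 wraparound caps further divergence and the straight line on the semi-log plot levels off.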
Now we start many, many particles close together and watch the distribution of the particles along the x axis evolve. Notice that as time evolves (in the right direction!) the particles become uniformly distributed. The system does not retrace its steps when run backward; the distribution just continues to flatten toward uniformity. This is in spite of the fact that the particles do not interact or exchange energy (as they would in a gas coming to equilibrium, for example).
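The ensemble experiment can be sketched like this: crowd the particles into a narrow interval, iterate the map, and histogram the positions. The particle count, interval, and bin count below are illustrative assumptions, not the author's parameters:

```python
# Sketch of the ensemble experiment: start many non-interacting particles
# bunched in a narrow interval and watch their distribution spread toward
# uniform under the Bernoulli Map.

import random

def bernoulli_map(x):
    return (2.0 * x) % 1.0

random.seed(0)
n_particles, n_steps, n_bins = 100_000, 15, 10

# All particles start crowded into [0.30, 0.31)
particles = [random.uniform(0.30, 0.31) for _ in range(n_particles)]

for _ in range(n_steps):
    particles = [bernoulli_map(x) for x in particles]

# Histogram: after enough steps every bin holds roughly the same count
counts = [0] * n_bins
for x in particles:
    counts[min(int(x * n_bins), n_bins - 1)] += 1

expected = n_particles / n_bins
spread = max(abs(c - expected) / expected for c in counts)
print(f"bin counts: {counts}")
print(f"max deviation from uniform: {spread:.1%}")
```

Each iteration stretches the initial interval by a factor of two and folds it back into [0, 1), so after 15 steps the original width of 0.01 has been wrapped around the interval hundreds of times, which is why the histogram flattens. Note that no pairwise interaction between particles was ever needed.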
The Python code creating this example is available for download: Bernoulli Map.