Introduction to turbulence/Stationarity and homogeneity
Processes statistically stationary in time
Many random processes have the characteristic that their statistical properties do not appear to depend directly on time, even though the random variables themselves are time-dependent. For example, consider the signals shown in Figures 2.2 and 2.5.
When the statistical properties of a random process are independent of time, the random process is said to be stationary. For such a process all the moments are time-independent, e.g., $\langle u^n(t) \rangle = \langle u^n \rangle$, etc. In fact, the probability density itself is time-independent, as should be obvious from the fact that the moments are time independent.
An alternative way of looking at stationarity is to note that the statistics of the process are independent of the origin in time. It is obvious from the above, for example, that if the statistics of a process are time independent, then $\langle u^n(t) \rangle = \langle u^n(t+T) \rangle$, etc., where $T$ is some arbitrary translation of the origin in time. Less obvious, but equally true, is that the product $\langle u(t)\, u(t') \rangle$ depends only on the time difference $t' - t$ and not on $t$ (or $t'$) directly. This consequence of stationarity can be extended to any product moment. For example, $\langle u(t)\, v(t') \rangle$ can depend only on the time difference $t' - t$. And $\langle u(t)\, v(t')\, w(t'') \rangle$ can depend only on the two time differences $t' - t$ and $t'' - t$ (or $t'' - t'$) and not on $t$, $t'$, or $t''$ directly.
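Written out, the argument for the two-time product is just the following (using the same shift $T$ as above): stationarity allows the entire time axis to be translated by any $T$, so

$\langle u(t)\, u(t') \rangle = \langle u(t+T)\, u(t'+T) \rangle$

and choosing in particular $T = -t$ gives $\langle u(t)\, u(t') \rangle = \langle u(0)\, u(t'-t) \rangle$, which can depend only on the separation $t' - t$.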
The autocorrelation
One of the most useful statistical moments in the study of stationary random processes (and turbulence, in particular) is the autocorrelation, defined as the average of the product of the random variable evaluated at two times, i.e., $\langle u(t)\, u(t') \rangle$. Since the process is assumed stationary, this product can depend only on the time difference $\tau = t' - t$. Therefore the autocorrelation can be written as:
$C(\tau) \equiv \langle u(t)\, u(t + \tau) \rangle \qquad (1)$
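As a rough numerical illustration (not part of the original text), the estimator sketched below replaces the ensemble average in equation (1) by a time average over a single long sampled record, which is legitimate only for a stationary (and ergodic) process. The test signal and the helper name autocorrelation are simply assumptions made for this sketch.

import numpy as np

def autocorrelation(u, max_lag):
    """Estimate C(tau) = <u(t) u(t + tau)> from one sampled record by
    averaging over time, which stands in for the ensemble average when
    the process is stationary (and ergodic)."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()                      # use the fluctuation about the mean
    n = len(u)
    lags = np.arange(max_lag + 1)
    C = np.array([np.mean(u[:n - k] * u[k:]) for k in lags])
    return lags, C

# A first-order autoregressive signal: a simple stationary process with
# finite memory, so its autocorrelation decays toward zero at large lag.
rng = np.random.default_rng(0)
n = 100_000
u = np.zeros(n)
for i in range(1, n):
    u[i] = 0.95 * u[i - 1] + rng.standard_normal()

lags, C = autocorrelation(u, max_lag=200)
print(C[0], C[50], C[200])   # C(0) is the variance; the later values are much smaller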
The importance of the autocorrelation lies in the fact that it indicates the "memory" of the process; that is, the time over which the process is correlated with itself. Contrast two cases. The autocorrelation of a deterministic sine wave is simply a cosine, as can easily be proven; there is no time beyond which it can be guaranteed to be arbitrarily small, since the sine wave always "remembers" when it began and thus always remains correlated with itself. By contrast, a stationary random process like the one illustrated in the figure will eventually lose all correlation and go to zero. In other words, it has a "finite memory" and "forgets" how it was. Note that one must be careful to make sure that a correlation really both goes to zero and stays down before drawing conclusions, since even the sine wave's correlation passes through zero at some points. Stationary random processes always have two-time correlation functions which eventually go to zero and stay there.
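For the sine wave, the cosine result can be verified directly (the amplitude $A$, frequency $\omega$, and phase $\phi$ are labels introduced here only for the calculation): with $u(t) = A\sin(\omega t + \phi)$ and the average taken over a period,

$C(\tau) = \langle A\sin(\omega t + \phi)\, A\sin(\omega t + \omega\tau + \phi) \rangle = \frac{A^2}{2}\cos(\omega \tau)$

which oscillates forever and never decays, no matter how large $\tau$ becomes.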
Example 1.
Consider the motion of an automobile responding to the movement of the wheels over a rough surface. In the usual case where the road roughness is randomly distributed, the motion of the car will be a weighted history of the road's roughness, with the most recent bumps having the most influence and distant bumps eventually forgotten. On the other hand, if the car is travelling down a railroad track, the periodic crossing of the railroad ties represents a deterministic input, and the motion will remain correlated with itself indefinitely, a very bad thing if the tie-crossing rate corresponds to a natural resonance of the suspension system of the vehicle.
Since a random process can never be more than perfectly correlated, it can never achieve a correlation greater than its value at the origin. Thus
$C(\tau) \le C(0) \qquad (2)$
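One standard way to see this (not spelled out in the text above) is to expand the non-negative quantity $\langle [u(t) - u(t+\tau)]^2 \rangle$ and use stationarity:

$0 \le \langle [u(t) - u(t+\tau)]^2 \rangle = \langle u^2(t) \rangle - 2\langle u(t)\, u(t+\tau) \rangle + \langle u^2(t+\tau) \rangle = 2C(0) - 2C(\tau)$

so $C(\tau) \le C(0)$ follows immediately. Repeating the argument with a plus sign gives $C(\tau) \ge -C(0)$ as well.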
An important consequence of stationarity is that the autocorrelation is symmetric in the time difference $\tau = t' - t$. To see this, simply shift the origin in time backwards by an amount $\tau$ and note that independence of origin implies:
$\langle u(t)\, u(t+\tau) \rangle = \langle u(t-\tau)\, u(t) \rangle = \langle u(t)\, u(t-\tau) \rangle \qquad (3)$
Since the right hand side is simply $C(-\tau)$, it follows immediately that: