Key focus: Learn how to generate colored noise using an auto-regressive (AR) model. Apply Yule-Walker equations for generating power law noises: pink noise and Brownian noise.
Auto-Regressive (AR) model
An uncorrelated Gaussian random sequence w[n] can be transformed into a correlated Gaussian random sequence x[n] using an AR time-series model. If a time-series random sequence x[n] is assumed to follow an auto-regressive model of the form

x[n] = -\sum_{k=1}^{N} a_k x[n-k] + w[n] \quad\quad (1)
where w[n] is the uncorrelated Gaussian sequence of zero mean and variance σ², the natural tendency is to estimate the model parameters a_1, a_2, ..., a_N. The least squares method can be applied here to find the model parameters, but the computations become cumbersome as the model order N increases. Fortunately, the AR model coefficients can be solved for using Yule-Walker equations.
Yule-Walker equations relate the auto-regressive model parameters a_k to the auto-correlation R_xx[m] of the random process x[n]. Finding the model parameters using Yule-Walker equations is a two-step process:
1. Given x[n], estimate the auto-correlation R_xx[m] of the process. If R_xx[m] is already specified as a function, utilize it as it is (see the auto-correlation equations for Jakes spectrum or Doppler spectrum in section 11.3.2 in the book). A short sketch of this estimation step is given after this list.
2. Solve the Yule-Walker equations to find the model parameters a_k and the noise variance σ².
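As an illustration of step 1, when only a realization x[n] is available, its auto-correlation can be estimated with the xcorr function in Matlab. This is only a minimal sketch with assumed variable names and an assumed test sequence, not a listing from the book:

N = 5;                                      % assumed AR model order
x = filter(1, [1 -0.7], randn(1, 10000));   % a given random sequence x[n] (an AR(1) process, for illustration)
[r, lags] = xcorr(x, N, 'biased');          % biased auto-correlation estimate over lags -N..N
Rxx = r(lags >= 0);                         % R_xx[m] for m = 0..N, to be used in the Yule-Walker equations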
Yule-Walker equations
Yule-Walker equations can be compactly written as

R_{xx}[m] + \sum_{k=1}^{N} a_k R_{xx}[m-k] = \sigma^2 \delta[m], \quad m \geq 0 \quad\quad (2)
Written in matrix form for lags m = 1, 2, \cdots, N, the Yule-Walker equations comprise a set of N linear equations in N unknown parameters:

\begin{bmatrix} R_{xx}[0] & R_{xx}[-1] & \cdots & R_{xx}[1-N] \\ R_{xx}[1] & R_{xx}[0] & \cdots & R_{xx}[2-N] \\ \vdots & \vdots & \ddots & \vdots \\ R_{xx}[N-1] & R_{xx}[N-2] & \cdots & R_{xx}[0] \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_N \end{bmatrix} = - \begin{bmatrix} R_{xx}[1] \\ R_{xx}[2] \\ \vdots \\ R_{xx}[N] \end{bmatrix} \quad\quad (3)
Representing equation (3) in a compact form,

\mathbf{R} \mathbf{a} = -\mathbf{r} \quad\quad (4)
The AR model parameters can be found by solving

\mathbf{a} = -\mathbf{R}^{-1} \mathbf{r} \quad\quad (5)
After solving for the model parameters a_k, the noise variance σ² can be found by applying the estimated values of a_k in equation (2) with m = 0, which gives

\sigma^2 = R_{xx}[0] + \sum_{k=1}^{N} a_k R_{xx}[k]

The aryule command (in Matlab and in Python's spectrum package) efficiently solves the Yule-Walker equations using the Levinson-Durbin algorithm [1][2]. Once the model parameters a_k are obtained, the AR model can be implemented as an infinite impulse response (IIR) filter of the form

H(z) = \frac{1}{1 + \sum_{k=1}^{N} a_k z^{-k}} \quad\quad (6)
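To make the two steps concrete, here is a minimal sketch (not the book's listing) that solves the Yule-Walker equations for an assumed auto-correlation R_xx[m] = 0.7^|m| using the levinson function from Matlab's Signal Processing Toolbox and then generates the correlated Gaussian sequence by driving the IIR filter of equation (6) with white Gaussian noise; the model order, variable names and the choice of auto-correlation are assumptions for illustration:

N = 5;                                % assumed AR model order
Rxx = 0.7.^(0:N);                     % assumed target auto-correlation R_xx[m] = 0.7^|m|, lags 0..N

[A, sigma2] = levinson(Rxx, N);       % solve Yule-Walker: A = [1 a_1 ... a_N], sigma2 = noise variance

L = 10000;
w = sqrt(sigma2)*randn(1, L);         % white Gaussian noise w[n] of variance sigma^2
x = filter(1, A, w);                  % correlated Gaussian sequence from the IIR filter 1/A(z)

[r, lags] = xcorr(x, N, 'biased');    % verify: estimated auto-correlation of the generated sequence
rhat = r(lags >= 0);
disp([Rxx(:), rhat(:)]);              % the two columns should match closely

When a data record is available instead of a target auto-correlation, the same A and sigma2 can be obtained directly with [A, sigma2] = aryule(x, N).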
Example: power law noise generation
The power law 1/f^α in the power spectrum characterizes the fluctuating observables in many natural systems. Many natural systems exhibit 1/f^α noise, a stochastic process with a power spectral density having a power exponent α that can take a range of values (commonly between -2 and 2, as listed below). Simply put, 1/f^α noise is a colored noise with a power spectral density of 1/f^α over its entire frequency range.
The 1/f^α noise can be classified into different types based on the value of α.
Violet noise – α = -2, the power spectral density is proportional to f².
Blue noise – α = -1, the power spectral density is proportional to f.
White noise – α = 0, the power spectral density is flat across the whole spectrum.
Pink noise – α = 1, the power spectral density is proportional to 1/f, i.e., it decreases by 3 dB per octave with increase in frequency.
Brownian noise – α = 2, the power spectral density is proportional to 1/f², therefore it decreases by 6 dB per octave with increase in frequency.
The 1/f^α power law noise can be generated by passing a zero-mean white noise sequence through an auto-regressive (AR) filter of order N:

\sum_{k=0}^{N} a_k x[n-k] = w[n] \quad\quad (7)
where w[n] is a zero-mean white noise process. Following the AR generation method described in [3], the coefficients of the AR filter can be generated recursively as

a_0 = 1, \quad a_k = \left(k - 1 - \frac{\alpha}{2}\right)\frac{a_{k-1}}{k}, \quad k = 1, 2, \cdots, N \quad\quad (8)
which can be implemented as an infinite impulse response (IIR) filter using the filter transfer function described in equation (6).
The script implementing this method, along with sample result plots, is given in the book. Refer to the book for the Matlab code.
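Since the book's listing is not reproduced here, the following minimal sketch (assumed variable names and an assumed filter order) illustrates the same idea: generate the AR coefficients with the recursion of equation (8) and filter zero-mean white Gaussian noise through the IIR filter of equation (6), then inspect the PSD:

alpha = 1;                           % power law exponent: 1 -> pink noise, 2 -> Brownian noise
N = 64;                              % assumed AR filter order (larger N fits the low-frequency region better)
L = 100000;                          % number of output samples

a = zeros(1, N+1);                   % AR coefficients a_0 ... a_N from equation (8)
a(1) = 1;                            % a_0 = 1
for k = 1:N
    a(k+1) = (k - 1 - alpha/2)*a(k)/k;
end

w = randn(1, L);                     % zero-mean white Gaussian noise w[n]
x = filter(1, a, w);                 % 1/f^alpha noise from the IIR filter 1/A(z)

[Pxx, F] = pwelch(x, [], [], [], 1); % PSD estimate (unit sampling rate assumed)
loglog(F(2:end), Pxx(2:end)); grid on;  % PSD should fall off roughly as 1/f^alpha
xlabel('Normalized frequency'); ylabel('PSD');

Setting alpha = 2 in the same sketch turns the filter into an integrator (a_1 = -1, higher coefficients zero), producing a random walk, which is consistent with the Brownian noise case listed above.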
References
[1] Gene H. Golub, Charles F. Van Loan, Matrix Computations, ISBN-9780801854149, Johns Hopkins University Press, 1996, p. 143.
[2] J. Durbin, The fitting of time series models, Review of the International Statistical Institute, 28:233-243, 1960.
[3] Kasdin, N.J., Discrete Simulation of Colored Noise and Stochastic Processes and 1/f^α Power Law Noise Generation, Proceedings of the IEEE, Vol. 83, No. 5, 1995, pp. 802-827.
Interesting to see the link between autocorrelation and the exponent of the PSD.
I implemented the code in the book on page 99.
How would I go about characterizing alpha from the PSD?
If I fit a line in log-log space to the PSD, this line is severely biased by the noisy lower end of the PSD.
Example code:
% F and Pyy assumed to come from a PSD estimate, e.g. [Pyy,F] = pwelch(x,[],[],[],fs)
log_f = 10*log10(F(2:end));       % skip F(1) = 0 to avoid log of zero
log_p = 10*log10(Pyy(2:end));
Const = polyfit(log_f, log_p, 1); % least-squares line fit in log-log space
slope = Const(1);                 % slope ~ -alpha for a 1/f^alpha PSD
offset = Const(2);
Yp = polyval(Const, log_f);       % fitted line, for plotting against the PSD
Do you have any example of a robust estimate of the slope and offset following the example noise generation?
This method can only be applied to noises that have certain statistical properties. The relationship between alpha, the PSD and the AR model cannot be obtained by a simple line-fit model; it has to be estimated from empirical data.
The noise should exhibit fractal phenomena (long memory and self-similarity) for this method to work. The following references may help you further.
[1] Jan Beran, Statistics for Long-Memory Processes, ISBN-13: 978-0412049019, Chapman and Hall/CRC, 1st edition, October 1994. URL: https://amzn.to/2w9I0NO
[2] Stadnitski, Tatjana, Measuring fractality, Frontiers in Physiology, vol. 3, 127, May 2012, doi:10.3389/fphys.2012.00127. URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3345945/