Weird behavior in high frequency markets

Using a variety of identification methods and samples, I find that in most cases private spending falls significantly in response to an increase in government spending. These results imply that the average GDP multiplier lies below unity.

Alan V. Oppenheim’s lectures on Digital Signal Processing (1975)

MIT has just uploaded Alan V. Oppenheim’s DSP lectures on YouTube.

See more here (and enjoy!)

Early time series

The graph (part of a commentary by Macrobius on Cicero’s Somnium Scipionis), which dates from the tenth, possibly eleventh, century, was meant to represent a plot of the inclinations of the planetary orbits as a function of time. The zone of the zodiac is given on a plane, with the horizontal (time) axis divided into thirty parts and the ordinate representing the width of the zodiacal belt.

————————

Source: “Time-Series”, Sir Maurice Kendall

Trend extraction and Detrending

Filters may be applied to a time series for a variety of reasons. Suppose that a time series consists of a long-term movement, a trend, upon which is superimposed an irregular component. A moving average filter will smooth the series revealing the trend more clearly.

Assume that $m_{t}$ is the filtered version of the $y_{t}$ series. Then

$m_{t} = M_{n}(L)y_{t} =\sum_{j=-r}^{r} w_{j}y_{t-j}$

The weights of a moving average filter add up to one, i.e. $M_{n}(1)=1$. The simplest such filter is the uniform moving average, for which:

$w_{j} = \frac{1}{n}, \; \; \; \; j=-r,\dots,r, \qquad n=2r+1$

The gain of such a filter is

$M_{n}(e^{-i \lambda}) = \left| \sum_{j=-r}^{r} \frac{1}{n} e^{-ij\lambda} \right| = \left| \frac{1}{n} \left( 1+2 \sum_{j=1}^{r} \cos j\lambda \right) \right| = \left| \frac{\sin (n \lambda /2)}{n \sin (\lambda /2)} \right|$

Gain of uniform moving average filter

The uniform moving average filter (applied on artificial data)

The moving average filter removes a cycle of period n together with its harmonics, since its gain vanishes at the frequencies $\lambda = 2\pi k/n$, $k=1,\dots,n-1$.
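As a quick numerical check of the gain formula above, here is a short Python sketch (the function name is mine) that evaluates $\left| \sin(n\lambda/2)/(n\sin(\lambda/2)) \right|$ and confirms that the gain vanishes at a cycle of period n and at its harmonics:

```python
import numpy as np

def uniform_ma_gain(lam, n):
    """Gain |sin(n*lam/2) / (n*sin(lam/2))| of an n-term uniform moving average."""
    lam = np.asarray(lam, dtype=float)
    with np.errstate(invalid="ignore", divide="ignore"):
        g = np.abs(np.sin(n * lam / 2.0) / (n * np.sin(lam / 2.0)))
    # At lam = 0 (and multiples of 2*pi) the limiting value of the gain is 1.
    return np.where(np.isclose(lam % (2.0 * np.pi), 0.0), 1.0, g)

n = 5  # window length, n = 2r + 1 with r = 2
# A cycle of period n lives at frequency 2*pi/n; it and its harmonics
# 2*pi*k/n are removed completely by the filter (gain ~ 0):
for k in range(1, n):
    print(uniform_ma_gain(2.0 * np.pi * k / n, n))  # each ~0
print(uniform_ma_gain(0.0, n))  # the mean (zero frequency) passes through with gain 1
```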

Nonlinearities and Thresholds

Jökulsá á Fjöllum nonlinearities

The infamous nonlinearity first observed by Tong et al. (1985). This nonlinearity is the effect of the melting of glaciers in the catchment area of the Jökulsá á Fjöllum river on the latter’s flow.

Why is the Hodrick-Prescott filter often inappropriate?

The Hodrick-Prescott (HP) filter is the optimal estimator of the trend component in a smooth trend model with the signal-to-noise ratio parameter fixed at 1/1600. For large samples, and for t not near the beginning or end of the series, it gives the detrended observations $X_{t}$ as

$\displaystyle X_{t}= \left[ \frac{(1-L)^{2}(1-L^{-1})^{2}}{\bar{q}_{\zeta}+ (1-L)^{2}(1-L^{-1})^{2}}\right] Y_{t}$

where $\bar{q}_{\zeta}= \sigma_{\zeta}^{2} / \sigma_{\epsilon}^{2}$.

Bear in mind that if the smooth trend model were believed to be the true model, there would be no reason to apply the HP filter: the filtered data of a smooth trend model contain nothing more than white noise. The belief here is clearly different.

We can easily show that the gain of the detrending filter is given by:

$G(\lambda) = \frac{4(1-\cos \lambda)^{2}}{\bar{q}_{\zeta}+4(1-\cos \lambda)^{2}} = \frac{16\sin^{4}( \lambda /2)}{\bar{q}_{\zeta}+16\sin^{4}( \lambda /2)}$

Note that the smaller $\bar{q}_{\zeta}$ is, the more the filter concentrates on removing low frequencies.

Gain for HP filter
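The gain formula is easy to evaluate numerically. The sketch below (the function name is mine) uses $\bar{q}_{\zeta}=1/1600$, the value fixed in the post, and shows that the detrending filter removes the zero frequency entirely, passes the highest frequencies almost unchanged, and concentrates its removal on ever lower frequencies as $\bar{q}_{\zeta}$ shrinks:

```python
import numpy as np

def hp_detrend_gain(lam, q=1.0 / 1600.0):
    """Gain G(lam) = 16 sin^4(lam/2) / (q + 16 sin^4(lam/2)) of the HP detrending filter."""
    s4 = 16.0 * np.sin(np.asarray(lam, dtype=float) / 2.0) ** 4
    return s4 / (q + s4)

print(hp_detrend_gain(0.0))    # 0: the zero frequency (pure trend) is removed entirely
print(hp_detrend_gain(np.pi))  # ~1: the highest frequency passes almost unchanged
# Smaller q pushes the gain toward 1 at any fixed frequency, so the filter
# concentrates its removal on an ever narrower band of low frequencies:
print(hp_detrend_gain(0.1, q=1e-6) > hp_detrend_gain(0.1, q=1e-2))  # True
```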

Some notes on Linear Filters

Let $\{ X_t \}$ and  $\{ Y_t \}$ be two stationary time series related by:

$X_{t} = M_{n}(L)Y_{t} =\sum_{j=-\infty}^{\infty} g_{j}Y_{t-j}$

where

$\sum_{j=-\infty}^{\infty}|g_{j}| < \infty$ and $\sum_{j=-\infty}^{\infty}|g_{j}|^2 < \infty$

$\{ X_t \}$ is the filtered version of $\{ Y_t \}$ and $M_{n}(L)$ is the filter. The effect of a linear filter is to change the importance of the various cyclical components of the series and/or to induce a shift in their position in time.
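To illustrate the amplitude/phase point, the sketch below (names are mine) evaluates the transfer function $\sum_j g_j e^{-ij\lambda}$ of a filter. A symmetric filter ($g_j = g_{-j}$) has a real transfer function, so it rescales each cyclical component without shifting it in time, while a one-sided filter induces a phase shift:

```python
import numpy as np

def transfer_function(weights, lags, lam):
    """Evaluate sum_j g_j * exp(-i j lam) for filter weights g_j at the given lags."""
    lags = np.asarray(lags, dtype=float)
    w = np.asarray(weights, dtype=float)
    lam = np.atleast_1d(np.asarray(lam, dtype=float))
    return np.sum(w[None, :] * np.exp(-1j * lam[:, None] * lags[None, :]), axis=1)

# Symmetric filter: real transfer function, no phase (time) shift.
sym = transfer_function([0.25, 0.5, 0.25], [-1, 0, 1], 0.5)[0]
print(abs(sym.imag))   # ~0
# One-sided filter: complex transfer function; its argument measures
# the time shift the filter induces.
asym = transfer_function([0.5, 0.5], [0, 1], 0.5)[0]
print(abs(asym.imag))  # nonzero
```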

A “Turing test” of human perception

Hasanhodzic, Lo and Viola constructed a “Turing Test” to assess the ability of human subjects to distinguish between actual and randomly generated returns of financial securities. The researchers found significant statistical evidence that humans can consistently distinguish between actual and artificial series, and hence that the belief that financial markets “look random” (i.e. follow random walks) is probably erroneous.

The human eye does indeed have a significant advantage over most current computer algorithms on several classification and image recognition tasks, and that could probably explain the findings, at least in part. However, would the results have been the same had the researchers used more sophisticated simulations (ones that mimic the stylized facts of financial markets)?

p.s. A link to the test can be found here

“High frequency traders do ‘risk’ better”

Some recent findings by Dobrislav P. Dobrev and Pawel J. Szerszen.