GiveMeLeverage
& I will remove the world
Taleb on convolution:
a) Time aggregation
Take the example of a distribution for daily returns that
has a finite second moment, but infinite kurtosis, say a
power-law with exponent <4, of the kind we observe
routinely in the markets. It will eventually, under time
aggregation, say if we lengthen the period to weekly,
monthly, or yearly returns, converge to a Gaussian. But
this will only happen at infinity. The distribution will
become increasingly Gaussian in the center, but not in
the tails. Bouchaud and Potters (2002) show how such
convergence will be extremely slow, at the rate of
sqrt(n log n) standard deviations, where n is the number
of observations. A distribution with a power law
exponent α >2, even with a million convolutions, will
eventually behave like a Gaussian up until about 3
standard deviations but conserve the power-law
attributes outside of such regime. So, at best we are
getting a mixed distribution, with the same fat tails as a
non-Gaussian, and the tails are where the problems
reside.
More generally, the time-aggregation of probability
distributions with some infinite moment will not obey
the Central Limit Theorem in applicable time, thus
leaving us with non-asymptotic properties to deal with
in an effective manner. Indeed, it may not even be a
matter of the time window being too short: for
distributions with finite second moment, but with an
infinite higher moment, for CLT to apply we need an
infinity of convolutions.
http://fooledbyrandomness.com/complexityAugust-06.pdf
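The slow convergence described above can be checked numerically. The sketch below is my own illustration, not code from the quoted paper: it takes Student-t daily returns with 3 degrees of freedom (a power law with tail exponent alpha = 3, so finite variance but infinite kurtosis), aggregates them into "monthly" sums, and compares the empirical tail probabilities of the standardized aggregate with the Gaussian ones. The sample sizes and the choice of 20 days per month are illustrative assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Daily "returns": Student-t with 3 degrees of freedom, i.e. a power
# law with tail exponent alpha = 3 (< 4): finite variance, infinite
# kurtosis -- the case discussed in the quoted passage.
df, n_days, n_samples = 3, 20, 500_000

# Aggregate 20 daily draws into "monthly" returns, then standardize.
monthly = rng.standard_t(df, size=(n_samples, n_days)).sum(axis=1)
z = (monthly - monthly.mean()) / monthly.std()

def gauss_two_tail(k):
    """P(|N(0,1)| > k) for a standard normal."""
    return math.erfc(k / math.sqrt(2))

# Near the center the aggregate looks Gaussian; out at 4 standard
# deviations the empirical tail stays an order of magnitude fatter,
# because the sum keeps the power-law tail exponent of the summands.
for k in (1, 2, 4):
    emp = float(np.mean(np.abs(z) > k))
    print(f"k={k}: empirical {emp:.2e}  vs  Gaussian {gauss_two_tail(k):.2e}")
```

The point of the experiment is exactly the "mixed distribution" in the quote: after convolution the center matches the Gaussian closely, while the tail beyond a few standard deviations does not, and no finite amount of further aggregation removes the power-law tail.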