{"id":953,"date":"2021-11-24T11:58:51","date_gmt":"2021-11-24T11:58:51","guid":{"rendered":"https:\/\/fooledbyrandomnessdotcom.wordpress.com\/?p=953"},"modified":"2022-01-26T14:08:46","modified_gmt":"2022-01-26T14:08:46","slug":"detecting-bs-in-correlation-windows","status":"publish","type":"post","link":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/2021\/11\/24\/detecting-bs-in-correlation-windows\/","title":{"rendered":"Detecting BS in Correlation Windows"},"content":{"rendered":"\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><a href=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1.png\"><img loading=\"lazy\" decoding=\"async\" width=\"1046\" height=\"765\" class=\"wp-image-843\" src=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1.png?w=1024\" alt=\"\" srcset=\"https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1.png 1046w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1-300x219.png 300w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1-1024x749.png 1024w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/picture1-768x562.png 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><\/a>\r\n<figcaption>Figuring out the <strong>sampling error <\/strong>of rolling correlation.<\/figcaption>\r\n<\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p class=\"has-drop-cap\">Financial theory requires correlation to be constant (or, at least, <strong>known and nonrandom<\/strong>). Nonrandom means predictable with waning sampling error over the period concerned. 
Ellipticality is a condition more necessary than thin tails; recall my Twitter fight with that non-probabilist Clifford Asness, where I questioned not just his empirical claims and his real-life record, but his own theoretical rigor and the use by that idiot Antti Ilmanen of cartoon models to prove a point about tail hedging. Their entire business reposes on that ghost model of <strong>correlation-diversification<\/strong> from modern portfolio theory. The fight was interesting sociologically, but not technically. What is interesting technically is the thingy below.<\/p>\r\n\r\n\r\n\r\n<p>How do we extract the sampling error of a rolling correlation? My coauthor and I could not find it in the literature, so we derived the test statistic. <strong>The result<\/strong>: the observed variation in rolling correlations has less than \\(10^{-17}\\) odds of being mere sampling error.<\/p>\r\n\r\n\r\n\r\n<p>The derivations are as follows:<\/p>\r\n\r\n\r\n\r\n<p>Let \\(X\\) and \\(Y\\) be sequences of \\(n\\) independent Gaussian variables centered to a mean of \\(0\\). 
Let \\(\\rho_n(.)\\) be the rolling correlation operator over windows of \\(n\\) observations:<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\">\\(\\rho_n(\\tau)= \\frac{X_\\tau Y_\\tau+X_{\\tau+1} Y_{\\tau+1}+\\ldots +X_{\\tau+n-1} Y_{\\tau+n-1}}{\\sqrt{(X_{\\tau}^2+X_{\\tau+1}^2+\\ldots +X_{\\tau+n-1}^2)(Y_\\tau^2+Y_{\\tau+1}^2+\\ldots +Y_{\\tau+n-1}^2)}}.\\)<\/p>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p>First, we consider the distribution of the Pearson correlation for \\(n\\) observations of pairs assuming \\(\\mathbb{E}(\\rho) \\approx 0\\) (the mean is of no relevance as we are focusing on the second moment):<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\"><br \/>\\(f_n(\\rho)=\\frac{\\left(1-\\rho^2\\right)^{\\frac{n-4}{2}}}{B\\left(\\frac{1}{2},\\frac{n-2}{2}\\right)},\\)<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-left\"><br \/>with characteristic function:<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\">\\(\\chi_n(\\omega)=2^{\\frac{n-1}{2}-1} \\omega ^{\\frac{3-n}{2}} \\Gamma \\left(\\frac{n}{2}-\\frac{1}{2}\\right) J_{\\frac{n-3}{2}}(\\omega ),\\)<\/p>\r\n\r\n\r\n\r\n<p>where \\(J_{(.)}(.)\\) is the Bessel J function.<\/p>\r\n\r\n\r\n\r\n<p>We can assert that, for \\(n\\) sufficiently large: \\(2^{\\frac{n-1}{2}-1} \\omega ^{\\frac{3-n}{2}} \\Gamma \\left(\\frac{n}{2}-\\frac{1}{2}\\right) J_{\\frac{n-3}{2}}(\\omega ) \\approx e^{-\\frac{\\omega ^2}{2 (n-1)}},\\) the corresponding characteristic function of the Gaussian.<\/p>\r\n\r\n\r\n\r\n<p>Moments of order \\(p\\) become:<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\"><br \/>\\(M(p)= \\frac{\\left( (-1)^p+1\\right) \\Gamma \\left(\\frac{n}{2}-1\\right) \\Gamma \\left(\\frac{p+1}{2}\\right)}{2 B\\left(\\frac{1}{2},\\frac{n-2}{2}\\right) \\Gamma \\left(\\frac{1}{2} (n+p-1)\\right)}\\)<\/p>\r\n\r\n\r\n\r\n<p>where \\(B(.,.)\\) is the Beta function. 
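As a quick numerical sanity check (a sketch of mine, not part of the derivation; the seed and sample sizes are arbitrary choices), simulating Pearson correlations of independent Gaussians recovers a standard deviation close to \(\sqrt{\frac{1}{n-1}}\):

```python
import numpy as np

# Monte Carlo check that the Pearson correlation of n independent Gaussian
# pairs has standard deviation close to 1/sqrt(n-1) under the null of zero
# correlation.
rng = np.random.default_rng(42)
n, trials = 36, 20_000

x = rng.standard_normal((trials, n))
y = rng.standard_normal((trials, n))
xc = x - x.mean(axis=1, keepdims=True)
yc = y - y.mean(axis=1, keepdims=True)
rho = (xc * yc).sum(axis=1) / np.sqrt(
    (xc**2).sum(axis=1) * (yc**2).sum(axis=1)
)

print(rho.std(), 1 / np.sqrt(n - 1))  # the two numbers should be close
```

With \(n=36\) both values come out near \(0.17\), already well inside the Gaussian regime.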
The standard deviation is \\(\\sigma_n=\\sqrt{\\frac{1}{n-1}}\\) and the kurtosis \\(\\kappa_n=3-\\frac{6}{n+1}\\).<\/p>\r\n\r\n\r\n\r\n<p>This allows us to treat the distribution of \\(\\rho\\) as Gaussian and, given infinite divisibility, derive the variation of the components; the error is again of \\(O(\\frac{1}{n^2})\\) (hence we simplify by using the second moment in place of the variance):<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\">\\(\\rho_n\\sim \\mathcal{N}\\left(0,\\sqrt{\\frac{1}{n-1}}\\right).\\)<\/p>\r\n\r\n\r\n\r\n<p>To test how the second moment of the sample coefficient compares to that of a random series, and thanks to the assumption of a mean of \\(0\\), define the average of the squares over nonoverlapping correlation windows:<\/p>\r\n\r\n\r\n\r\n<p class=\"has-text-align-center\">\\(\\Delta_{n,m}= \\frac{1}{\\lfloor m\/n\\rfloor} \\sum_{i=1}^{\\lfloor m\/n\\rfloor} \\rho_n^2\\left((i-1) n+1\\right),\\)<\/p>\r\n\r\n\r\n\r\n<p>where \\(m\\) is the sample size and \\(n\\) is the correlation window. Now we can show that:<\/p>\r\n\r\n\r\n\r\n<p>\\(\\Delta_{n,m}\\sim \\mathcal{G}\\left(\\frac{p}{2},\\frac{2}{(n-1) p}\\right),\\)<br \/>where \\(p=\\lfloor m\/n\\rfloor\\) and \\(\\mathcal{G}\\) is the Gamma distribution with PDF:<br \/>\\(f(\\Delta)= \\frac{2^{-\\frac{p}{2}} \\left(\\frac{1}{(n-1) p}\\right)^{-\\frac{p}{2}} \\Delta ^{\\frac{p}{2}-1} e^{-\\frac{1}{2} \\Delta (n-1) p}}{\\Gamma \\left(\\frac{p}{2}\\right)},\\)<br \/>and survival function:<\/p>\r\n\r\n\r\n\r\n<p>\\(S(\\Delta)=Q\\left(\\frac{p}{2},\\frac{1}{2} \\Delta (n-1) p\\right),\\)<br \/>which allows us to obtain the p-values below, using \\(m=714\\) observations (and using the leading-order \\(O(.)\\) approximation).<\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><a href=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1.png\"><img loading=\"lazy\" decoding=\"async\" width=\"1060\" height=\"376\" class=\"wp-image-936\" 
src=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1.png?w=1024\" alt=\"\" srcset=\"https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1.png 1060w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1-300x106.png 300w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1-1024x363.png 1024w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/screen-shot-2021-11-23-at-9.30.04-am-1-768x272.png 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><\/a><\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p>Such low p-values exclude any controversy as to their effectiveness cite{taleb2016meta}.<\/p>\r\n\r\n\r\n\r\n<p>We can also compare rolling correlations using a Monte Carlo for the null with practically the same results (given the exceedingly low p-values). We simulate \\(\\Delta_{n,m}^o\\) with overlapping observations:<br \/>\\(\\Delta_{n,m}^o= \\frac{1}{m} \\sum_{i=1}^{m-n-1} \\rho_n^2(i),\\)<\/p>\r\n\r\n\r\n\r\n<p>Rolling windows have the same second moment, but a mildly more compressed distribution since the observations of \\(\\rho\\) over overlapping windows of length \\(n\\) are autocorrelated (with, we note, an autocorrelation between two observations \\(i\\) orders apart of \\(\\approx 1-\\frac{1}{n-i}\\)). 
As shown in the figure below for \\(n=36\\) we get exceedingly low p-values of order \\(10^{-17}\\).<\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><a href=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/rollingcorr.jpg\"><img loading=\"lazy\" decoding=\"async\" width=\"360\" height=\"223\" class=\"wp-image-959\" src=\"http:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/rollingcorr.jpg?w=360\" alt=\"\" srcset=\"https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/rollingcorr.jpg 360w, https:\/\/fooledbyrandomness.com\/blog\/wp-content\/uploads\/2021\/11\/rollingcorr-300x186.jpg 300w\" sizes=\"auto, (max-width: 360px) 85vw, 360px\" \/><\/a><\/figure>\r\n","protected":false},"excerpt":{"rendered":"<p>&nbsp; &nbsp; Financial theory requires correlation to be constant (or, at least, known and nonrandom). Nonrandom means predictable with waning sampling error over the period concerned. Ellipticality is a condition more necessary than thin tails, recall my Twitter fight with that non-probabilist Clifford Asness where I questioned not just his empirical claims and his real-life &hellip; <a href=\"https:\/\/fooledbyrandomness.com\/blog\/index.php\/2021\/11\/24\/detecting-bs-in-correlation-windows\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Detecting BS in Correlation 
Windows&#8221;<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[4,13],"tags":[],"class_list":["post-953","post","type-post","status-publish","format-standard","hentry","category-probability","category-quant-finance"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/953","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/comments?post=953"}],"version-history":[{"count":2,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/953\/revisions"}],"predecessor-version":[{"id":996,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/posts\/953\/revisions\/996"}],"wp:attachment":[{"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/media?parent=953"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/categories?post=953"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/fooledbyrandomness.com\/blog\/index.php\/wp-json\/wp\/v2\/tags?post=953"}],"curies":[{"name":"wp","href
":"https:\/\/api.w.org\/{rel}","templated":true}]}}