Reputation: 13
I have a problem requiring me to calculate the rolling product of a series of 1-period returns. The length of the rolling window is variable. The goal is to obtain the rolling product of the 1-period returns that covers, as closely as possible, a 12-month window.
I have been able to produce a working solution using brute force through for loops and if statements, but I'm wondering whether there is a more elegant solution. I have spent a lot of time trying rollapply and other similar functions, but I haven't been able to obtain a solution.
The data below illustrates the problem.
date rt_1_period rt_12_mth_window
1 04-04-13 NA NA
2 10-04-13 0.729096362 NA
3 24-05-13 1.002535647 NA
4 30-05-13 0.993675716 NA
5 21-07-13 1.002662843 NA
6 03-08-13 1.009516582 NA
7 01-09-13 0.963099395 NA
8 20-10-13 1.012470278 NA
9 25-10-13 1.01308502 NA
10 03-11-13 1.005440704 NA
11 01-01-14 1.024208021 NA
12 11-01-14 0.996613924 NA
13 17-02-14 1.009811368 NA
14 24-02-14 1.008139557 NA
15 30-03-14 1.002794709 NA
16 30-04-14 0.998745849 1.042345473
17 02-05-14 1.002324076 1.044767963
18 27-06-14 0.997741026 1.046389027
19 24-08-14 1.015767546 1.050072129
20 05-09-14 1.014405005 1.106010894
21 02-11-14 1.013830296 1.09319212
22 09-11-14 1.013127219 1.101549487
23 16-11-14 1.012614177 1.115444628
24 18-01-15 0.986893629 1.078458006
25 24-01-15 1.028120919 1.108785236
26 10-04-15 0.912452762 0.991025615
27 09-08-15 1.004676152 0.981376513
28 07-01-16 1.004236123 0.934086003
29 01-04-16 1.02341302 0.94215696
In the example, the 12-month return for row 29 is calculated as the product of the 1-period returns from rows 26 to 29, because 02-04-15 (365 days before 01-04-16) falls between rows 25 and 26. On the other hand, the 12-month return for row 15 is NA because 30-03-13 (365 days before 30-03-14) is outside the time window for which I have observable 1-period returns.
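A simplified sketch of the kind of brute-force loop I am currently using (assuming the data above is in a data frame DF with columns date and rt_1_period; my actual code is more involved):
DF$date <- as.Date(DF$date, format = "%d-%m-%y")
DF$rt_12_mth_window <- NA_real_
for (i in seq_len(nrow(DF))) {
  window_start <- DF$date[i] - 365          # start of the ~12-month window
  if (window_start > DF$date[1]) {          # NA if the window starts before the data
    in_window <- DF$date >= window_start & DF$date <= DF$date[i]
    DF$rt_12_mth_window[i] <- prod(DF$rt_1_period[in_window])
  }
}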
I would be glad if somebody could suggest some way of approaching this problem.
Just for clarity: if the data provided do not make much sense, it is because this is a cut-down version of a larger database that I have created for illustration purposes.
Upvotes: 1
Views: 1020
Reputation: 176648
Here's a solution that only depends on xts, and it might be more straightforward to some.
library(xts)
x <- as.xts(read.zoo(text="date,rt_1_period,rt_12_mth_window
04-04-13, ,
10-04-13,0.729096362,
24-05-13,1.002535647,
30-05-13,0.993675716,
21-07-13,1.002662843,
03-08-13,1.009516582,
01-09-13,0.963099395,
20-10-13,1.012470278,
25-10-13,1.013085020,
03-11-13,1.005440704,
01-01-14,1.024208021,
11-01-14,0.996613924,
17-02-14,1.009811368,
24-02-14,1.008139557,
30-03-14,1.002794709,
30-04-14,0.998745849,1.042345473
02-05-14,1.002324076,1.044767963
27-06-14,0.997741026,1.046389027
24-08-14,1.015767546,1.050072129
05-09-14,1.014405005,1.106010894
02-11-14,1.013830296,1.09319212
09-11-14,1.013127219,1.101549487
16-11-14,1.012614177,1.115444628
18-01-15,0.986893629,1.078458006
24-01-15,1.028120919,1.108785236
10-04-15,0.912452762,0.991025615
09-08-15,1.004676152,0.981376513
07-01-16,1.004236123,0.934086003
01-04-16,1.023413020,0.94215696", header=TRUE, sep=",", format="%d-%m-%y"))
ix <- index(x) # index values
ixlag <- ix-365 # 1-year lag index values
x$rt_12 <- NA_real_ # initialize result column
for(i in which(ixlag > ix[1])) {
  # 1-year subset
  xyear <- window(x, start=ixlag[i], end=ix[i])
  # calculate product and update result column
  x[i, "rt_12"] <- prod(xyear[, "rt_1_period"])
}
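If you prefer to avoid the explicit loop, the same calculation can be expressed with sapply (a sketch; it should behave identically to the loop above):
# same calculation via sapply instead of the for loop
idx <- which(ixlag > ix[1])
x[idx, "rt_12"] <- sapply(idx, function(i)
  prod(window(x, start=ixlag[i], end=ix[i])[, "rt_1_period"]))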
Upvotes: 0
Reputation: 3597
You could use xts and lubridate to simplify the date manipulations.
Data:
require(xts)
require(lubridate)
DF = read.csv(text="
date,rt_1_period,rt_12_mth_window
04-04-13, ,
10-04-13,0.729096362,
24-05-13,1.002535647,
30-05-13,0.993675716,
21-07-13,1.002662843,
03-08-13,1.009516582,
01-09-13,0.963099395,
20-10-13,1.012470278,
25-10-13,1.01308502 ,
03-11-13,1.005440704,
01-01-14,1.024208021,
11-01-14,0.996613924,
17-02-14,1.009811368,
24-02-14,1.008139557,
30-03-14,1.002794709,
30-04-14,0.998745849,1.042345473
02-05-14,1.002324076,1.044767963
27-06-14,0.997741026,1.046389027
24-08-14,1.015767546,1.050072129
05-09-14,1.014405005,1.106010894
02-11-14,1.013830296,1.09319212
09-11-14,1.013127219,1.101549487
16-11-14,1.012614177,1.115444628
18-01-15,0.986893629,1.078458006
24-01-15,1.028120919,1.108785236
10-04-15,0.912452762,0.991025615
09-08-15,1.004676152,0.981376513
07-01-16,1.004236123,0.934086003
01-04-16,1.02341302 ,0.94215696",header=TRUE,stringsAsFactors=FALSE,na.strings="")
#Convert to xts time series for ease in date manipulation
DF_xts = xts(DF[,-1],order.by = as.Date(DF[,1],format="%d-%m-%y"))
head(DF_xts)
#
# rt_1_period rt_12_mth_window
#2013-04-04 NA NA
#2013-04-10 0.729096362 NA
#2013-05-24 1.002535647 NA
#2013-05-30 0.993675716 NA
#2013-07-21 1.002662843 NA
#2013-08-03 1.009516582 NA
#set lag period as 1 year
lagPeriod = 1
Cumulative 12m Product:
For each date, construct the window [prevYearDate, date], subset the 1-period returns lying in this window, calculate the cumulative product, and take the last value.
rt_12_mth_window_Calc = do.call(rbind, lapply(as.Date(index(DF_xts)), function(x) {
  # start of the 12-month window: same calendar date one year earlier
  prevYearDate = x - years(lagPeriod)
  # subset returns in [prevYearDate, x] and keep the last cumulative product
  rt_12_mth_window_Calc = last(cumprod(DF_xts[paste0(prevYearDate, "/", x), "rt_1_period"]))
  colnames(rt_12_mth_window_Calc) = "rt_12_mth_window_Calc"
  return(rt_12_mth_window_Calc)
}))
Final Dataset:
#Merge with original time series for final dataset
new_DF = merge.xts(DF_xts,rt_12_mth_window_Calc)
#Calculate difference in original and calculated 12 month returns
new_DF$delta = new_DF$rt_12_mth_window_Calc - new_DF$rt_12_mth_window
new_DF
# rt_1_period rt_12_mth_window rt_12_mth_window_Calc delta
#2013-04-04 NA NA NA NA
#2013-04-10 0.729096362 NA NA NA
#2013-05-24 1.002535647 NA NA NA
#2013-05-30 0.993675716 NA NA NA
#2013-07-21 1.002662843 NA NA NA
#2013-08-03 1.009516582 NA NA NA
#2013-09-01 0.963099395 NA NA NA
#2013-10-20 1.012470278 NA NA NA
#2013-10-25 1.013085020 NA NA NA
#2013-11-03 1.005440704 NA NA NA
#2014-01-01 1.024208021 NA NA NA
#2014-01-11 0.996613924 NA NA NA
#2014-02-17 1.009811368 NA NA NA
#2014-02-24 1.008139557 NA NA NA
#2014-03-30 1.002794709 NA NA NA
#2014-04-30 0.998745849 1.042345473 1.042345470 -2.64001643e-09
#2014-05-02 1.002324076 1.044767963 1.044767960 -2.54864396e-09
#2014-06-27 0.997741026 1.046389027 1.046389025 -1.97754613e-09
#2014-08-24 1.015767546 1.050072129 1.050072127 -1.66086833e-09
#2014-09-05 1.014405005 1.106010894 1.106010893 -1.34046041e-09
#2014-11-02 1.013830296 1.093192120 1.093192120 -6.47777387e-11
#2014-11-09 1.013127219 1.101549487 1.101549488 5.99306826e-10
#2014-11-16 1.012614177 1.115444628 1.115444628 -1.89856353e-10
#2015-01-18 0.986893629 1.078458006 1.078458005 -1.15637744e-09
#2015-01-24 1.028120919 1.108785236 1.108785235 -9.57268265e-10
#2015-04-10 0.912452762 0.991025615 0.991025613 -1.54581248e-09
#2015-08-09 1.004676152 0.981376513 0.996850412 1.54738992e-02
#2016-01-07 1.004236123 0.934086003 0.934086002 -9.15302278e-10
#2016-04-01 1.023413020 0.942156960 0.942156960 -1.82048598e-10
The calculated and original values are very close for all observations except 2015-08-09, where the deviation is 1.55%; could you confirm your calculations for this period?
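To double-check, you can print the 1-period returns that fall in that window (using the same subsetting as above):
# returns falling in the 12-month window ending 2015-08-09
DF_xts[paste0(as.Date("2015-08-09") - years(lagPeriod), "/2015-08-09"), "rt_1_period"]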
Upvotes: 1