Suppose I have a time series (xts) object:
library(xts)
exposure <- xts(Google = c(100, 200, 300, 400, 500, 600, 700, 800),
                Apple = c(10, 20, 30, 40, 50, 60, 70, 80),
                Audi = c(1, 2, 3, 4, 5, 6, 7, 8),
                BMW = c(1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000),
                AENA = c(50, 51, 52, 53, 54, 55, 56, 57, 58),
                order.by = Sys.Date() - 1:8)
I also have a data frame mapping each company to a country:
map <- data.frame(Company = c("Google", "Apple", " Audi", "BMW", " AENA"),
                  Country = c("US", "US", " GERMANY", "GERMANY", " SPAIN"))
I want to aggregate the columns of exposure by the country each company is mapped to. The output should be an xts object with the same index as exposure, but with columns US, GERMANY, and SPAIN. For example, on a given date the US column should contain the sum of the Google and Apple exposures for that date.
Any help is welcome.
I think there was an error in your original data specification (AENA has nine values, but there are only eight dates). One way to do this is to move the data out of the xts format and back into it again. I made a few changes to how the xts object is created, and I also cleaned up the mistaken spaces in map.
library(xts)
df <- data.frame(Google = c(100, 200, 300, 400, 500, 600, 700, 800),
                 Apple = c(10, 20, 30, 40, 50, 60, 70, 80),
                 Audi = c(1, 2, 3, 4, 5, 6, 7, 8),
                 BMW = c(1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000),
                 AENA = c(50, 51, 52, 53, 54, 55, 56, 57))
exposure <- xts(df, order.by = Sys.Date() - 1:8)
map <- data.frame(Company = c("Google", "Apple", "Audi", "BMW", "AENA"),
                  Country = c("US", "US", "GERMANY", "GERMANY", "SPAIN"),
                  stringsAsFactors = FALSE)
I use tbl2xts to convert between the formats. Then we use dplyr and tidyr to pivot the data to long format, join the Country onto each Company, and summarize over date and Country. Finally we convert back to xts, spreading the data wide by Country.
library(tbl2xts)
library(dplyr)
library(tidyr)
xts_tbl(exposure) %>%
  pivot_longer(-date, names_to = "Company") %>%
  left_join(map, by = "Company") %>%
  group_by(date, Country) %>%
  summarize(value = sum(value)) %>%
  ungroup() %>%
  tbl_xts(spread_by = "Country")
GERMANY SPAIN US
2019-10-28 8008 57 880
2019-10-29 7007 56 770
2019-10-30 6006 55 660
2019-10-31 5005 54 550
2019-11-01 4004 53 440
2019-11-02 3003 52 330
2019-11-03 2002 51 220
2019-11-04 1001 50 110
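If you'd rather avoid the round trip through data frames, a minimal alternative sketch (assuming the corrected df and map from above) stays entirely in xts and sums the matching company columns per country with base R:

```r
library(xts)

df <- data.frame(Google = c(100, 200, 300, 400, 500, 600, 700, 800),
                 Apple = c(10, 20, 30, 40, 50, 60, 70, 80),
                 Audi = c(1, 2, 3, 4, 5, 6, 7, 8),
                 BMW = c(1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000),
                 AENA = c(50, 51, 52, 53, 54, 55, 56, 57))
exposure <- xts(df, order.by = Sys.Date() - 1:8)

map <- data.frame(Company = c("Google", "Apple", "Audi", "BMW", "AENA"),
                  Country = c("US", "US", "GERMANY", "GERMANY", "SPAIN"),
                  stringsAsFactors = FALSE)

# For each country, row-sum the exposure columns of its companies.
country_sums <- sapply(unique(map$Country), function(cty) {
  cols <- map$Company[map$Country == cty]
  rowSums(exposure[, cols, drop = FALSE])
})

# Rebuild an xts object on the original index; columns are countries.
by_country <- xts(country_sums, order.by = index(exposure))
```

This produces the same values as the tbl2xts pipeline above, just with the columns ordered as they first appear in map (US, GERMANY, SPAIN).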