Reputation: 27
I read multiple urls from a site and now have a list of lists of data frames (tbl). I now need to full_join these data frames into one data frame.
library(xml2)
library(dplyr)
library(purrr)
library(rvest)
library(stringr)
# build the three page urls and scrape every table on each page
url <- str_c("https://amx.am/en/9/trading/10/instruments", "?page=", 1:3)
tbl <- lapply(url, read_html) %>% lapply(html_table)  # a list of lists of tibbles
Created on 2020-01-25 by the reprex package (v0.3.0)
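As an aside, the same url vector can also be built with base R's paste0(), so the stringr dependency is optional:

url <- paste0("https://amx.am/en/9/trading/10/instruments?page=", 1:3)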
Upvotes: 0
Views: 372
Reputation: 389055
Each url has multiple tables in it, so to combine them all you can use
library(rvest)
library(purrr)
library(dplyr)
map(url, ~read_html(.x) %>% html_table) %>%
  flatten() %>%
  bind_rows()
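Here flatten() collapses the list of lists into one flat list of tibbles, and bind_rows() stacks those tibbles into a single data frame. If you also want to record which table each row came from, bind_rows() accepts an .id argument. A small sketch, assuming the tables have compatible columns (the column name "source" is arbitrary):

map(url, ~read_html(.x) %>% html_table) %>%
  flatten() %>%
  bind_rows(.id = "source")  # the list is unnamed, so "source" gets positions "1", "2", ...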
Or if you want to full_join them together, you can do that with reduce:
map(url, ~read_html(.x) %>% html_table) %>%
  flatten() %>%
  reduce(full_join)
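reduce() folds full_join() over the flattened list pairwise, i.e. full_join(full_join(t1, t2), t3), and so on. A toy illustration of that fold, with made-up tibbles and an assumed shared key id:

library(dplyr)
library(purrr)

a <- tibble(id = 1:2, x = c("a", "b"))
b <- tibble(id = 2:3, y = c("p", "q"))

reduce(list(a, b), full_join, by = "id")
#> # A tibble: 3 × 3
#>      id x     y
#>   <int> <chr> <chr>
#> 1     1 a     <NA>
#> 2     2 b     p
#> 3     3 <NA>  q

In the pipeline above no by argument is given, so full_join() falls back to joining on all the columns the tables have in common.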
Upvotes: 1