Bowecho

Reputation: 909

Import and extract data from multiple APIs in R

I imported data from two API (Application Programming Interface) endpoints into R:

library(jsonlite)
library(dplyr)

a0 <- fromJSON("https://hello.com/users/0/bets") 
a1 <- fromJSON("https://hello.com/users/1/bets") 

Each call returns a data frame:

df0 <- a0
df1 <- a1

Using bind_rows from the dplyr library, I merged the two data frames into a single one (I could also use rbind.data.frame):

a <- bind_rows(a0, a1)
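For example, with two small hypothetical data frames standing in for the API results:

```r
library(dplyr)

# Hypothetical stand-ins for the two API results
df0 <- data.frame(user = 0, amount = c(10, 20))
df1 <- data.frame(user = 1, amount = 30)

# Stack them row-wise into one data frame
a <- bind_rows(df0, df1)
nrow(a)  # 3
```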

I have 500 such endpoints in total, ranging from https://hello.com/users/0/bets to https://hello.com/users/499/bets, and I want to repeat what I've done above for all of them.

I can't seem to find a solution to this, so can anyone help?

Upvotes: 0

Views: 362

Answers (2)

camille

Reputation: 16862

If you want to use a map function from purrr/tidyverse, you can do it in one line. map_dfr basically calls map to apply a function (in this case, fromJSON) over a list or vector, then reduces the results with bind_rows, returning a single data frame.
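As a toy illustration of that behavior, with a made-up data-frame builder in place of fromJSON (no network needed):

```r
library(purrr)
library(dplyr)

# map_dfr applies the function to each element, then row-binds the results
combined <- map_dfr(1:3, ~ data.frame(id = .x, val = .x * 10))
nrow(combined)  # 3
```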

If your URLs are, in fact, as simple as this with just an integer in them, you can map over 0:499, build a URL from each integer, then call fromJSON. I'm using stringr::str_glue to make the URLs, but if your actual ones are in a different format, you can adjust.

library(tidyverse)
library(jsonlite)

map_dfr(0:499, ~fromJSON(str_glue("https://hello.com/users/{.}/bets")))

Upvotes: 0

DanY

Reputation: 6073

Something like this:

  • Create an empty list to store each data.frame you plan to grab
  • Loop over the URLs to grab the dataframes, storing each as a list element
  • Then do.call(rbind) all the dataframes together.


library(jsonlite)

result_list <- vector(mode = "list", length = 500)
for (bet in 0:499) {
    this_url <- paste0("https://hello.com/users/", bet, "/bets")
    result_list[[bet + 1]] <- fromJSON(this_url)  # +1 because R lists are 1-indexed
}
result <- do.call(rbind, result_list)
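The same loop pattern can be checked offline with a dummy fetch function standing in for fromJSON (the function and column names here are hypothetical):

```r
# Dummy stand-in for fromJSON(this_url): one row per "user"
fake_fetch <- function(bet) data.frame(user = bet, amount = bet * 10)

result_list <- vector(mode = "list", length = 5)
for (bet in 0:4) {
    result_list[[bet + 1]] <- fake_fetch(bet)  # +1 because R lists are 1-indexed
}
result <- do.call(rbind, result_list)
nrow(result)  # 5
```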

Upvotes: 1
