user2165857

Reputation: 2710

Add MySQL query results to R dataframe

I want to convert a MySQL query from a Python script to an analogous query in R. The Python script uses a loop to look up specific values by genomic coordinates:

SQL = """SELECT value FROM %s FORCE INDEX (chrs) FORCE INDEX (sites)
                        WHERE `chrom` = %d AND `site` = %d""" % (Table, Chr, Start)
cur.execute(SQL)

In R, the chromosomes and sites are in a dataframe, and for every row in the dataframe I would like to extract a single value and add it to a new column of the dataframe.

So my current dataframe has a similar structure to the following:

df <- data.frame("Chr"=c(1,1,3,5,5), "Site"=c(100, 200, 400, 100, 300))

The amended dataframe should have an additional column with values from the database (at the corresponding genomic coordinates). The structure should be similar to:

df <- data.frame("Chr"=c(1,1,3,5,5), "Site"=c(100, 200, 400, 100, 300), "Value"=c(1.5, 0, 5, 60, 100))

So far I have connected to the database using:

con <- dbConnect(MySQL(),
                 user="root", password="",
                 dbname="MyDataBase")

Rather than loop over each row in my dataframe, I would like to use something that would add the corresponding value to a new column in the existing dataframe.

Update with working solution based on answer below:

library(RMySQL)
library(plyr)   # for ldply()
con <- dbConnect(MySQL(),
                 user="root", password="",
                 dbname="MyDataBase")

GetValue <- function(DataFrame, Table){
  # One query per row; the UNION ALL ... LIMIT 1 returns 'NA' when no row matches
  queries <- sprintf("SELECT value AS value
                     FROM %s FORCE INDEX (chrs) FORCE INDEX (sites)
                     WHERE chrom = %d AND site = %d UNION ALL SELECT 'NA' LIMIT 1",
                     Table, DataFrame$Chr, DataFrame$Site)
  # Run each query and stack the single-row results
  res <- ldply(queries, function(query) dbGetQuery(con, query))
  DataFrame[, Table] <- res$value
  return(DataFrame)
}
df <- GetValue(df, "TableName")

Upvotes: 1

Views: 2243

Answers (4)

Walter

Reputation: 353

I like the data.table package for this kind of task, as its syntax is inspired by SQL.

require(data.table)

First, an example table standing in for the database:

table <- data.table(chrom=rep(1:5, each=5), 
                    site=rep(100*1:5, times=5), 
                    Value=runif(5*5))

Now the SQL query can be translated into something like

# select from table, where chrom=Chr and site=Site, value
Chr <- 2
Site <- 200
table[chrom==Chr & site==Site, list(Value)] # returns a one-column data.table
table[chrom==Chr & site==Site, ]$Value      # returns a numeric vector

Key (index) the table for fast lookups (assuming each chrom/site combination is unique):

setkey(table, chrom, site)
table[J(Chr, Site), ]$Value # very fast lookup due to indexed table

Your dataframe as a data.table, with its two key columns 'Chr' and 'Site' (both integer):

df <- data.frame("Chr"=c(1,1,3,5,5), "Site"=c(100, 200, 400, 100, 300))
dt <- as.data.table(df) # adds data.table class to data.frame
setkey(dt, Chr, Site) # index for 'by' and for 'J' join

Match the values and append them in a new column (by reference, so the table is not copied):

# loop over keys Chr and Site and find the match in the table
# select the Value column and create a new column that contains this
dt[, Value:=table[chrom==Chr & site==Site]$Value, by=list(Chr, Site)]
# faster:
dt[, Value:=table[J(Chr, Site)]$Value, by=list(Chr, Site)]
# fastest: a single keyed join (assuming both keys are in the same column order)
table[dt]

Kind greetings

Upvotes: 1

John St. John

Reputation: 1592

Maybe you could do something like this. First build up your queries, then execute them, storing the results in a column of your dataframe. I am not sure the do.call("rbind", ...) part is necessary, but it takes a list of single-row dataframes and binds them row-wise into one dataframe.

queries <- sprintf("SELECT value AS value FROM %s FORCE INDEX (chrs) FORCE INDEX (sites) WHERE chrom = %d AND site = %d UNION ALL SELECT 0 LIMIT 1",
                   "TableName", df$Chr, df$Site)
# dbGetQuery (not dbSendQuery) executes each query and fetches its one-row result
df$Value <- do.call("rbind", lapply(queries, function(query) dbGetQuery(con, query)))$value

I played with your SQL a little; my concern with the original is cases where it might return more than one row (or none at all), hence the UNION ALL ... LIMIT 1 fallback.

Upvotes: 1

Jot eN

Reputation: 6426

You could easily use the dplyr package. There is even a nice vignette about that: http://cran.rstudio.com/web/packages/dplyr/vignettes/databases.html.

One thing you need to know is:

You can connect to MySQL and MariaDB (a recent fork of MySQL) through src_mysql(), mediated by the RMySQL package. Like PostgreSQL, you'll need to provide a dbname, username, password, host, and port.
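For example, a minimal sketch of the dplyr approach. The credentials and the table name "TableName" are placeholders, and copy = TRUE assumes it is acceptable to upload the local dataframe to a temporary table so the join runs in the database:

library(dplyr)

# Connect through RMySQL (placeholder credentials)
my_db <- src_mysql(dbname = "MyDataBase", user = "root", password = "")

# Lazy reference to the database table (hypothetical name)
db_tbl <- tbl(my_db, "TableName")

# Join the local dataframe against the database table;
# copy = TRUE copies df into a temporary table on the server first
result <- collect(inner_join(db_tbl, df,
                             by = c("chrom" = "Chr", "site" = "Site"),
                             copy = TRUE))

Note that inner_join drops coordinates with no match in the database; keep that in mind if you need NA values for missing sites.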

Upvotes: 0

user2443147

Reputation:

Why don't you use the RMySQL or sqldf package?

With RMySQL, you get MySQL access in R.

With sqldf, you can issue SQL queries on R data structures.

Using either of those, you do not need to reword your SQL query to get the same results; see the sqldf sketch below.
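For instance, a minimal sqldf sketch, assuming the relevant rows have already been fetched into a local dataframe (the hypothetical vals below stands in for the database table):

library(sqldf)

# Hypothetical local copy of the database rows (e.g. fetched with dbGetQuery)
vals <- data.frame(chrom = c(1, 1, 3, 5, 5),
                   site  = c(100, 200, 400, 100, 300),
                   value = c(1.5, 0, 5, 60, 100))

# Plain SQL over R dataframes; the LEFT JOIN keeps unmatched rows as NA
sqldf("SELECT df.Chr, df.Site, vals.value AS Value
       FROM df LEFT JOIN vals
       ON df.Chr = vals.chrom AND df.Site = vals.site")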

Let me also mention the data.table package, which lets you do very efficient selects and joins on your data frames after converting them to data tables using as.data.table(your.data.frame). Another good thing about it is that a data.table object is a data.frame at the same time, so all your functions that work on the data frames work on these converted objects, too.
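To illustrate that last point:

library(data.table)
dt <- as.data.table(df)
class(dt)          # "data.table" "data.frame" -- it is both at once
is.data.frame(dt)  # TRUE, so existing data.frame code keeps working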

Upvotes: 0
