Kiba

Reputation: 10827

Is there a simple way to convert database rows to JSON in Golang?

From what I've seen so far, converting database rows to JSON or to []map[string]interface{} is not simple. I have to create two slices, then loop through the columns and create keys every time.

Some code:

tableData := make([]map[string]interface{}, 0)
values := make([]interface{}, count)
valuePtrs := make([]interface{}, count)
for rows.Next() {
    for i := 0; i < count; i++ {
        valuePtrs[i] = &values[i]
    }
    if err := rows.Scan(valuePtrs...); err != nil {
        log.Fatal(err)
    }
    entry := make(map[string]interface{})
    for i, col := range columns {
        var v interface{}
        val := values[i]
        // Drivers often return strings as []byte; convert them for readable JSON.
        if b, ok := val.([]byte); ok {
            v = string(b)
        } else {
            v = val
        }
        entry[col] = v
    }
    tableData = append(tableData, entry)
}

Is there any package for this? Or am I missing some basics here?

Upvotes: 8

Views: 7930

Answers (5)

pimguilherme

Reputation: 1050

One quick way to get an arbitrary, generic []map[string]interface{} out of these query libraries is to populate a slice of interface{} pointers, one per column in the query, and then pass it to the Scan function:

// For example, for the go-mssqldb lib:

queryResponse, err := d.pool.Query(query)
if err != nil {
    return nil, err
}
defer queryResponse.Close()

// Holds all the end-results
results := []map[string]interface{}{}

// Getting details about all the fields from the query
fieldNames, err := queryResponse.Columns()
if err != nil {
    return nil, err
}

// Create one interface{} pointer per column so that we can pass
// the slice to Scan and get every column value back.
var scanResults []interface{}
for range fieldNames {
    var v interface{}
    scanResults = append(scanResults, &v)
}

// Parsing the query results into the result map
for queryResponse.Next() {

    // This variable will hold the value for all the columns, named by the column name
    rowValues := map[string]interface{}{}

    // Cleaning up old values just in case
    for _, column := range scanResults {
        *(column.(*interface{})) = nil
    }

    // Scan into the array of pointers
    err := queryResponse.Scan(scanResults...)
    if err != nil {
        return nil, err
    }

    // Map the pointers back to their value and the associated column name
    for index, column := range scanResults {
        rowValues[fieldNames[index]] = *(column.(*interface{}))
    }
    results = append(results, rowValues)
}

return results, nil
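
Since the question ultimately asks for JSON, the resulting slice of maps marshals directly with encoding/json. A minimal sketch, where QueryToMaps is a hypothetical name for a function wrapping the code above:

// QueryToMaps is a hypothetical wrapper around the code above.
queryResults, err := QueryToMaps("SELECT id, name FROM users")
if err != nil {
    log.Fatal(err)
}

// []map[string]interface{} marshals straight to a JSON array of objects.
jsonBytes, err := json.Marshal(queryResults)
if err != nil {
    log.Fatal(err)
}
fmt.Println(string(jsonBytes))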

Upvotes: 0

AndreKR

Reputation: 33678

Not in the Go distribution itself, but there is the wonderful jmoiron/sqlx:

import "github.com/jmoiron/sqlx"

// MapScan is defined on *sqlx.Rows, so the rows must come from one of
// sqlx's extended query methods, such as Queryx.
rows, err := db.Queryx(query)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()

tableData := make([]map[string]interface{}, 0)

for rows.Next() {
    entry := make(map[string]interface{})
    if err := rows.MapScan(entry); err != nil {
        log.Fatal("SQL error: " + err.Error())
    }
    tableData = append(tableData, entry)
}

Upvotes: 2

Shadow1349

Reputation: 771

One thing you could do is create a struct that models your data.

Note: I am using MS SQL Server.

So let's say you want to get a user:

type User struct {
    ID int `json:"id,omitempty"`
    UserName string `json:"user_name,omitempty"`
    ...
}

Then you can do this:

func GetUser(w http.ResponseWriter, req *http.Request) {
    var u User
    params := mux.Vars(req)

    db, err := sql.Open("mssql", "server=ServerName")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    err = db.QueryRow("select Id, UserName from [Your Database].dbo.Users where Id = ?", params["id"]).Scan(&u.ID, &u.UserName)
    if err != nil {
        log.Fatal(err)
    }

    if err := json.NewEncoder(w).Encode(&u); err != nil {
        log.Fatal(err)
    }
}

Here are the imports I used:

import (
    "database/sql"
    "encoding/json"
    "log"
    "net/http"

    _ "github.com/denisenkom/go-mssqldb"
    "github.com/gorilla/mux"
)

This allowed me to get data from the database and return it as JSON.

This takes a while to code, but it works really well.
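
For completeness, here is one way the handler above could be wired up so that mux.Vars(req)["id"] is actually populated. The route path and port are assumptions, not part of the original answer:

func main() {
    r := mux.NewRouter()
    // "/users/{id}" is an assumed route; mux.Vars inside GetUser
    // reads the {id} path variable from it.
    r.HandleFunc("/users/{id}", GetUser).Methods("GET")
    log.Fatal(http.ListenAndServe(":8080", r))
}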

Upvotes: 2

Sergei G

Reputation: 1700

If you know the data type that you are reading, then you can read into that data type without using a generic interface.

Otherwise, there is no general solution, regardless of the language, due to the nature of JSON itself.

JSON has no way to describe composite data structures. In other words, JSON is a generic key-value structure. When a parser encounters what is supposed to be a specific structure, there is no identification of that structure in JSON itself. For example, if you have a structure User, the parser has no way to know that a given set of key-value pairs maps to your structure User.

The problem of type recognition is usually addressed with a document schema (e.g. XSD in the XML world) or by explicitly passing the expected data type.
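
To illustrate with a minimal, self-contained sketch: the same JSON bytes can be decoded into a known struct or into a generic map, and the struct variant works only because the expected type is passed in explicitly (the User type and sample data here are just illustrations):

package main

import (
    "encoding/json"
    "fmt"
)

type User struct {
    ID   int    `json:"id"`
    Name string `json:"name"`
}

func main() {
    data := []byte(`{"id": 1, "name": "kiba"}`)

    // The parser produces a User only because we hand it one explicitly.
    var u User
    if err := json.Unmarshal(data, &u); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", u) // {ID:1 Name:kiba}

    // Without a target type, the same bytes come back as generic key-value pairs.
    var generic map[string]interface{}
    if err := json.Unmarshal(data, &generic); err != nil {
        panic(err)
    }
    fmt.Println(generic) // map[id:1 name:kiba]
}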

Upvotes: 1

Eddy K

Reputation: 216

I'm dealing with the same issue; as far as my investigation goes, it looks like there is no other way.

All the packages that I have seen use basically the same method.

A few things you should know that will hopefully save you time:

  • The database/sql package converts all the data to the appropriate types.
  • If you are using the MySQL driver (go-sql-driver/mysql), you need to add a parameter to your connection string for it to return time.Time instead of a string (use ?parseTime=true; the default is false). See the snippet after this list.
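
A minimal sketch of a connection string with parseTime enabled; the credentials and database name are placeholders:

import (
    "database/sql"

    _ "github.com/go-sql-driver/mysql"
)

// parseTime=true makes the driver scan DATE and DATETIME columns into
// time.Time instead of []byte/string. Credentials below are placeholders.
db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/dbname?parseTime=true")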

You can use tools written by the community to offload the overhead:

  • A minimalistic wrapper around database/sql, sqlx, which internally uses a similar method based on reflection.

  • If you need more functionality, try an ORM: gorp or gorm.

If you're interested in diving deeper, check out:

  • Using reflection in sqlx package, sqlx.go line 560
  • Data type conversion in database/sql package, convert.go line 86

Upvotes: 7
