Reputation: 997
So I have a JSON file in the format...
[
  {
    "Key":"Value",
    "Key2":"Value2",
    "Key3":"Value3"
  },
  {
    "Foo":"Bar",
    "Blah":2
  }
]
I just want to read in the hash parts of it and pass them along to an HTTP request, as in gorequest, since gorequest is happy to accept the JSON as a plain string.
package main

import "github.com/parnurzeal/gorequest"

func main() {
	request := gorequest.New()
	resp, body, errs := request.Post("http://example.com").
		Set("Notes", "gorequest is coming!").
		Send(`{"Foo":"Bar","Blah":2}`).
		End()
	_, _, _ = resp, body, errs // results unused in this snippet
}
I don't care what the JSON is, and I don't need to unmarshal it into any Go structs or anything of the sort; it's fine for it to remain a string and be passed along to the request completely untouched.
Everything I've seen online wants to unmarshal the JSON into Go structs, which is fine if you care about what is actually in the JSON, but in my case that seems like unnecessary overhead.
How would I accomplish something like this? It seems pretty simple, but none of the existing JSON libraries for Go seem able to do it.
Thanks.
Upvotes: 1
Views: 1577
Reputation: 997
Here is what I was thinking of as a solution; does this look sane?
package main

import (
	"bytes"
	"encoding/csv"
	"flag"
	"fmt"
	"os"
	"sync"

	"github.com/parnurzeal/gorequest"
)

// processLine turns the CSV headers plus one data line into a one-element
// JSON array and POSTs it. Note: the values are not JSON-escaped, so this
// only works if they contain no quotes or backslashes.
func processLine(headers []string, line []string) {
	var buffer bytes.Buffer
	buffer.WriteString("[{")
	comma := ""
	for i := range headers {
		fmt.Fprintf(&buffer, "%s\"%s\":\"%s\"", comma, headers[i], line[i])
		comma = ","
	}
	buffer.WriteString("}]")

	request := gorequest.New()
	_, _, errs := request.Post("http://www.something.com").
		Set("Content-Type", "application/json").
		Set("Accept", "application/json").
		Send(buffer.String()).End()
	if len(errs) > 0 {
		fmt.Println(errs)
	}
}

func main() {
	file := flag.String("file", "", "Filename?")
	flag.Parse()
	if *file == "" {
		fmt.Println("No file specified. :-(")
		os.Exit(1)
	}

	csvFile, err := os.Open(*file)
	if err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	defer csvFile.Close()

	reader := csv.NewReader(csvFile)

	var wg sync.WaitGroup
	var headers []string
	for i := 0; ; i++ {
		line, err := reader.Read()
		if err != nil {
			break
		}
		if i == 0 {
			headers = line
			continue
		}
		wg.Add(1)
		go func(headers, line []string) {
			defer wg.Done()
			processLine(headers, line)
		}(headers, line)
		if i%100 == 0 {
			fmt.Printf("%v records processed.\n", i)
		}
	}
	wg.Wait() // make sure in-flight requests finish before main exits
}
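One caveat with building the JSON body by hand: a value containing a quote or backslash will produce invalid JSON. If that can happen in your data, a small helper built on json.Marshal keeps the escaping correct without declaring any structs. This is only a sketch; buildBody is my own name, not part of any library, and it assumes an "encoding/json" import in the program above.
// buildBody marshals one CSV line into a one-element JSON array of an
// object, letting encoding/json handle all escaping.
func buildBody(headers, line []string) (string, error) {
	obj := make(map[string]string, len(headers))
	for i, h := range headers {
		obj[h] = line[i]
	}
	b, err := json.Marshal([]map[string]string{obj})
	if err != nil {
		return "", err
	}
	return string(b), nil
}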
Upvotes: 0
Reputation: 12033
You are probably looking for json.RawMessage.
For example:
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	txt := []byte(`
	[
		{"key1" : "value1" },
		{"key2" : "value2" }
	]`)

	msg := []json.RawMessage{}
	err := json.Unmarshal(txt, &msg)
	if err != nil {
		log.Fatal(err)
	}

	for _, c := range msg {
		fmt.Printf("%s\n", string(c))
	}
}
Note that the redundant white space separating the key/value pairs in the example is intentional: you will see that it is preserved in the output.
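Since each json.RawMessage element is itself valid JSON text, it can be handed straight to gorequest's Send as a string. A minimal sketch tying this back to the question (the URL and input literal are placeholders of mine):
package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/parnurzeal/gorequest"
)

func main() {
	txt := []byte(`[{"Foo":"Bar","Blah":2}]`)

	msg := []json.RawMessage{}
	if err := json.Unmarshal(txt, &msg); err != nil {
		log.Fatal(err)
	}

	// Each element is untouched JSON text, so it can be forwarded as-is.
	for _, c := range msg {
		request := gorequest.New()
		_, body, errs := request.Post("http://example.com").
			Set("Content-Type", "application/json").
			Send(string(c)).
			End()
		if len(errs) > 0 {
			log.Println(errs)
			continue
		}
		fmt.Println(body)
	}
}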
Alternatively, even if you don't care about the exact structure, you can still poke at it dynamically by decoding into an interface{} variable. See the JSON and Go document for a running example of this, under the "Generic JSON with interface{}" section.
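A minimal sketch of that generic approach (the input literal here is my own, just for illustration):
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	txt := []byte(`[{"key1":"value1"},{"key2":"value2"}]`)

	// Decode into a generic value: each array element becomes a
	// map[string]interface{} with no struct definitions required.
	var v []interface{}
	if err := json.Unmarshal(txt, &v); err != nil {
		log.Fatal(err)
	}
	for _, elem := range v {
		fmt.Printf("%v\n", elem)
	}
}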
If we are trying to do something like a streaming approach, we may attempt something custom with the io.Reader. The JSON parser assumes you can represent everything in memory at once; that assumption may not hold in your situation, so we have to break a few things. Perhaps we might manually consume bytes from the io.Reader until we eat the leading [ character, and then repeatedly call Decode on a json.Decoder wrapped around the rest of the io.Reader. Something like this:
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
)

func main() {
	var txt io.Reader = bytes.NewBufferString(`
	[
		{"key1" : "value1" },
		{"key2" : "value2" }
	]`)

	buf := make([]byte, 1)
	for {
		_, err := txt.Read(buf)
		if err != nil {
			log.Fatal(err)
		}
		if buf[0] == '[' {
			break
		}
	}

	for {
		decoder := json.NewDecoder(txt)
		msg := json.RawMessage{}
		err := decoder.Decode(&msg)
		if err != nil {
			break
		}
		fmt.Printf("I see: %s\n", string(msg))

		txt = decoder.Buffered()
		for {
			_, err := txt.Read(buf)
			if err != nil {
				log.Fatal(err)
			}
			if buf[0] == ',' || buf[0] == ']' {
				break
			}
		}
	}
}
This code is severely kludgy and non-obvious. I also don't think it's a good idea. If you have to deal with this in a streaming fashion, then JSON is likely not a good serialization format for this scenario. If you have control over the input, then you should consider changing it so it's more amenable to a streaming approach: hacks like what we're doing here are a bad smell that the input is in the wrong shape.
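That said, if a sufficiently recent Go release is available to you (an assumption on my part), json.Decoder also exposes Token and More, which make a streaming walk over a top-level array much less kludgy. A minimal sketch, reusing the same input as above:
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	txt := bytes.NewBufferString(`
	[
		{"key1" : "value1" },
		{"key2" : "value2" }
	]`)

	decoder := json.NewDecoder(txt)

	// Consume the opening '[' token.
	if _, err := decoder.Token(); err != nil {
		log.Fatal(err)
	}

	// Decode one element at a time while more array elements remain.
	for decoder.More() {
		var msg json.RawMessage
		if err := decoder.Decode(&msg); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("I see: %s\n", string(msg))
	}

	// Consume the closing ']' token.
	if _, err := decoder.Token(); err != nil {
		log.Fatal(err)
	}
}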
Upvotes: 3