ozn

Reputation: 2238

Import data into PostgreSQL and export to CSV

I am trying to import and export CSV data with PostgreSQL (where the database is on a remote host). Normally I would use the psql command to do \copy <table> from <local path> ... and \copy <table> to <local path> ..., but I need to be able to do it via Go, where I don't have access to a shell or am on systems that don't have psql installed.

The data itself is expected to be pretty light (maybe < 2 MB altogether), so I am trying to avoid implementing any structs/schema or tracking the columns in the tables. When importing into the DB, I want the library/code to infer the schema of the table and push the data to the tables.

Any suggestions on how to implement this? I am not sure whether Go's database/sql or the pgx or pq packages allow this without specifying columns. Any advice?

Edit:

I ended up using https://github.com/joho/sqltocsv for DB export, which is simple enough that I don't have to define any schema/structs.
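For reference, a minimal sketch of the sqltocsv export, assuming the lib/pq driver and placeholder connection details (my_user, remote_host, db_name stand in for the real ones):

package main

import (
    "database/sql"
    "log"

    "github.com/joho/sqltocsv"
    _ "github.com/lib/pq" // registers the "postgres" driver
)

func main() {
    // Placeholder DSN; point this at your remote host.
    db, err := sql.Open("postgres", "postgres://my_user:my_password@remote_host:5432/db_name?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    // No structs/schema needed: sqltocsv takes the column names straight from *sql.Rows.
    rows, err := db.Query("SELECT * FROM table1")
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()

    if err := sqltocsv.WriteFile("table1.csv", rows); err != nil {
        log.Fatal(err)
    }
}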

I don't have the code anymore, but I also tried gorm and realized I would need to define a struct/schema for it.

Upvotes: 1

Views: 3110

Answers (2)

ozn

Reputation: 2238

I found a way to do it with the pgx package (thanks to @Gustavo Kawamoto's suggestion). Here's my import and export:

package main

import (
    "fmt"
    "os"

    "github.com/jackc/pgx"
)

func main() {
    pgxConConfig := pgx.ConnConfig{
        Port:     5432,
        Host:     "remote_host",
        Database: "db_name",
        User:     "my_user",
        Password: "my_password",
    }

    conn, err := pgx.Connect(pgxConConfig)
    if err != nil {
        panic(err)
    }
    defer conn.Close()

    tables := []string{"table1", "table2", "table3"}

    import_dir := "/dir_to_import_from"
    export_dir := "/dir_to_export_to"
    
    for _, t := range tables {
        // Open the CSV to import; keep it open until the importer has read it.
        f, err := os.Open(fmt.Sprintf("%s/table_%s.csv", import_dir, t))
        if err != nil {
            fmt.Println("error opening import file:", err)
            return
        }

        err = importer(conn, f, t)
        f.Close()
        if err != nil {
            break
        }

        fmt.Println("  Done with import and doing export")

        // Create the export target; keep it open until the exporter has written it.
        ef, err := os.OpenFile(fmt.Sprintf("%s/table_%s.csv", export_dir, t), os.O_CREATE|os.O_WRONLY, 0777)
        if err != nil {
            fmt.Println("error opening export file:", err)
            return
        }

        err = exporter(conn, ef, t)
        ef.Close()
        if err != nil {
            break
        }
    }
}

// importer streams a CSV file into the given table with COPY ... FROM STDIN.
func importer(conn *pgx.Conn, f *os.File, table string) error {
    res, err := conn.CopyFromReader(f, fmt.Sprintf("COPY %s FROM STDIN DELIMITER '|' CSV HEADER", table))
    if err != nil {
        return err
    }
    fmt.Println("==> import rows affected:", res.RowsAffected())

    return nil
}

// exporter streams the given table out to a CSV file with COPY ... TO STDOUT.
func exporter(conn *pgx.Conn, f *os.File, table string) error {
    res, err := conn.CopyToWriter(f, fmt.Sprintf("COPY %s TO STDOUT DELIMITER '|' CSV HEADER", table))
    if err != nil {
        return fmt.Errorf("error exporting file: %+v", err)
    }
    fmt.Println("==> export rows affected:", res.RowsAffected())
    return nil
}
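
Note that CopyFromReader and CopyToWriter are the pgx v3 API. As a sketch of the same COPY flow on current pgx (v5), where these calls moved to the underlying PgConn — connection string, table name, and paths below are placeholders:

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/jackc/pgx/v5"
)

func main() {
    ctx := context.Background()

    // Placeholder connection string; adjust for your remote host.
    conn, err := pgx.Connect(ctx, "postgres://my_user:my_password@remote_host:5432/db_name")
    if err != nil {
        panic(err)
    }
    defer conn.Close(ctx)

    // Import: stream the CSV file into the table via COPY FROM STDIN.
    in, err := os.Open("/dir_to_import_from/table_table1.csv")
    if err != nil {
        panic(err)
    }
    defer in.Close()

    tag, err := conn.PgConn().CopyFrom(ctx, in, "COPY table1 FROM STDIN DELIMITER '|' CSV HEADER")
    if err != nil {
        panic(err)
    }
    fmt.Println("import rows affected:", tag.RowsAffected())

    // Export: stream the table out via COPY TO STDOUT.
    out, err := os.Create("/dir_to_export_to/table_table1.csv")
    if err != nil {
        panic(err)
    }
    defer out.Close()

    tag, err = conn.PgConn().CopyTo(ctx, out, "COPY table1 TO STDOUT DELIMITER '|' CSV HEADER")
    if err != nil {
        panic(err)
    }
    fmt.Println("export rows affected:", tag.RowsAffected())
}

The COPY statements themselves are identical to the v3 version; only the Go entry points differ.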

Upvotes: 3

twiny

Reputation: 312

Check out this pkg: https://github.com/chop-dbhi/sql-importer

  1. Automatic table creation
  2. Uniqueness and NOT NULL detection
  3. Support for CSV files wider than 1600 columns (the Postgres limit)

Upvotes: 2
