Pirama

Reputation: 11

Write data from a GCS file into Cloud SQL using the Dataflow JdbcIO API

How can I write data from a GCS file, i.e. "gs://***/sampleBigtable.csv", to an RDBMS database? Could you please share an example?

pipeline
    .apply(TextIO.read().from("gs://***/sampleBigtable.csv"))
    .apply(JdbcIO.<String>write()
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create(
                    "org.postgresql.Driver", "jdbc:postgresql://***:5432/test")
                .withUsername("**")
                .withPassword("password10"))
        .withStatement("insert into person values(?, ?)")
        .withPreparedStatementSetter((element, query) -> {
            // split each CSV line into its fields and bind them to the statement
            String[] fields = element.split(",");
            query.setInt(1, Integer.parseInt(fields[0]));
            query.setString(2, fields[1]);
        })
    );

Upvotes: 0

Views: 2442

Answers (1)

Emil Gi

Reputation: 1168

I understand from your question that you don't need to enrich the data in the file or manipulate it in other ways during the transfer. In that case, a simpler solution would be to use one of the Cloud SQL CSV import options instead of a Dataflow pipeline.

Make sure that the file is in valid CSV format, create the target table in the database, allow the Cloud SQL service account access to the bucket, and run the following command in Cloud Shell:

gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
    --database=[DATABASE_NAME] --table=[TABLE_NAME]
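Granting the Cloud SQL service account access to the bucket can itself be done from Cloud Shell. A minimal sketch, using the same bracketed placeholder names as above (substitute your own instance and bucket names):

```shell
# Look up the service account email of the Cloud SQL instance
gcloud sql instances describe [INSTANCE_NAME] \
    --format="value(serviceAccountEmailAddress)"

# Grant that service account read access to the bucket,
# substituting the email printed by the previous command
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectViewer \
    gs://[BUCKET_NAME]
```

After the import finishes, the access grant can be revoked with `gsutil iam ch -d` if the bucket should not remain readable by the instance.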

Upvotes: 1
