Vzzarr

Reputation: 5700

pyWriteDynamicFrame: Unrecognized scheme null; expected s3, s3n, or s3a [Glue to Redshift]

While executing a Glue Job, after the necessary transformations I am writing the results of my Spark DataFrame to a Redshift table like this:

dynamic_df = DynamicFrame.fromDF(df, glue_context, "dynamic_df")

glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=dynamic_df, catalog_connection=args['catalog_connection'],
    connection_options={"dbtable": args['dbschema'] + "." + args['dbtable'], "database": args['database']},
    transformation_ctx="write_my_df")

But I am receiving this exception:

19/08/23 14:29:31 ERROR __main__: Traceback (most recent call last):
  File "/mnt/yarn/usercache/root/appcache/application_1572375324962_0001/container_1572375324962_0001_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/mnt/yarn/usercache/root/appcache/application_1572375324962_0001/container_1572375324962_0001_01_000001/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o190.pyWriteDynamicFrame.
: java.lang.IllegalArgumentException: Unrecognized scheme null; expected s3, s3n, or s3a

What am I doing wrong? How can I solve it?

Upvotes: 2

Views: 5381

Answers (1)

Vzzarr

Reputation: 5700

I was missing the parameter redshift_tmp_dir in the function from_jdbc_conf, as reported in the documentation. Glue stages the data in a temporary S3 directory before loading it into Redshift, which is why the error complains about a missing s3/s3n/s3a scheme.

So the function call now is:

glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=dynamic_df, catalog_connection=args['catalog_connection'],
    connection_options={"dbtable": args['dbschema'] + "." + args['dbtable'], "database": args['database']},
    redshift_tmp_dir="s3://my_bucket/my/location/", transformation_ctx="write_my_df")
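As a side note, rather than hard-coding the bucket, the temporary directory can be read from the job arguments. Below is a minimal sketch, assuming the job was started with Glue's default --TempDir parameter and with the custom arguments used above:

import sys
from awsglue.utils import getResolvedOptions

# Resolve job arguments; Glue passes --TempDir to every job by default
args = getResolvedOptions(
    sys.argv,
    ['TempDir', 'catalog_connection', 'dbschema', 'dbtable', 'database'])

glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=dynamic_df,
    catalog_connection=args['catalog_connection'],
    connection_options={"dbtable": args['dbschema'] + "." + args['dbtable'],
                        "database": args['database']},
    redshift_tmp_dir=args['TempDir'],  # S3 staging area for the Redshift load
    transformation_ctx="write_my_df")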

Upvotes: 6
