Reputation: 31
When I extract the small dataset from Snowflake, it works with no issues. But when I try to extract the large dataset from Snowflake using the Python Snowflake connector, it throws an OperationalError.
Any help is appreciated.
Error: OperationalError: (snowflake.connector.errors.OperationalError) 250003: Failed to get the response. Hanging? method: get, url: https://sfc-oh-ds1-customer-stage.s3.amazonaws.com/cyok-s-ohss0400/results/0198120b-0077-b3d9-0000-0661009d1726_0/main/data_0_0_1?x-amz-server-side-encryption-customer-algorithm=AES256&response-content-encoding=gzip&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20201105T174742Z&X-Amz-SignedHeaders=host&X-Amz-Expires=86399&X-Amz-Credential=AKIAZ767M53OOUHERYED%2F20201105%2Fus-east-2%2Fs3%2Faws4_request&X-Amz-Signature=047fa367aeea73a6ef66de485a3d5dccc229707a72c21096b01b180fd3369c7d (Background on this error at: http://sqlalche.me/e/e3q8)
Code:
import pandas as pd
from snowflake.sqlalchemy import URL
from sqlalchemy import create_engine

snow_conn1 = URL(
    account="XXX.aws",
    user="XXX",
    password="XXX",
    insecure_mode=True,
    role="SYSADMIN",
    warehouse="XX",
    database="XX",
    schema="XX")
engine1 = create_engine(snow_conn1)
with engine1.connect() as con:
    mtd_query1 = "select * from information_schema.tables"
    df1 = pd.read_sql(mtd_query1, con)
Upvotes: 2
Views: 694
Reputation: 41
I recently ran into this same exact issue. At some point, when the data pull becomes too large, Snowflake will redirect the client to pull the data directly from S3. In this case, it seems like your client is able to reach the Snowflake API but not the bucket that is actually storing the data. You'll need to update your firewall settings or set up a proxy server in order to reach those S3 addresses.
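If you go the proxy route, the connector honors the standard proxy environment variables, so you can point it at your proxy before connecting. This is just a minimal sketch; the proxy host and port below are placeholders for your own server:

```python
import os

# Route the connector's HTTPS traffic, including the presigned S3
# result downloads, through a proxy. Replace host/port with your
# own proxy server (placeholder values shown).
os.environ["HTTP_PROXY"] = "http://proxy.example.com:8080"
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"
# Optional: hosts that should bypass the proxy.
os.environ["NO_PROXY"] = "localhost"
```

Set these before creating the engine so both the Snowflake API calls and the S3 downloads go through the proxy.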
Upvotes: 0