import logging
import os

import pandas as pd
import snowflake.connector

for logger_name in ('snowflake.connector',):
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    ch.setFormatter(logging.Formatter('%(asctime)s - %(threadName)s %(filename)s:%(lineno)d - %(funcName)s() - %(levelname)s - %(message)s'))
    logger.addHandler(ch)

# Connect to Snowflake
conn = snowflake.connector.connect(
    user="***",
    password='',
    account='***',
    warehouse='***',
    authenticator='externalbrowser'
)

# Create a cursor object
cursor = conn.cursor()

# Execute the query
cursor.execute("SELECT * FROM my_table")

# Fetch all the results into a pandas DataFrame
df = cursor.fetch_pandas_all()

# Close the cursor and connection
cursor.close()
conn.close()

# Print the DataFrame
df.head()
What did you expect to see?
I have a relatively large Snowflake table (4,155,216 rows and 177 columns).
I want to pull the entire table into a Pandas dataframe.
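For scale, a rough back-of-envelope estimate (assuming a flat 8 bytes per cell, which undercounts string/object columns) already puts the full result set well into multi-gigabyte territory:

```python
# Rough size estimate for the full result set, assuming a flat
# 8 bytes per cell (real usage is higher for string/object columns).
rows, cols = 4_155_216, 177
approx_bytes = rows * cols * 8
print(f"~{approx_bytes / 2**30:.1f} GiB")  # roughly 5.5 GiB at 8 bytes/cell
```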
From the Snowflake UI, I can successfully do
SELECT * FROM my_table;
When running the same query from a Jupyter notebook (see above), I was expecting the df to contain the data from the table.
Instead, the script runs for a bit and then hangs. The Python kernel dies and needs to be restarted.
I get the same crash with anything but the smallest samples from that table. For example, a LIMIT 1000 works fine, but a LIMIT 10000 runs into the same issue.
I attach the debug logs from the code above (without the initial part of the logs to remove confidential information).
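As a possible workaround sketch: the connector also exposes `cursor.fetch_pandas_batches()`, which yields one DataFrame per result chunk, so each batch can be processed and released instead of materializing all 4M rows at once. The generator below stands in for a real cursor so the example is self-contained:

```python
import pandas as pd

def count_rows_in_batches(batches):
    # Consume one DataFrame at a time, as cursor.fetch_pandas_batches()
    # would yield them, so only one batch is held in memory at once.
    total = 0
    for batch in batches:
        total += len(batch)  # replace with real per-batch processing
    return total

# Simulated batches standing in for cursor.fetch_pandas_batches()
simulated = (pd.DataFrame({"x": range(i, i + 3)}) for i in (0, 3, 6))
print(count_rows_in_batches(simulated))  # 9
```

Whether batching actually avoids the crash depends on whether the hang is memory-driven; that is an assumption, not something the logs confirm.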
github-actions bot changed the title from "Querying table works from Snowflake UI, but _fetch_pandas_all hangs and crashes kernel" to "SNOW-1556467: Querying table works from Snowflake UI, but _fetch_pandas_all hangs and crashes kernel" on Jul 25, 2024.
Python version
Python 3.10.2 (main, Apr 4 2022, 11:53:00) [Clang 13.1.6 (clang-1316.0.21.2)]
Operating system and processor architecture
macOS-14.5-arm64-arm-64bit
Installed packages
connector_logs2.txt
Can you set logging to DEBUG and collect the logs?