Add documentation for writing polars dataframes to bigquery (#391)
Co-authored-by: Stijn de Gooijer <[email protected]>
noam-delfina and stinodego authored Sep 13, 2023
1 parent 053f409 commit 8193e78
Showing 2 changed files with 21 additions and 1 deletion.
20 changes: 20 additions & 0 deletions docs/src/python/user-guide/io/bigquery.py
@@ -15,4 +15,24 @@
df = pl.from_arrow(rows.to_arrow())
# --8<-- [end:read]
# --8<-- [start:write]
import io  # provides the in-memory BytesIO buffer used below

from google.cloud import bigquery

client = bigquery.Client()

# Write dataframe to stream as parquet file; does not hit disk
with io.BytesIO() as stream:
    df.write_parquet(stream)
    stream.seek(0)
    job = client.load_table_from_file(
        stream,
        destination='tablename',
        project='projectname',
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.PARQUET,
        ),
    )
job.result()  # Waits for the job to complete
# --8<-- [end:write]
"""
2 changes: 1 addition & 1 deletion docs/user-guide/io/bigquery.md
@@ -16,4 +16,4 @@ We can load a query into a `DataFrame` like this:

## Write

--8<-- "docs/_build/snippets/under_construction.md"
{{code_block('user-guide/io/bigquery','write',[])}}
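
As a quick sanity check of the new write snippet (not part of this commit), the freshly loaded table can be queried back into Polars using the same pattern as the existing read example. This is a minimal sketch; the fully qualified `projectname.dataset.tablename` identifier below is a hypothetical placeholder standing in for the `project` and `destination` values used in the write example.

import polars as pl
from google.cloud import bigquery

client = bigquery.Client(project="projectname")  # hypothetical project id

# Query the just-loaded table and pull the result back into a Polars DataFrame
rows = client.query(
    "SELECT * FROM `projectname.dataset.tablename` LIMIT 10"  # hypothetical table id
).result()  # Waits for the query to finish
df_check = pl.from_arrow(rows.to_arrow())
print(df_check)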
