From d18c45cefd90dd34e68d56fe779366e3bae36bda Mon Sep 17 00:00:00 2001
From: FAQ Bot
Date: Mon, 23 Mar 2026 19:54:16 +0000
Subject: [PATCH] NEW: How can I sync data from PostgreSQL to BigQuery for
 analytical workloads

---
 ..._917b0b0fb5_postgresql-to-bigquery-sync.md | 31 +++++++++++++++++++
 1 file changed, 31 insertions(+)
 create mode 100644 _questions/data-engineering-zoomcamp/module-3/025_917b0b0fb5_postgresql-to-bigquery-sync.md

diff --git a/_questions/data-engineering-zoomcamp/module-3/025_917b0b0fb5_postgresql-to-bigquery-sync.md b/_questions/data-engineering-zoomcamp/module-3/025_917b0b0fb5_postgresql-to-bigquery-sync.md
new file mode 100644
index 0000000..7acb90e
--- /dev/null
+++ b/_questions/data-engineering-zoomcamp/module-3/025_917b0b0fb5_postgresql-to-bigquery-sync.md
@@ -0,0 +1,31 @@
+---
+id: 917b0b0fb5
+question: How can I sync data from PostgreSQL to BigQuery for analytical workloads?
+sort_order: 25
+---
+
+Overview:
+You can sync data from PostgreSQL to BigQuery by extracting tables into DataFrames and loading them into BigQuery datasets.
+
+Steps:
+1. Connect to PostgreSQL from a Python script
+2. Read tables into pandas DataFrames
+3. Use the Google Cloud BigQuery client to load the data
+
+```python
+from google.cloud import bigquery
+
+client = bigquery.Client()
+table_id = 'project.dataset.table'  # replace with your project, dataset, and table
+
+# Assuming df is a pandas DataFrame containing your data
+job = client.load_table_from_dataframe(df, table_id)
+job.result()  # wait for the load job to finish
+```
+
+Prerequisites:
+- your service account credentials are set (e.g. via GOOGLE_APPLICATION_CREDENTIALS)
+- the dataset exists in BigQuery
+- the dataset location (US or EU) matches the location used in your queries
+
+This approach lets you keep ingestion local while using BigQuery for scalable analytics.
\ No newline at end of file
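
Note on the patch above: the FAQ answer shows step 3 (loading a DataFrame into BigQuery) but leaves steps 1 and 2, the extract side, as prose. The sketch below fills that gap. It uses an in-memory SQLite database as a stand-in for PostgreSQL so it runs anywhere; against a real PostgreSQL instance you would swap the connection for a psycopg2 or SQLAlchemy one, and the `trips` table and its columns here are made-up illustration data, not from the course.

```python
import sqlite3
import pandas as pd

# SQLite in-memory stands in for PostgreSQL so this sketch is self-contained.
# With a real database, replace this with e.g. a SQLAlchemy engine for
# "postgresql://user:password@host:5432/dbname".
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trips (id INTEGER, fare REAL);
    INSERT INTO trips VALUES (1, 12.5), (2, 7.0);
""")

# Step 2 from the answer: read a table into a pandas DataFrame.
df = pd.read_sql_query("SELECT * FROM trips", conn)

# df is now ready for step 3: client.load_table_from_dataframe(df, table_id).
print(len(df))  # number of extracted rows
```

For large tables, `pd.read_sql_query` also accepts a `chunksize` argument, which yields DataFrames in batches so each chunk can be loaded separately instead of holding the whole table in memory.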