From 5e32965b5e4281cd24376d34addb876cee01b029 Mon Sep 17 00:00:00 2001
From: FAQ Bot
Date: Sun, 8 Mar 2026 01:33:07 +0000
Subject: [PATCH] NEW: Why does casting TIMESTAMP_NTZ to BIGINT fail in Spark,
 and how can I co

---
 ...c2c5634e36_cast-timestamp-ntz-to-bigint-spark.md | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 _questions/data-engineering-zoomcamp/module-6/062_c2c5634e36_cast-timestamp-ntz-to-bigint-spark.md

diff --git a/_questions/data-engineering-zoomcamp/module-6/062_c2c5634e36_cast-timestamp-ntz-to-bigint-spark.md b/_questions/data-engineering-zoomcamp/module-6/062_c2c5634e36_cast-timestamp-ntz-to-bigint-spark.md
new file mode 100644
index 0000000..c7f6bf9
--- /dev/null
+++ b/_questions/data-engineering-zoomcamp/module-6/062_c2c5634e36_cast-timestamp-ntz-to-bigint-spark.md
@@ -0,0 +1,13 @@
+---
+id: c2c5634e36
+question: Why does casting TIMESTAMP_NTZ to BIGINT fail in Spark, and how can I convert
+  it to a numeric value?
+sort_order: 62
+---
+
+TIMESTAMP_NTZ cannot be cast directly to numeric types like BIGINT in Spark. To convert to a numeric representation (epoch seconds), use the to_unix_timestamp function.
+
+```sql
+SELECT to_unix_timestamp(tpep_pickup_datetime)
+FROM yellow_2025_11
+```
\ No newline at end of file
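
Reviewer note: the answer added by this patch can be checked quickly in spark-sql. A minimal sketch follows — the table (`yellow_2025_11`) and column (`tpep_pickup_datetime`) are the ones from the patch's own example, the failing direct cast is shown commented out, and the exact error text varies by Spark version and ANSI setting:

```sql
-- Fails: Spark defines no direct TIMESTAMP_NTZ -> BIGINT cast
-- SELECT CAST(tpep_pickup_datetime AS BIGINT) FROM yellow_2025_11;

-- Works: epoch seconds via to_unix_timestamp, as in the patch
SELECT to_unix_timestamp(tpep_pickup_datetime) AS pickup_epoch_s
FROM yellow_2025_11;
```

One caveat worth verifying on the target Spark version: a TIMESTAMP_NTZ value carries no zone, so its mapping onto epoch seconds depends on the session time zone configuration (`spark.sql.session.timeZone`).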