We have hundreds of dbt models in production and have never experienced this before, until tonight.
A standard table model was created, and then the data was written to it twice.
We noticed this thanks to a unique data-test.
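For context, the check that caught it was a standard dbt uniqueness data test. A minimal sketch of such a test, assuming a hypothetical model `my_table_model` with key column `id`:

```yaml
# models/schema.yml -- model and column names are hypothetical, for illustration only
version: 2

models:
  - name: my_table_model
    columns:
      - name: id
        data_tests:
          - unique
```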
Checking the table history in Unity Catalog, I can see the replace operation followed by two writes.
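For anyone wanting to verify the same symptom, the Delta history of a Unity Catalog table can be inspected like this (catalog, schema, and table names are hypothetical):

```sql
-- One row per committed transaction on the table.
-- For a normal dbt table materialization you would expect a single
-- CREATE OR REPLACE TABLE AS SELECT, not a replace followed by two writes.
DESCRIBE HISTORY my_catalog.my_schema.my_table_model;
```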

Checking the query history, I can see the Replace operation followed by a CANCELED insert, then a FINISHED insert.
Something has gone horribly wrong here, since both inserts have materialized.
Maybe this is not a dbt issue, but a transactional Databricks SQL issue?
Has anyone experienced this?
Has it been addressed in any dbt update or dependent library?
Any ways to mitigate this?
Since the dbt logs were generated on a temporary cluster, I do not have access to them and cannot provide them here.
Environment:
- use_materialization_v2: true
- dbt_version: 1.10.8
- dbt_databricks_version: 1.10.9
- databricks_sql_connector_version: 4.2.4
- Databricks Runtime: 17.3.x-aarch64-photon-scala2.13