Commit a0580e4

**Switch default-python to use pyproject.toml & uv & dynamic_version in dev [hatchling backend] (#3042)**

## Changes

- Update the default-python template to use pyproject.toml and uv (with hatchling).
- Drop the dynamic-version logic from setup.py and use the `dynamic_version` attribute on artifacts instead (only in the dev target, and only on non-serverless clusters, since serverless has fixed the issue with versions not being updated in the platform). See #2427 and #2520.

## Why

Modern Python is based around pyproject.toml, not setup.py. Using the DAB-provided `dynamic_version` attribute keeps pyproject.toml simple and does not constrain which build backend users can choose.

Note: I also considered the uv_build backend (#3004), but it seems too early at the moment, and hatchling is better tested with DABs.

## Tests

Existing tests.

1 parent 98e1449 commit a0580e4

27 files changed: 240 additions & 258 deletions

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -3,6 +3,7 @@
 ## Release v0.258.0
 
 ### Notable Changes
+* Switch default-python template to use pyproject.toml + dynamic\_version in dev target. uv is now required. ([#3042](https://github.com/databricks/cli/pull/3042))
 
 ### Dependency updates
 * Upgraded TF provider to 1.84.0 ([#3151](https://github.com/databricks/cli/pull/3151))
```

acceptance/bundle/templates/default-python/classic/out.compare-vs-serverless.diff

Lines changed: 14 additions & 0 deletions
```diff
@@ -1,3 +1,17 @@
+--- [TESTROOT]/bundle/templates/default-python/classic/../serverless/output/my_default_python/databricks.yml
++++ output/my_default_python/databricks.yml
+@@ -25,4 +25,11 @@
+       host: [DATABRICKS_URL]
+ 
++    presets:
++      # Set dynamic_version: true on all artifacts of type "whl".
++      # This makes "bundle deploy" add a timestamp to wheel's version before uploading,
++      # new wheel takes over the previous installation even if actual wheel version is unchanged.
++      # See https://docs.databricks.com/aws/en/dev-tools/bundles/settings
++      artifacts_dynamic_version: true
++
+   prod:
+     mode: production
 --- [TESTROOT]/bundle/templates/default-python/classic/../serverless/output/my_default_python/resources/my_default_python.job.yml
 +++ output/my_default_python/resources/my_default_python.job.yml
 @@ -17,4 +17,5 @@
```

acceptance/bundle/templates/default-python/classic/output/my_default_python/README.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -4,6 +4,8 @@ The 'my_default_python' project was generated by using the default-python templa
 
 ## Getting started
 
+0. Install UV: https://docs.astral.sh/uv/getting-started/installation/
+
 1. Install the Databricks CLI from https://docs.databricks.com/dev-tools/cli/databricks-cli.html
 
 2. Authenticate to your Databricks workspace, if you have not done so already:
```
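The README steps above now assume both uv and the Databricks CLI are installed. A quick preflight sketch to check that before running `bundle deploy` (illustrative only; the install instructions live at the links in the README):

```shell
# Check that the tools required by the "Getting started" steps are on PATH.
for tool in uv databricks; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: missing (see the install links above)"
  fi
done
```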

acceptance/bundle/templates/default-python/classic/output/my_default_python/databricks.yml

Lines changed: 12 additions & 0 deletions
```diff
@@ -4,6 +4,11 @@ bundle:
   name: my_default_python
   uuid: [UUID]
 
+artifacts:
+  python_artifact:
+    type: whl
+    build: uv build --wheel
+
 include:
   - resources/*.yml
   - resources/*/*.yml
@@ -19,6 +24,13 @@ targets:
     workspace:
       host: [DATABRICKS_URL]
 
+    presets:
+      # Set dynamic_version: true on all artifacts of type "whl".
+      # This makes "bundle deploy" add a timestamp to wheel's version before uploading,
+      # new wheel takes over the previous installation even if actual wheel version is unchanged.
+      # See https://docs.databricks.com/aws/en/dev-tools/bundles/settings
+      artifacts_dynamic_version: true
+
   prod:
     mode: production
     workspace:
```
acceptance/bundle/templates/default-python/classic/output/my_default_python/pyproject.toml

Lines changed: 41 additions & 0 deletions

```diff
@@ -0,0 +1,41 @@
+[project]
+name = "my_default_python"
+version = "0.0.1"
+authors = [{ name = "[USERNAME]" }]
+requires-python = ">= 3.11"
+
+[project.optional-dependencies]
+dev = [
+    "pytest",
+
+    # Code completion support for DLT, also install databricks-connect
+    "databricks-dlt",
+
+    # databricks-connect can be used to run parts of this project locally.
+    # See https://docs.databricks.com/dev-tools/databricks-connect.html.
+    #
+    # Note, databricks-connect is automatically installed if you're using Databricks
+    # extension for Visual Studio Code
+    # (https://docs.databricks.com/dev-tools/vscode-ext/dev-tasks/databricks-connect.html).
+    #
+    # To manually install databricks-connect, uncomment the line below to install a version
+    # of db-connect that corresponds to the Databricks Runtime version used for this project.
+    # See https://docs.databricks.com/dev-tools/databricks-connect.html
+    # "databricks-connect>=15.4,<15.5",
+]
+
+[tool.pytest.ini_options]
+pythonpath = "src"
+testpaths = [
+    "tests",
+]
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["src/my_default_python"]
+
+[project.scripts]
+main = "my_default_python.main:main"
```

acceptance/bundle/templates/default-python/classic/output/my_default_python/pytest.ini

Lines changed: 0 additions & 3 deletions
This file was deleted.

acceptance/bundle/templates/default-python/classic/output/my_default_python/requirements-dev.txt

Lines changed: 0 additions & 29 deletions
This file was deleted.

acceptance/bundle/templates/default-python/classic/output/my_default_python/setup.py

Lines changed: 0 additions & 41 deletions
This file was deleted.
Lines changed: 0 additions & 1 deletion
```diff
@@ -1 +0,0 @@
-__version__ = "0.0.1"
```

acceptance/bundle/templates/default-python/combinations/check_output.py

Lines changed: 11 additions & 10 deletions
```diff
@@ -1,6 +1,7 @@
 #!/usr/bin/env python3
 import sys
 import os
+import re
 import subprocess
 
 SERVERLESS = os.environ["SERVERLESS"] == "yes"
@@ -11,7 +12,7 @@
     sys.exit(f"SKIP_TEST SERVERLESS=yes but TEST_METASTORE_ID is empty in this env {CLOUD_ENV=}")
 
 BUILDING = "Building python_artifact"
-UPLOADING = "Uploading dist/"
+UPLOADING_WHL = re.compile("Uploading .*whl")
 STATE = "Updating deployment state"
 
 
@@ -21,7 +22,7 @@ def is_printable_line(line):
         return False
 
     # only shown when include_python=yes
-    if line.startswith(UPLOADING):
+    if UPLOADING_WHL.match(line):
         return False
 
     # not shown when all settings are equal to "no"
@@ -33,19 +34,19 @@ def is_printable_line(line):
 
 p = subprocess.run(sys.argv[1:], stdout=subprocess.PIPE, stderr=subprocess.PIPE, encoding="utf-8")
 try:
-    assert p.returncode == 0
+    assert p.returncode == 0, p.returncode
     assert p.stdout == ""
+    if INCLUDE_PYTHON:
+        assert BUILDING in p.stderr, BUILDING
+        assert UPLOADING_WHL.search(p.stderr), UPLOADING_WHL
+    else:
+        assert BUILDING not in p.stderr, BUILDING
+        assert not UPLOADING_WHL.search(p.stderr), UPLOADING_WHL
+
     for line in p.stderr.strip().split("\n"):
         if is_printable_line(line):
             print(line.strip())
 
-    if INCLUDE_PYTHON:
-        assert BUILDING in p.stderr
-        assert UPLOADING in p.stderr
-    else:
-        assert BUILDING not in p.stderr
-        assert UPLOADING not in p.stderr
-
 except:
     print(f"STDOUT: {len(p.stdout)} chars")
     if p.stdout:
```
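The key change in the test helper above replaces the fixed "Uploading dist/" prefix with a regex, since upload log lines now reference the built .whl file. A standalone illustration of the `match`/`search` split the updated script relies on (the sample log lines below are invented for this sketch, not captured CLI output):

```python
import re

UPLOADING_WHL = re.compile("Uploading .*whl")

# Hypothetical log lines resembling what the script filters.
line = "Uploading my_default_python-0.0.1+20250102030405-py3-none-any.whl..."
stderr = "Building python_artifact\n" + line + "\nUpdating deployment state\n"

# match() anchors at the start of a single line: used to filter output lines.
assert UPLOADING_WHL.match(line)
# search() scans the whole stderr blob: used for the presence assertions.
assert UPLOADING_WHL.search(stderr)
# Non-upload lines do not match.
assert not UPLOADING_WHL.match("Updating deployment state")
print("regex behaves as the updated check_output.py expects")
```

Note also that the presence assertions were moved before the print loop and given failure messages (e.g. `assert p.returncode == 0, p.returncode`), so a failing run reports what was expected instead of a bare AssertionError.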
