@@ -59,7 +59,8 @@ def _loader(filename: str):
 def _enable_fallback_compute():
     """Enable serverless compute if no compute is specified."""
     conf = WorkspaceClient().config
-    if conf.serverless_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"):

Contributor:

I worry this line is pretty surprising to maintainers, since most users will run with the SDK referenced in pyproject.toml. Can you add a comment noting that this is here for older Databricks SDK versions?

+    has_serverles_compute_id = hasattr(conf, "serverless_compute_id") and conf.serverless_compute_id
+    if has_serverles_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"):

Contributor, on lines +62 to +63:

Suggested change (fixes the serverles → serverless typo):
-has_serverles_compute_id = hasattr(conf, "serverless_compute_id") and conf.serverless_compute_id
-if has_serverles_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"):
+has_serverless_compute_id = hasattr(conf, "serverless_compute_id") and conf.serverless_compute_id
+if has_serverless_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"):

         return

     url = "https://docs.databricks.com/dev-tools/databricks-connect/cluster-config"
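
Taken together, the two review comments above imply the block would end up roughly like the following. This is only a sketch, assuming the author adopts both the requested explanatory comment and the suggested spelling fix:

def _enable_fallback_compute():
    """Enable serverless compute if no compute is specified."""
    conf = WorkspaceClient().config
    # Older Databricks SDK versions do not expose `serverless_compute_id`
    # on the Config object, so probe for it with hasattr() first.
    has_serverless_compute_id = hasattr(conf, "serverless_compute_id") and conf.serverless_compute_id
    if has_serverless_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"):
        return
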
@@ -83,6 +84,8 @@ def _allow_stderr_output(config: pytest.Config):
 def pytest_configure(config: pytest.Config):
     """Configure pytest session."""
     with _allow_stderr_output(config):
+        src_path = pathlib.Path(__file__).parent.parent / "src"
+        sys.path.insert(0, str(src_path))

Contributor:

This line doesn't seem right to me; could you look at alternative options for your test runner instead? Normally imports in tests just work out of the box, as seen at https://github.com/databricks/bundle-examples/blob/333bab00d7dc17aa3d2491f3a9d1883590984793/default_python/tests/sample_taxis_test.py#L3. There must be something wrong with your environment; maybe that can be fixed in some other way?
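
For context, a common alternative to mutating sys.path in conftest.py (a sketch only, not something proposed in this thread) is pytest's built-in pythonpath ini option, available since pytest 7.0. Assuming the project keeps its sources under src/ and configures pytest via pyproject.toml, it would look like:

[tool.pytest.ini_options]
# Prepend src/ to sys.path during collection and test runs,
# replacing the runtime sys.path.insert() in conftest.py.
pythonpath = ["src"]

This keeps the path setup declarative instead of patching sys.path at runtime.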

         _enable_fallback_compute()

         # Initialize Spark session eagerly, so it is available even when