
Commit 92c60e3

Bump github.com/databricks/databricks-sdk-go from 0.65.0 to 0.67.0 (#2812)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.65.0 to 0.67.0.

<details>
<summary>Changelog</summary>

*Sourced from [github.com/databricks/databricks-sdk-go's changelog](https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md).*

> ## Release v0.67.0
>
> ### Bug Fixes
>
> * Fixed the deserialization of responses in VectorSearchAPI's `QueryIndex()` method ([#1214](https://redirect.github.com/databricks/databricks-sdk-py/pull/1214)).
>
> ### API Changes
>
> * Added `FutureFeatureDataPath` field for [ml.CreateForecastingExperimentRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/ml#CreateForecastingExperimentRequest).
> * Added `ExcludeColumns` and `IncludeColumns` fields for [pipelines.TableSpecificConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpecificConfig).
> * Added `NetworkCheckControlPlaneFailure`, `NetworkCheckDnsServerFailure`, `NetworkCheckMetadataEndpointFailure`, `NetworkCheckMultipleComponentsFailure`, `NetworkCheckNicFailure`, `NetworkCheckStorageFailure` and `SecretPermissionDenied` enum values for [compute.TerminationReasonCode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#TerminationReasonCode).
> * [Breaking] Changed [vectorsearch.ListValue](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ListValue) to.
> * [Breaking] Changed `PipelineId` field for [pipelines.EditPipeline](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline) to be required.
> * Changed `ConnectionName`, `GatewayStorageCatalog` and `GatewayStorageSchema` fields for [pipelines.IngestionGatewayPipelineDefinition](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionGatewayPipelineDefinition) to be required.
> * [Breaking] Changed `ConnectionName`, `GatewayStorageCatalog` and `GatewayStorageSchema` fields for [pipelines.IngestionGatewayPipelineDefinition](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionGatewayPipelineDefinition) to be required.
> * [Breaking] Changed `Kind` field for [pipelines.PipelineDeployment](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment) to be required.
> * Changed `Kind` field for [pipelines.PipelineDeployment](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment) to be required.
> * [Breaking] Changed `DestinationCatalog`, `DestinationSchema` and `SourceUrl` fields for [pipelines.ReportSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ReportSpec) to be required.
> * Changed `DestinationCatalog`, `DestinationSchema` and `SourceUrl` fields for [pipelines.ReportSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ReportSpec) to be required.
> * [Breaking] Changed `DestinationCatalog`, `DestinationSchema` and `SourceSchema` fields for [pipelines.SchemaSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec) to be required.
> * Changed `DestinationCatalog`, `DestinationSchema` and `SourceSchema` fields for [pipelines.SchemaSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec) to be required.
> * Changed `DestinationCatalog`, `DestinationSchema` and `SourceTable` fields for [pipelines.TableSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec) to be required.
> * [Breaking] Changed `DestinationCatalog`, `DestinationSchema` and `SourceTable` fields for [pipelines.TableSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec) to be required.
> * [Breaking] Changed pagination for [AlertsV2API.ListAlerts](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#AlertsV2API.ListAlerts).
> * [Breaking] Changed waiter for [GenieAPI.CreateMessage](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GenieAPI.CreateMessage).
>
> ## Release v0.66.0
>
> ### Bug Fixes
>
> * Tolerate trailing slashes in hostnames in `Config`.
>
> ### API Changes
>
> * Added [w.AlertsV2](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#AlertsV2API) workspace-level service.
> * Added `UpdateNccAzurePrivateEndpointRulePublic` method for [a.NetworkConnectivity](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NetworkConnectivityAPI) account-level service.
> * Added `CreatedAt`, `CreatedBy` and `MetastoreId` fields for [catalog.SetArtifactAllowlist](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#SetArtifactAllowlist).
> * [Breaking] Added `NetworkConnectivityConfig` field for [settings.CreateNetworkConnectivityConfigRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreateNetworkConnectivityConfigRequest).
> * [Breaking] Added `PrivateEndpointRule` field for [settings.CreatePrivateEndpointRuleRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreatePrivateEndpointRuleRequest).
> * Added `DomainNames` field for [settings.NccAzurePrivateEndpointRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRule).
> * Added `AutoResolveDisplayName` field for [sql.CreateAlertRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#CreateAlertRequest).
> * Added `AutoResolveDisplayName` field for [sql.CreateQueryRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#CreateQueryRequest).
> * Added `CreateCleanRoom`, `ExecuteCleanRoomTask` and `ModifyCleanRoom` enum values for [catalog.Privilege](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#Privilege).
> * Added `DnsResolutionError` and `GcpDeniedByOrgPolicy` enum values for [compute.TerminationReasonCode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#TerminationReasonCode).
> * Added `Expired` enum value for [settings.NccAzurePrivateEndpointRuleConnectionState](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRuleConnectionState).
> * [Breaking] Changed `CreateNetworkConnectivityConfiguration` and `CreatePrivateEndpointRule` methods for [a.NetworkConnectivity](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NetworkConnectivityAPI) account-level service with new required argument order.
> * [Breaking] Changed `WorkloadSize` field for [serving.ServedModelInput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedModelInput) to type `string`.
> * [Breaking] Changed `GroupId` field for [settings.NccAzurePrivateEndpointRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRule) to type `string`.
> * [Breaking] Changed `TargetServices` field for [settings.NccAzureServiceEndpointRule](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzureServiceEndpointRule) to type [settings.EgressResourceTypeList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#EgressResourceTypeList).
> * [Breaking] Removed `Name` and `Region` fields for [settings.CreateNetworkConnectivityConfigRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreateNetworkConnectivityConfigRequest).
> * [Breaking] Removed `GroupId` and `ResourceId` fields for [settings.CreatePrivateEndpointRuleRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#CreatePrivateEndpointRuleRequest).
> * [Breaking] Removed `Large`, `Medium` and `Small` enum values for [serving.ServedModelInputWorkloadSize](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedModelInputWorkloadSize).
> * [Breaking] Removed `Blob`, `Dfs`, `MysqlServer` and `SqlServer` enum values for [settings.NccAzurePrivateEndpointRuleGroupId](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#NccAzurePrivateEndpointRuleGroupId).
>
> <!-- raw HTML omitted -->

... (truncated)

</details>
<details>
<summary>Commits</summary>

* [`13af087`](https://github.com/databricks/databricks-sdk-go/commit/13af0878b6d34ffa07009b02050b124d3217f4d1) [Release] Release v0.67.0
* [`c7796ce`](https://github.com/databricks/databricks-sdk-go/commit/c7796ceca71d5fc91ff6ead4066fe43055619e89) Bump API Specification to 2 May 2025 ([#1214](https://redirect.github.com/databricks/databricks-sdk-go/issues/1214))
* [`4e77514`](https://github.com/databricks/databricks-sdk-go/commit/4e775146dc0cb65a9e5621040ba38e0ffd91143c) [Release] Release v0.66.0
* [`cedae03`](https://github.com/databricks/databricks-sdk-go/commit/cedae03775b56d60b96f5d4415323ae2a2c340db) Bump API specification to 30 Apr 2025 ([#1212](https://redirect.github.com/databricks/databricks-sdk-go/issues/1212))
* [`eab964f`](https://github.com/databricks/databricks-sdk-go/commit/eab964f3172844ba6280b2eb57b5562a16bf16fd) Tolerate trailing slash ([#1211](https://redirect.github.com/databricks/databricks-sdk-go/issues/1211))
* See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.65.0...v0.67.0)

</details>
<br />
<details>
<summary>Most Recent Ignore Conditions Applied to This Pull Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |

</details>

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.65.0&new-version=0.67.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
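The v0.66.0 bug fix "Tolerate trailing slashes in hostnames in `Config`" means a workspace host configured with or without a trailing `/` now resolves to the same API endpoint. A minimal sketch of that normalization behavior (an illustration only, not the SDK's actual implementation; the `normalizeHost` helper is hypothetical):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// normalizeHost trims trailing slashes from a configured workspace host so
// that "https://host/" and "https://host" produce identical request URLs.
func normalizeHost(host string) (string, error) {
	host = strings.TrimRight(strings.TrimSpace(host), "/")
	u, err := url.Parse(host)
	if err != nil {
		return "", err
	}
	return u.String(), nil
}

func main() {
	h, _ := normalizeHost("https://adb-1234567890123456.7.azuredatabricks.net/")
	fmt.Println(h) // https://adb-1234567890123456.7.azuredatabricks.net
}
```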
1 parent 132181c commit 92c60e3

29 files changed

Lines changed: 680 additions & 153 deletions


.codegen.json

Lines changed: 7 additions & 5 deletions
```diff
@@ -10,17 +10,19 @@
     ".codegen/cmds-account.go.tmpl": "cmd/account/cmd.go"
   },
   "version": {
-    "experimental/python/README.md": "version $VERSION or above",
-    "experimental/python/databricks/bundles/version.py": "__version__ = \"$VERSION\"",
-    "experimental/python/pyproject.toml": "version = \"$VERSION\"",
-    "experimental/python/uv.lock": "name = \"databricks-bundles\"\nversion = \"$VERSION\"",
-    "libs/template/templates/experimental-jobs-as-code/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}"
+    "experimental/python/README.md": "version $VERSION or above",
+    "experimental/python/databricks/bundles/version.py": "__version__ = \"$VERSION\"",
+    "experimental/python/pyproject.toml": "version = \"$VERSION\"",
+    "experimental/python/uv.lock": "name = \"databricks-bundles\"\nversion = \"$VERSION\"",
+    "libs/template/templates/experimental-jobs-as-code/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}"
   },
   "toolchain": {
     "required": [
       "go"
     ],
     "post_generate": [
+      "[ ! -f tagging.py ] || mv tagging.py internal/genkit/tagging.py",
+      "rm .github/workflows/next-changelog.yml",
       "go test -timeout 240s -run TestConsistentDatabricksSdkVersion github.com/databricks/cli/internal/build",
       "make schema",
       "echo 'bundle/internal/tf/schema/\\*.go linguist-generated=true' >> ./.gitattributes",
```

.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion
```diff
@@ -1 +1 @@
-06a18b97d7996d6cd8dd88bfdb0f2c2792739e46
+d4c86c045ee9d0410a41ef07e8ae708673b95fa1
```

.codegen/service.go.tmpl

Lines changed: 1 addition & 1 deletion
```diff
@@ -128,7 +128,7 @@ func new{{.PascalName}}() *cobra.Command {

 	var {{.CamelName}}Req {{.Service.Package.Name}}.{{.Request.PascalName}}
 	{{- if .RequestBodyField }}
-	{{.CamelName}}Req.{{.RequestBodyField.PascalName}} = &{{.Service.Package.Name}}.{{.RequestBodyField.Entity.PascalName}}{}
+	{{.CamelName}}Req.{{.RequestBodyField.PascalName}} = {{ if .RequestBodyField.IsOptionalObject }}&{{end}}{{.Service.Package.Name}}.{{.RequestBodyField.Entity.PascalName}}{}
 	{{- end }}
 	{{- if $canUseJson}}
 	var {{.CamelName}}Json flags.JsonFlag
```
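The template change above makes the generated assignment take the struct's address only when the request-body field is an optional object; required body fields now get a plain value assignment. A standalone sketch of the same conditional, using a hypothetical `field` struct in place of the CLI's real codegen model:

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

// field is a simplified stand-in for the data the codegen template walks;
// only the attributes this snippet needs are included.
type field struct {
	CamelName        string
	PascalName       string
	Package          string
	Entity           string
	IsOptionalObject bool
}

// The same conditional "&" emission as the updated service.go.tmpl line.
const tmpl = `{{.CamelName}}Req.{{.PascalName}} = {{ if .IsOptionalObject }}&{{end}}{{.Package}}.{{.Entity}}{}`

func render(f field) string {
	var b strings.Builder
	// template.Must panics on a parse error, acceptable for a fixed template.
	_ = template.Must(template.New("assign").Parse(tmpl)).Execute(&b, f)
	return b.String()
}

func main() {
	// Optional request body: generated code takes the struct's address.
	fmt.Println(render(field{CamelName: "deploy", PascalName: "AppDeployment", Package: "apps", Entity: "AppDeployment", IsOptionalObject: true}))
	// Required request body: generated code assigns the value directly.
	fmt.Println(render(field{CamelName: "deploy", PascalName: "AppDeployment", Package: "apps", Entity: "AppDeployment", IsOptionalObject: false}))
}
```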

.gitattributes

Lines changed: 1 addition & 0 deletions
```diff
@@ -37,6 +37,7 @@ cmd/workspace/access-control/access-control.go linguist-generated=true
 cmd/workspace/aibi-dashboard-embedding-access-policy/aibi-dashboard-embedding-access-policy.go linguist-generated=true
 cmd/workspace/aibi-dashboard-embedding-approved-domains/aibi-dashboard-embedding-approved-domains.go linguist-generated=true
 cmd/workspace/alerts-legacy/alerts-legacy.go linguist-generated=true
+cmd/workspace/alerts-v2/alerts-v2.go linguist-generated=true
 cmd/workspace/alerts/alerts.go linguist-generated=true
 cmd/workspace/apps/apps.go linguist-generated=true
 cmd/workspace/artifact-allowlists/artifact-allowlists.go linguist-generated=true
```

.github/workflows/tagging.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -48,4 +48,4 @@ jobs:
           GITHUB_TOKEN: ${{ steps.generate-token.outputs.token }}
           GITHUB_REPOSITORY: ${{ github.repository }}
         run: |
-          python internal/genkit/tagging.py
+          python tagging.py
```

bundle/internal/schema/annotations_openapi.yml

Lines changed: 35 additions & 33 deletions
```diff
@@ -349,12 +349,10 @@ github.com/databricks/cli/bundle/config/resources.Job:
       Job-level parameter definitions
   "performance_target":
     "description": |-
-      The performance mode on a serverless job. The performance target determines the level of compute performance or cost-efficiency for the run.
+      The performance mode on a serverless job. This field determines the level of compute performance or cost-efficiency for the run.

       * `STANDARD`: Enables cost-efficient execution of serverless workloads.
       * `PERFORMANCE_OPTIMIZED`: Prioritizes fast startup and execution times through rapid scaling and optimized cluster performance.
-    "x-databricks-preview": |-
-      PRIVATE
   "queue":
     "description": |-
       The queue settings of the job.
@@ -1410,7 +1408,8 @@ github.com/databricks/databricks-sdk-go/service/compute.EbsVolumeType:
 github.com/databricks/databricks-sdk-go/service/compute.Environment:
   "_":
     "description": |-
-      The environment entity used to preserve serverless environment side panel and jobs' environment for non-notebook task.
+      The environment entity used to preserve serverless environment side panel, jobs' environment for non-notebook task, and DLT's environment for classic and serverless pipelines.
+      (Note: DLT uses a copied version of the Environment proto below, at //spark/pipelines/api/protos/copied/libraries-environments-copy.proto)
       In this minimal environment spec, only pip dependencies are supported.
   "client":
     "description": |-
@@ -1981,7 +1980,8 @@ github.com/databricks/databricks-sdk-go/service/jobs.JobEnvironment:
       The key of an environment. It has to be unique within a job.
   "spec":
     "description": |-
-      The environment entity used to preserve serverless environment side panel and jobs' environment for non-notebook task.
+      The environment entity used to preserve serverless environment side panel, jobs' environment for non-notebook task, and DLT's environment for classic and serverless pipelines.
+      (Note: DLT uses a copied version of the Environment proto below, at //spark/pipelines/api/protos/copied/libraries-environments-copy.proto)
       In this minimal environment spec, only pip dependencies are supported.
 github.com/databricks/databricks-sdk-go/service/jobs.JobNotificationSettings:
   "no_alert_for_canceled_runs":
@@ -2583,8 +2583,6 @@ github.com/databricks/databricks-sdk-go/service/jobs.Task:
   "power_bi_task":
     "description": |-
       The task triggers a Power BI semantic model update when the `power_bi_task` field is present.
-    "x-databricks-preview": |-
-      PRIVATE
   "python_wheel_task":
     "description": |-
       The task runs a Python wheel when the `python_wheel_task` field is present.
@@ -2819,7 +2817,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek:
       SUNDAY
 github.com/databricks/databricks-sdk-go/service/pipelines.DeploymentKind:
   "_":
-    "description": |
+    "description": |-
       The deployment method that manages the pipeline:
       - BUNDLE: The pipeline is managed by a Databricks Asset Bundle.
     "enum":
@@ -2841,7 +2839,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.EventLogSpec:
 github.com/databricks/databricks-sdk-go/service/pipelines.FileLibrary:
   "path":
     "description": |-
-      The absolute path of the file.
+      The absolute path of the source code.
 github.com/databricks/databricks-sdk-go/service/pipelines.Filters:
   "exclude":
     "description": |-
@@ -2872,7 +2870,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.IngestionGatewayPipeli
     "description": |-
       Required, Immutable. The name of the catalog for the gateway pipeline's storage location.
   "gateway_storage_name":
-    "description": |
+    "description": |-
       Optional. The Unity Catalog-compatible name for the gateway storage location.
       This is the destination to use for the data that is extracted by the gateway.
       Delta Live Tables system will automatically create the storage location under the catalog and schema.
@@ -2896,10 +2894,10 @@ github.com/databricks/databricks-sdk-go/service/pipelines.ManualTrigger: {}
 github.com/databricks/databricks-sdk-go/service/pipelines.NotebookLibrary:
   "path":
     "description": |-
-      The absolute path of the notebook.
+      The absolute path of the source code.
 github.com/databricks/databricks-sdk-go/service/pipelines.Notifications:
   "alerts":
-    "description": |
+    "description": |-
       A list of alerts that trigger the sending of notifications to the configured
       destinations. The supported alerts are:

@@ -2908,7 +2906,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.Notifications:
       * `on-update-fatal-failure`: A pipeline update fails with a non-retryable (fatal) error.
       * `on-flow-failure`: A single data flow fails.
   "email_recipients":
-    "description": |
+    "description": |-
       A list of email addresses notified when a configured alert is triggered.
 github.com/databricks/databricks-sdk-go/service/pipelines.PipelineCluster:
   "apply_policy_default_values":
@@ -2927,7 +2925,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineCluster:
       Attributes related to clusters running on Microsoft Azure.
       If not specified at cluster creation, a set of default values will be used.
   "cluster_log_conf":
-    "description": |
+    "description": |-
       The configuration for delivering spark logs to a long-term storage destination.
       Only dbfs destinations are supported. Only one destination can be specified
       for one cluster. If the conf is given, the logs will be delivered to the destination every
@@ -2968,7 +2966,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineCluster:
     "description": |-
       A label for the cluster specification, either `default` to configure the default cluster, or `maintenance` to configure the maintenance cluster. This field is optional. The default value is `default`.
   "node_type_id":
-    "description": |
+    "description": |-
       This field encodes, through a single value, the resources available to each of
       the Spark nodes in this cluster. For example, the Spark nodes can be provisioned
       and optimized for memory or compute intensive workloads. A list of available node
@@ -2987,7 +2985,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineCluster:
     "description": |-
       The ID of the cluster policy used to create the cluster if applicable.
   "spark_conf":
-    "description": |
+    "description": |-
       An object containing a set of optional, user-specified Spark configuration key-value pairs.
       See :method:clusters/create for more details.
   "spark_env_vars":
@@ -3017,15 +3015,15 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineClusterAutosca
       The minimum number of workers the cluster can scale down to when underutilized.
       It is also the initial number of workers the cluster will have after creation.
   "mode":
-    "description": |
+    "description": |-
       Databricks Enhanced Autoscaling optimizes cluster utilization by automatically
       allocating cluster resources based on workload volume, with minimal impact to
       the data processing latency of your pipelines. Enhanced Autoscaling is available
       for `updates` clusters only. The legacy autoscaling feature is used for `maintenance`
       clusters.
 github.com/databricks/databricks-sdk-go/service/pipelines.PipelineClusterAutoscaleMode:
   "_":
-    "description": |
+    "description": |-
       Databricks Enhanced Autoscaling optimizes cluster utilization by automatically
       allocating cluster resources based on workload volume, with minimal impact to
       the data processing latency of your pipelines. Enhanced Autoscaling is available
@@ -3045,20 +3043,20 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineDeployment:
       The path to the file containing metadata about the deployment.
 github.com/databricks/databricks-sdk-go/service/pipelines.PipelineLibrary:
   "file":
-    "description": |
+    "description": |-
       The path to a file that defines a pipeline and is stored in the Databricks Repos.
   "jar":
-    "description": |
+    "description": |-
       URI of the jar to be installed. Currently only DBFS is supported.
     "x-databricks-preview": |-
       PRIVATE
   "maven":
-    "description": |
+    "description": |-
       Specification of a maven library to be installed.
     "x-databricks-preview": |-
       PRIVATE
   "notebook":
-    "description": |
+    "description": |-
       The path to a notebook that defines a pipeline and is stored in the Databricks workspace.
   "whl":
     "description": |-
@@ -3148,6 +3146,19 @@ github.com/databricks/databricks-sdk-go/service/pipelines.TableSpec:
     "description": |-
       Configuration settings to control the ingestion of tables. These settings override the table_configuration defined in the IngestionPipelineDefinition object and the SchemaSpec.
 github.com/databricks/databricks-sdk-go/service/pipelines.TableSpecificConfig:
+  "exclude_columns":
+    "description": |-
+      A list of column names to be excluded for the ingestion.
+      When not specified, include_columns fully controls what columns to be ingested.
+      When specified, all other columns including future ones will be automatically included for ingestion.
+      This field in mutually exclusive with `include_columns`.
+  "include_columns":
+    "description": |-
+      A list of column names to be included for the ingestion.
+      When not specified, all columns except ones in exclude_columns will be included. Future
+      columns will be automatically included.
+      When specified, all other future columns will be automatically excluded from ingestion.
+      This field in mutually exclusive with `exclude_columns`.
   "primary_keys":
     "description": |-
       The primary key of the table used to apply changes.
@@ -3692,7 +3703,7 @@ github.com/databricks/databricks-sdk-go/service/serving.ServedEntityInput:
       Whether the compute resources for the served entity should scale down to zero.
   "workload_size":
     "description": |-
-      The workload size of the served entity. The workload size corresponds to a range of provisioned concurrency that the compute autoscales between. A single unit of provisioned concurrency can process one request at a time. Valid workload sizes are "Small" (4 - 4 provisioned concurrency), "Medium" (8 - 16 provisioned concurrency), and "Large" (16 - 64 provisioned concurrency). If scale-to-zero is enabled, the lower bound of the provisioned concurrency for each workload size is 0.
+      The workload size of the served entity. The workload size corresponds to a range of provisioned concurrency that the compute autoscales between. A single unit of provisioned concurrency can process one request at a time. Valid workload sizes are "Small" (4 - 4 provisioned concurrency), "Medium" (8 - 16 provisioned concurrency), and "Large" (16 - 64 provisioned concurrency). Additional custom workload sizes can also be used when available in the workspace. If scale-to-zero is enabled, the lower bound of the provisioned concurrency for each workload size is 0.
   "workload_type":
     "description": |-
       The workload type of the served entity. The workload type selects which type of compute to use in the endpoint. The default value for this parameter is "CPU". For deep learning workloads, GPU acceleration is available by selecting workload types like GPU_SMALL and others. See the available [GPU types](https://docs.databricks.com/en/machine-learning/model-serving/create-manage-serving-endpoints.html#gpu-workload-types).
@@ -3719,19 +3730,10 @@ github.com/databricks/databricks-sdk-go/service/serving.ServedModelInput:
       Whether the compute resources for the served entity should scale down to zero.
   "workload_size":
     "description": |-
-      The workload size of the served entity. The workload size corresponds to a range of provisioned concurrency that the compute autoscales between. A single unit of provisioned concurrency can process one request at a time. Valid workload sizes are "Small" (4 - 4 provisioned concurrency), "Medium" (8 - 16 provisioned concurrency), and "Large" (16 - 64 provisioned concurrency). If scale-to-zero is enabled, the lower bound of the provisioned concurrency for each workload size is 0.
+      The workload size of the served entity. The workload size corresponds to a range of provisioned concurrency that the compute autoscales between. A single unit of provisioned concurrency can process one request at a time. Valid workload sizes are "Small" (4 - 4 provisioned concurrency), "Medium" (8 - 16 provisioned concurrency), and "Large" (16 - 64 provisioned concurrency). Additional custom workload sizes can also be used when available in the workspace. If scale-to-zero is enabled, the lower bound of the provisioned concurrency for each workload size is 0.
   "workload_type":
     "description": |-
       The workload type of the served entity. The workload type selects which type of compute to use in the endpoint. The default value for this parameter is "CPU". For deep learning workloads, GPU acceleration is available by selecting workload types like GPU_SMALL and others. See the available [GPU types](https://docs.databricks.com/en/machine-learning/model-serving/create-manage-serving-endpoints.html#gpu-workload-types).
-github.com/databricks/databricks-sdk-go/service/serving.ServedModelInputWorkloadSize:
-  "_":
-    "enum":
-    - |-
-      Small
-    - |-
-      Medium
-    - |-
-      Large
 github.com/databricks/databricks-sdk-go/service/serving.ServedModelInputWorkloadType:
   "_":
     "description": |-
```

bundle/run/app.go

Lines changed: 2 additions & 2 deletions
```diff
@@ -155,7 +155,7 @@ func (a *appRunner) deploy(ctx context.Context) error {
 	sourceCodePath := app.SourceCodePath
 	wait, err := w.Apps.Deploy(ctx, apps.CreateAppDeploymentRequest{
 		AppName: app.Name,
-		AppDeployment: &apps.AppDeployment{
+		AppDeployment: apps.AppDeployment{
 			Mode:           apps.AppDeploymentModeSnapshot,
 			SourceCodePath: sourceCodePath,
 		},
@@ -176,7 +176,7 @@ func (a *appRunner) deploy(ctx context.Context) error {
 	// Now we can try to deploy the app again
 	wait, err = w.Apps.Deploy(ctx, apps.CreateAppDeploymentRequest{
 		AppName: app.Name,
-		AppDeployment: &apps.AppDeployment{
+		AppDeployment: apps.AppDeployment{
 			Mode:           apps.AppDeploymentModeSnapshot,
 			SourceCodePath: sourceCodePath,
 		},
```
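The `&` removals above follow from the SDK update changing `AppDeployment` from a pointer field to a value field in `CreateAppDeploymentRequest`, so call sites drop the address-of operator. A minimal stand-in illustrating that call-site migration (local types only, not the real SDK structs):

```go
package main

import "fmt"

// AppDeployment is a simplified stand-in for apps.AppDeployment.
type AppDeployment struct {
	Mode           string
	SourceCodePath string
}

// CreateAppDeploymentRequest previously embedded *AppDeployment; after the
// SDK bump it embeds the value, making the field effectively required.
type CreateAppDeploymentRequest struct {
	AppName       string
	AppDeployment AppDeployment
}

func main() {
	// Call sites now construct the nested struct without a leading &.
	req := CreateAppDeploymentRequest{
		AppName: "my_app",
		AppDeployment: AppDeployment{
			Mode:           "SNAPSHOT",
			SourceCodePath: "/Workspace/Users/foo@bar.com/files/my_app",
		},
	}
	fmt.Println(req.AppDeployment.Mode) // SNAPSHOT
}
```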

bundle/run/app_test.go

Lines changed: 3 additions & 3 deletions
```diff
@@ -110,7 +110,7 @@ func setupTestApp(t *testing.T, initialAppState apps.ApplicationState, initialCo
 	}
 	appApi.EXPECT().Deploy(mock.Anything, apps.CreateAppDeploymentRequest{
 		AppName: "my_app",
-		AppDeployment: &apps.AppDeployment{
+		AppDeployment: apps.AppDeployment{
 			Mode:           apps.AppDeploymentModeSnapshot,
 			SourceCodePath: "/Workspace/Users/foo@bar.com/files/my_app",
 		},
@@ -213,7 +213,7 @@ func TestAppDeployWithDeploymentInProgress(t *testing.T) {
 	// First deployment fails
 	appApi.EXPECT().Deploy(mock.Anything, apps.CreateAppDeploymentRequest{
 		AppName: "my_app",
-		AppDeployment: &apps.AppDeployment{
+		AppDeployment: apps.AppDeployment{
 			Mode:           apps.AppDeploymentModeSnapshot,
 			SourceCodePath: "/Workspace/Users/foo@bar.com/files/my_app",
 		},
@@ -237,7 +237,7 @@ func TestAppDeployWithDeploymentInProgress(t *testing.T) {
 	// Second one should succeeed
 	appApi.EXPECT().Deploy(mock.Anything, apps.CreateAppDeploymentRequest{
 		AppName: "my_app",
-		AppDeployment: &apps.AppDeployment{
+		AppDeployment: apps.AppDeployment{
 			Mode:           apps.AppDeploymentModeSnapshot,
 			SourceCodePath: "/Workspace/Users/foo@bar.com/files/my_app",
 		},
```
