
Commit 547b204

[Release] Release v0.105.0
## Release v0.105.0

### New Features and Improvements
* Added automatic detection of AI coding agents (Amp, Antigravity, Augment, Claude Code, Cline, Codex, Copilot CLI, Copilot VS Code, Cursor, Gemini CLI, Goose, Kiro, OpenClaw, OpenCode, Windsurf) in the user-agent string. The SDK now appends `agent/<name>` to HTTP request headers when running inside a known AI agent environment. Also honors the `AGENT=<name>` standard: when `AGENT` is set to a known product name the SDK reports that product, and when set to an unrecognized non-empty value the SDK reports `agent/unknown`. Environment variables set to the empty string (e.g. `CLAUDECODE=""`) now count as "set" for presence-only matchers, matching `databricks-sdk-go` semantics; previously they were treated as unset. Explicit agent env vars (e.g. `CLAUDECODE`, `GOOSE_TERMINAL`) always take precedence over the generic `AGENT=<name>` signal. When multiple agent env vars are present (e.g. a Cursor CLI subagent invoked from Claude Code), the user-agent reports `agent/multiple`.

### Breaking Changes
* Remove the `experimentalIsUnifiedHost` field (and the `DATABRICKS_EXPERIMENTAL_IS_UNIFIED_HOST` environment variable) from `DatabricksConfig`. The flag was unused — `getHostType()` never read it — so unified host detection is now determined purely by URL pattern and the automatic `/.well-known/databricks-config` metadata resolution. Callers that set the flag should remove those calls. Mirrors [databricks/databricks-sdk-go#1641](https://github.com/databricks/databricks-sdk-go/pull/1641) and [databricks/databricks-sdk-py#1358](https://github.com/databricks/databricks-sdk-py/pull/1358).

### Bug Fixes
* Add `X-Databricks-Org-Id` header to `SharesExtImpl.list()` for SPOG host compatibility. Without this header, calls to the hand-written extension were rejected by the SPOG proxy with `Unable to load OAuth Config (400 UNKNOWN)`. Mirrors [databricks/databricks-sdk-go#1635](https://github.com/databricks/databricks-sdk-go/pull/1635).

### Internal Changes
* Added parametrized unit tests covering PAT, Basic, OAuth M2M, GitHub OIDC, Env OIDC, File OIDC, Azure Client Secret, and Azure GitHub OIDC against six host profiles (LW, NW, LA, NA, SPOGW, SPOGA) across AWS, Azure, and GCP (138 subtests total). Mirrors databricks-sdk-go PR #1627 and databricks-sdk-py PR #1357.
* Migrated internal SDK classes to the logging abstraction. The SDK now supports SLF4J, `java.util.logging`, or a custom backend via `LoggerFactory.setDefault()`.

### API Changes
* Add `com.databricks.sdk.service.supervisoragents` package.
* Add `workspaceClient.secretsUc()` service.
* Add `workspaceClient.supervisorAgents()` service.
* Add `update()` method for `workspaceClient.tokens()` service.
* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieSpace`.
* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieUpdateSpaceRequest`.
* Add `branchId` field for `com.databricks.sdk.service.postgres.BranchStatus`.
* Add `catalogId` field for `com.databricks.sdk.service.postgres.CatalogCatalogStatus`.
* Add `databaseId` field for `com.databricks.sdk.service.postgres.DatabaseDatabaseStatus`.
* Add `endpointId` field for `com.databricks.sdk.service.postgres.EndpointStatus`.
* Add `projectId` field for `com.databricks.sdk.service.postgres.ProjectStatus`.
* Add `roleId` field for `com.databricks.sdk.service.postgres.RoleRoleStatus`.
* Add `project` field for `com.databricks.sdk.service.postgres.SyncedTableSyncedTableStatus`.
* Add `manual` field for `com.databricks.sdk.service.provisioning.CreateGcpKeyInfo`.
* Add `manual` field for `com.databricks.sdk.service.provisioning.GcpKeyInfo`.
* Add `appsRuntime` and `lakebaseRuntime` fields for `com.databricks.sdk.service.settings.CustomerFacingIngressNetworkPolicyRequestDestination`.
* Add `blockedInternetDestinations` field for `com.databricks.sdk.service.settings.EgressNetworkPolicyNetworkAccessPolicy`.
* Add `columnsToSync` field for `com.databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Add `BREAKING_CHANGE` enum value for `com.databricks.sdk.service.jobs.TerminationCodeCode`.
* [Breaking] Change `updateCatalogConfig()` method for `workspaceClient.dataClassification()` service. Method path has changed.
* [Breaking] Change `updateDefaultWorkspaceBaseEnvironment()` method for `workspaceClient.environments()` service. Method path has changed.
* [Breaking] Change `updateKnowledgeAssistant()` method for `workspaceClient.knowledgeAssistants()` service. Method path has changed.
* [Breaking] Change `updateBranch()`, `updateDatabase()`, `updateEndpoint()`, `updateProject()` and `updateRole()` methods for `workspaceClient.postgres()` service. Method path has changed.
* [Breaking] Change `updateDefaultWarehouseOverride()` method for `workspaceClient.warehouses()` service. Method path has changed.
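The agent-detection precedence rules described in the release notes (explicit per-agent environment variables beat the generic `AGENT` signal, empty-string values still count as "set", and multiple signals collapse to `agent/multiple`) can be sketched roughly as follows. This is illustrative only — the class, method, and table names are hypothetical, not the SDK's actual implementation, and only two explicit variables are shown:

```java
import java.util.List;
import java.util.Map;

public class AgentDetectionSketch {
  // Explicit per-agent env vars (hypothetical subset). Presence-only:
  // an empty-string value still counts as "set".
  private static final Map<String, String> EXPLICIT_VARS =
      Map.of("CLAUDECODE", "claude-code", "GOOSE_TERMINAL", "goose");

  // Product names recognized in the generic AGENT variable (hypothetical subset).
  private static final List<String> KNOWN_AGENTS = List.of("claude-code", "goose", "cursor");

  /** Returns the user-agent suffix, or null when no agent is detected. */
  static String detectAgent(Map<String, String> env) {
    // 1. Explicit agent env vars always take precedence over AGENT=<name>.
    List<String> explicit = EXPLICIT_VARS.entrySet().stream()
        .filter(e -> env.containsKey(e.getKey())) // presence-only check
        .map(Map.Entry::getValue)
        .toList();
    if (explicit.size() > 1) return "agent/multiple"; // e.g. subagent scenarios
    if (explicit.size() == 1) return "agent/" + explicit.get(0);
    // 2. Fall back to the generic AGENT=<name> signal.
    String agent = env.get("AGENT");
    if (agent == null || agent.isEmpty()) return null;
    return KNOWN_AGENTS.contains(agent) ? "agent/" + agent : "agent/unknown";
  }
}
```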
1 parent 0c2318d commit 547b204

11 files changed

Lines changed: 52 additions & 39 deletions


.release_metadata.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,3 +1,3 @@
 {
-  "timestamp": "2026-04-20 08:34:28+0000"
+  "timestamp": "2026-04-23 09:00:48+0000"
 }
```

CHANGELOG.md

Lines changed: 42 additions & 0 deletions
```diff
@@ -1,5 +1,47 @@
 # Version changelog
 
+## Release v0.105.0 (2026-04-23)
+
+### New Features and Improvements
+* Added automatic detection of AI coding agents (Amp, Antigravity, Augment, Claude Code, Cline, Codex, Copilot CLI, Copilot VS Code, Cursor, Gemini CLI, Goose, Kiro, OpenClaw, OpenCode, Windsurf) in the user-agent string. The SDK now appends `agent/<name>` to HTTP request headers when running inside a known AI agent environment. Also honors the `AGENT=<name>` standard: when `AGENT` is set to a known product name the SDK reports that product, and when set to an unrecognized non-empty value the SDK reports `agent/unknown`. Environment variables set to the empty string (e.g. `CLAUDECODE=""`) now count as "set" for presence-only matchers, matching `databricks-sdk-go` semantics; previously they were treated as unset. Explicit agent env vars (e.g. `CLAUDECODE`, `GOOSE_TERMINAL`) always take precedence over the generic `AGENT=<name>` signal. When multiple agent env vars are present (e.g. a Cursor CLI subagent invoked from Claude Code), the user-agent reports `agent/multiple`.
+
+### Breaking Changes
+* Remove the `experimentalIsUnifiedHost` field (and the `DATABRICKS_EXPERIMENTAL_IS_UNIFIED_HOST` environment variable) from `DatabricksConfig`. The flag was unused — `getHostType()` never read it — so unified host detection is now determined purely by URL pattern and the automatic `/.well-known/databricks-config` metadata resolution. Callers that set the flag should remove those calls. Mirrors [databricks/databricks-sdk-go#1641](https://github.com/databricks/databricks-sdk-go/pull/1641) and [databricks/databricks-sdk-py#1358](https://github.com/databricks/databricks-sdk-py/pull/1358).
+
+### Bug Fixes
+* Add `X-Databricks-Org-Id` header to `SharesExtImpl.list()` for SPOG host compatibility. Without this header, calls to the hand-written extension were rejected by the SPOG proxy with `Unable to load OAuth Config (400 UNKNOWN)`. Mirrors [databricks/databricks-sdk-go#1635](https://github.com/databricks/databricks-sdk-go/pull/1635).
+
+### Internal Changes
+* Added parametrized unit tests covering PAT, Basic, OAuth M2M, GitHub OIDC, Env OIDC, File OIDC, Azure Client Secret, and Azure GitHub OIDC against six host profiles (LW, NW, LA, NA, SPOGW, SPOGA) across AWS, Azure, and GCP (138 subtests total). Mirrors databricks-sdk-go PR #1627 and databricks-sdk-py PR #1357.
+* Migrated internal SDK classes to the logging abstraction. The SDK now supports SLF4J, `java.util.logging`, or a custom backend via `LoggerFactory.setDefault()`.
+
+### API Changes
+* Add `com.databricks.sdk.service.supervisoragents` package.
+* Add `workspaceClient.secretsUc()` service.
+* Add `workspaceClient.supervisorAgents()` service.
+* Add `update()` method for `workspaceClient.tokens()` service.
+* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieSpace`.
+* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieUpdateSpaceRequest`.
+* Add `branchId` field for `com.databricks.sdk.service.postgres.BranchStatus`.
+* Add `catalogId` field for `com.databricks.sdk.service.postgres.CatalogCatalogStatus`.
+* Add `databaseId` field for `com.databricks.sdk.service.postgres.DatabaseDatabaseStatus`.
+* Add `endpointId` field for `com.databricks.sdk.service.postgres.EndpointStatus`.
+* Add `projectId` field for `com.databricks.sdk.service.postgres.ProjectStatus`.
+* Add `roleId` field for `com.databricks.sdk.service.postgres.RoleRoleStatus`.
+* Add `project` field for `com.databricks.sdk.service.postgres.SyncedTableSyncedTableStatus`.
+* Add `manual` field for `com.databricks.sdk.service.provisioning.CreateGcpKeyInfo`.
+* Add `manual` field for `com.databricks.sdk.service.provisioning.GcpKeyInfo`.
+* Add `appsRuntime` and `lakebaseRuntime` fields for `com.databricks.sdk.service.settings.CustomerFacingIngressNetworkPolicyRequestDestination`.
+* Add `blockedInternetDestinations` field for `com.databricks.sdk.service.settings.EgressNetworkPolicyNetworkAccessPolicy`.
+* Add `columnsToSync` field for `com.databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
+* Add `BREAKING_CHANGE` enum value for `com.databricks.sdk.service.jobs.TerminationCodeCode`.
+* [Breaking] Change `updateCatalogConfig()` method for `workspaceClient.dataClassification()` service. Method path has changed.
+* [Breaking] Change `updateDefaultWorkspaceBaseEnvironment()` method for `workspaceClient.environments()` service. Method path has changed.
+* [Breaking] Change `updateKnowledgeAssistant()` method for `workspaceClient.knowledgeAssistants()` service. Method path has changed.
+* [Breaking] Change `updateBranch()`, `updateDatabase()`, `updateEndpoint()`, `updateProject()` and `updateRole()` methods for `workspaceClient.postgres()` service. Method path has changed.
+* [Breaking] Change `updateDefaultWarehouseOverride()` method for `workspaceClient.warehouses()` service. Method path has changed.
+
+
 ## Release v0.104.0 (2026-04-20)
 
 ### New Features and Improvements
```
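The logging migration noted in the changelog routes internal SDK logging through a swappable backend selected via `LoggerFactory.setDefault()`. Below is a minimal, self-contained sketch of that pluggable-factory pattern, using `java.util.logging` as the default backend; the class and its shape are illustrative only, not the SDK's actual `LoggerFactory`:

```java
import java.util.function.Function;
import java.util.logging.Logger;

// Sketch of a pluggable logger factory: a static default backend that callers
// can replace. A real implementation would return its own Logger abstraction
// so SLF4J and java.util.logging can both plug in behind it.
public class PluggableLoggerFactory {
  // Default backend: delegate straight to java.util.logging.
  private static Function<String, Logger> backend = Logger::getLogger;

  /** Swap in a custom backend (analogous in spirit to LoggerFactory.setDefault()). */
  public static void setDefault(Function<String, Logger> custom) {
    backend = custom;
  }

  public static Logger getLogger(String name) {
    return backend.apply(name);
  }
}
```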

NEXT_CHANGELOG.md

Lines changed: 1 addition & 30 deletions
```diff
@@ -1,46 +1,17 @@
 # NEXT CHANGELOG
 
-## Release v0.105.0
+## Release v0.106.0
 
 ### New Features and Improvements
-* Added automatic detection of AI coding agents (Amp, Antigravity, Augment, Claude Code, Cline, Codex, Copilot CLI, Copilot VS Code, Cursor, Gemini CLI, Goose, Kiro, OpenClaw, OpenCode, Windsurf) in the user-agent string. The SDK now appends `agent/<name>` to HTTP request headers when running inside a known AI agent environment. Also honors the `AGENT=<name>` standard: when `AGENT` is set to a known product name the SDK reports that product, and when set to an unrecognized non-empty value the SDK reports `agent/unknown`. Environment variables set to the empty string (e.g. `CLAUDECODE=""`) now count as "set" for presence-only matchers, matching `databricks-sdk-go` semantics; previously they were treated as unset. Explicit agent env vars (e.g. `CLAUDECODE`, `GOOSE_TERMINAL`) always take precedence over the generic `AGENT=<name>` signal. When multiple agent env vars are present (e.g. a Cursor CLI subagent invoked from Claude Code), the user-agent reports `agent/multiple`.
 
 ### Breaking Changes
-* Remove the `experimentalIsUnifiedHost` field (and the `DATABRICKS_EXPERIMENTAL_IS_UNIFIED_HOST` environment variable) from `DatabricksConfig`. The flag was unused — `getHostType()` never read it — so unified host detection is now determined purely by URL pattern and the automatic `/.well-known/databricks-config` metadata resolution. Callers that set the flag should remove those calls. Mirrors [databricks/databricks-sdk-go#1641](https://github.com/databricks/databricks-sdk-go/pull/1641) and [databricks/databricks-sdk-py#1358](https://github.com/databricks/databricks-sdk-py/pull/1358).
 
 ### Bug Fixes
-* Add `X-Databricks-Org-Id` header to `SharesExtImpl.list()` for SPOG host compatibility. Without this header, calls to the hand-written extension were rejected by the SPOG proxy with `Unable to load OAuth Config (400 UNKNOWN)`. Mirrors [databricks/databricks-sdk-go#1635](https://github.com/databricks/databricks-sdk-go/pull/1635).
 
 ### Security Vulnerabilities
 
 ### Documentation
 
 ### Internal Changes
-* Added parametrized unit tests covering PAT, Basic, OAuth M2M, GitHub OIDC, Env OIDC, File OIDC, Azure Client Secret, and Azure GitHub OIDC against six host profiles (LW, NW, LA, NA, SPOGW, SPOGA) across AWS, Azure, and GCP (138 subtests total). Mirrors databricks-sdk-go PR #1627 and databricks-sdk-py PR #1357.
-* Migrated internal SDK classes to the logging abstraction. The SDK now supports SLF4J, `java.util.logging`, or a custom backend via `LoggerFactory.setDefault()`.
 
 ### API Changes
-* Add `com.databricks.sdk.service.supervisoragents` package.
-* Add `workspaceClient.secretsUc()` service.
-* Add `workspaceClient.supervisorAgents()` service.
-* Add `update()` method for `workspaceClient.tokens()` service.
-* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieSpace`.
-* Add `etag` field for `com.databricks.sdk.service.dashboards.GenieUpdateSpaceRequest`.
-* Add `branchId` field for `com.databricks.sdk.service.postgres.BranchStatus`.
-* Add `catalogId` field for `com.databricks.sdk.service.postgres.CatalogCatalogStatus`.
-* Add `databaseId` field for `com.databricks.sdk.service.postgres.DatabaseDatabaseStatus`.
-* Add `endpointId` field for `com.databricks.sdk.service.postgres.EndpointStatus`.
-* Add `projectId` field for `com.databricks.sdk.service.postgres.ProjectStatus`.
-* Add `roleId` field for `com.databricks.sdk.service.postgres.RoleRoleStatus`.
-* Add `project` field for `com.databricks.sdk.service.postgres.SyncedTableSyncedTableStatus`.
-* Add `manual` field for `com.databricks.sdk.service.provisioning.CreateGcpKeyInfo`.
-* Add `manual` field for `com.databricks.sdk.service.provisioning.GcpKeyInfo`.
-* Add `appsRuntime` and `lakebaseRuntime` fields for `com.databricks.sdk.service.settings.CustomerFacingIngressNetworkPolicyRequestDestination`.
-* Add `blockedInternetDestinations` field for `com.databricks.sdk.service.settings.EgressNetworkPolicyNetworkAccessPolicy`.
-* Add `columnsToSync` field for `com.databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
-* Add `BREAKING_CHANGE` enum value for `com.databricks.sdk.service.jobs.TerminationCodeCode`.
-* [Breaking] Change `updateCatalogConfig()` method for `workspaceClient.dataClassification()` service. Method path has changed.
-* [Breaking] Change `updateDefaultWorkspaceBaseEnvironment()` method for `workspaceClient.environments()` service. Method path has changed.
-* [Breaking] Change `updateKnowledgeAssistant()` method for `workspaceClient.knowledgeAssistants()` service. Method path has changed.
-* [Breaking] Change `updateBranch()`, `updateDatabase()`, `updateEndpoint()`, `updateProject()` and `updateRole()` methods for `workspaceClient.postgres()` service. Method path has changed.
-* [Breaking] Change `updateDefaultWarehouseOverride()` method for `workspaceClient.warehouses()` service. Method path has changed.
```

databricks-sdk-java/lockfile.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 {
   "artifactId": "databricks-sdk-java",
   "groupId": "com.databricks",
-  "version": "0.104.0",
+  "version": "0.105.0",
   "lockFileVersion": 1,
   "dependencies": [
     {
```

databricks-sdk-java/pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -5,7 +5,7 @@
   <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-sdk-parent</artifactId>
-    <version>0.104.0</version>
+    <version>0.105.0</version>
   </parent>
   <artifactId>databricks-sdk-java</artifactId>
   <name>Databricks SDK for Java</name>
```

databricks-sdk-java/src/main/java/com/databricks/sdk/core/UserAgent.java

Lines changed: 1 addition & 1 deletion
```diff
@@ -36,7 +36,7 @@ public String getValue() {
   // TODO: check if reading from
   // /META-INF/maven/com.databricks/databrics-sdk-java/pom.properties
   // or getClass().getPackage().getImplementationVersion() is enough.
-  private static final String version = "0.104.0";
+  private static final String version = "0.105.0";
 
   public static void withProduct(String product, String productVersion) {
     UserAgent.product = product;
```
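The TODO in this hunk mentions reading the version from the Maven `pom.properties` on the classpath instead of hard-coding it (which is why every release commit must touch this file). A minimal sketch of that approach, assuming the standard Maven resource layout — class and method names here are hypothetical, not the SDK's code:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class VersionSketch {
  /** Reads the SDK version from Maven's pom.properties, or returns a fallback. */
  static String readVersion(String fallback) {
    // Maven writes this file into the jar at build time.
    String path = "/META-INF/maven/com.databricks/databricks-sdk-java/pom.properties";
    try (InputStream in = VersionSketch.class.getResourceAsStream(path)) {
      if (in != null) {
        Properties props = new Properties();
        props.load(in);
        String v = props.getProperty("version");
        if (v != null) return v;
      }
    } catch (IOException ignored) {
      // fall through to the hard-coded fallback
    }
    return fallback;
  }
}
```

One caveat the TODO hints at: `pom.properties` only exists inside the packaged jar, so IDE and test runs need the hard-coded fallback (or `getImplementationVersion()` from the manifest).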

examples/docs/pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -24,7 +24,7 @@
     <dependency>
       <groupId>com.databricks</groupId>
       <artifactId>databricks-sdk-java</artifactId>
-      <version>0.104.0</version>
+      <version>0.105.0</version>
     </dependency>
   </dependencies>
 </project>
```

examples/spring-boot-oauth-u2m-demo/pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -40,7 +40,7 @@
     <dependency>
       <groupId>com.databricks</groupId>
       <artifactId>databricks-sdk-java</artifactId>
-      <version>0.104.0</version>
+      <version>0.105.0</version>
     </dependency>
   </dependencies>
 </project>
```

lockfile.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 {
   "artifactId": "databricks-sdk-parent",
   "groupId": "com.databricks",
-  "version": "0.104.0",
+  "version": "0.105.0",
   "lockFileVersion": 1,
   "dependencies": [],
   "mavenPlugins": [],
```

pom.xml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@
   <modelVersion>4.0.0</modelVersion>
   <groupId>com.databricks</groupId>
   <artifactId>databricks-sdk-parent</artifactId>
-  <version>0.104.0</version>
+  <version>0.105.0</version>
   <packaging>pom</packaging>
   <name>Databricks SDK for Java</name>
   <description>The Databricks SDK for Java includes functionality to accelerate development with Java for
```
