Latest releases from the Databricks ecosystem.
New versions from the official SDKs, CLI and Asset Bundles, Terraform provider, Unity Catalog, MLflow, Delta, dbt-databricks, and more. Summarized for scanning.
This week
3 releases
This release introduces new resources for managing disaster recovery failover groups, stable URLs, supervisor agents, and UC secrets. It also adds support for adopting pre-existing Postgres branch and endpoint resources.
MLflow 3.12.0rc0 introduces enhanced AI agent development features, including automatic tracing for more AI coding assistants and OpenClaw, along with new AI Gateway guardrails for safety checks. It also adds multimodal trace attachments for images, audio, and files, and a new mlflow.diffusers flavor for diffusion models.
This release introduces a new disaster recovery package and adds methods for managing knowledge assistant examples. It also includes breaking changes related to the `supervisoragents.Connection` and `supervisoragents.Tool` fields.
Last week
13 releases
Unity Catalog AI 0.4.0
DatabricksFunctionClient now supports an optional warehouse_id for function execution, enabling use in workspaces without serverless compute. Python 3.10+ is now required, and several bug fixes address Gemini toolkit, LangGraph, and OSS client function creation issues.
Unified host detection is now automatic, removing the `Experimental_IsUnifiedHost` field and enabling a single configuration profile for both account and workspace operations. The file-based OAuth token cache has been removed, defaulting to an in-memory cache unless a persistent cache is explicitly provided.
The SDK now automatically detects AI coding agents and appends agent information to HTTP request headers, while also removing the unused `experimentalIsUnifiedHost` field from `DatabricksConfig`. A bug fix addresses `X-Databricks-Org-Id` header issues for `SharesExtImpl.list()` on SPOG hosts, and several API method paths have changed, which are breaking changes.
WorkspaceExt upload/download and SharesExt list now include the X-Databricks-Org-Id header for SPOG host compatibility. WorkspaceClient.get_workspace_id avoids an API call when the workspace ID is already known, fixing a SPOG host failure.
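The "avoid an API call when the workspace ID is already known" optimization is a simple memoization pattern. The sketch below is illustrative only; the class and parameter names are hypothetical, not the Databricks SDK's actual code.

```python
# Illustrative sketch of the optimization described above: reuse a known
# workspace ID instead of making an API call. Names are hypothetical.
class WorkspaceIdResolver:
    def __init__(self, known_id=None, fetch=None):
        self._id = known_id   # e.g. already parsed from the host or config
        self._fetch = fetch   # the API call, used only as a fallback
        self.api_calls = 0    # instrumentation for the example

    def get_workspace_id(self):
        if self._id is None:
            self.api_calls += 1
            self._id = self._fetch()
        return self._id
```

When the ID is supplied up front, the fallback fetch is never invoked; otherwise it is called exactly once and cached.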
The CLI now supports a --limit flag for paginated list commands and caches host metadata lookups for faster execution. The bundles feature adds support for Vector Search Endpoints and prompts before destroying Lakebase resources.
This release improves the new Datafusion TableProvider and log parsing performance, alongside numerous bug fixes. Key fixes address issues with DeltaScan schema handling, streamed merge file pruning, and incorrect row counts for DELETE operations.
This release introduces new workspace-level services for supervisor agents and Unity Catalog secrets, along with an update method for tokens. Several existing API methods for data classification, environments, knowledge assistants, Postgres, and warehouses have breaking changes due to path modifications.
This release adds support for Azure MSI authentication and consistent default profile resolution from .databrickscfg. It also fixes issues with non-JSON error responses and Databricks CLI token scope mismatches, while introducing new API methods for catalog, apps, genie, and pipelines services.
The SDK now allows installing a shared host metadata resolver globally and improves WorkspaceClient.CurrentWorkspaceID() to avoid an API call when the workspace ID is already known. Fixes include adding the X-Databricks-Org-Id header to Workspace.Download(), Workspace.Upload(), and SharesAPI.internalList() for SPOG host compatibility.
Nightly - fix/replace-softprops-action
Nightly build from fix/replace-softprops-action
This release automatically detects unified hosts for both account and workspace operations, and accepts `DATABRICKS_OIDC_TOKEN_FILEPATH` for OIDC tokens. It drops support for Python 3.8 and 3.9, requiring Python 3.10 or newer.
This release adds support for Notebook-scoped packages when submitting commands or running notebook jobs. It also includes fixes for workflow job creation, duplicate aliases in empty mode, and insert-by-name for microbatch and replace_where strategies.
Week of Apr 13
14 releases
This release fixes an error where bundle commands failed due to an expired key when downloading Terraform. Users will no longer encounter the "unable to verify checksums signature: openpgp: key expired" error.
This release fixes an error where bundle commands failed due to an expired key when downloading Terraform. Users can now run bundle commands without encountering checksum signature verification issues.
This release fixes a "key expired" error that occurred when running Databricks bundle commands. The fix resolves issues with Terraform checksum signature verification during bundle operations.
This release fixes an error where bundle commands failed due to an expired OpenPGP key during Terraform downloads. Users can now run bundle commands without encountering checksum signature verification issues.
This release fixes a "key expired" error that occurred when running Databricks bundle commands. The issue prevented Terraform downloads due to checksum signature verification failures.
This release fixes an error where bundle commands failed due to an expired key when downloading Terraform. Users will no longer encounter the "unable to verify checksums signature: openpgp: key expired" error.
This release fixes an error where bundle commands failed due to an expired key when downloading Terraform. Users can now run bundle commands without encountering checksum signature verification issues.
This release fixes an error where bundle commands failed due to an expired key when downloading Terraform. Databricks practitioners can now use bundle commands without encountering checksum signature verification issues.
This release fixes a "key expired" error when running `databricks bundle deploy` by updating the Terraform binary installation process. It now uses a hardcoded ArmoredPublicKey to resolve checksum signature verification issues.
UnityCatalog 0.4.1
The Unity Catalog Spark connector now supports atomic REPLACE TABLE AS SELECT and Dynamic Partition Overwrite for managed Delta tables, and introduces a credential-scoped file system to prevent out-of-memory errors. A critical security fix addresses a JWT issuer validation bypass, requiring new server configurations for existing deployments with authorization enabled.
This release adds new resources for managing Postgres catalogs, synced tables, and workspace base environments. It also introduces an `api` field for dual account/workspace resources to explicitly control API level, supporting unified hosts like `api.databricks.com`.
Delta Lake 4.2.0
This release enhances Unity Catalog managed tables with support for REPLACE TABLE, RTAS, Dynamic Partition Overwrite, and improved streaming read options like `startingTimestamp` and `skipChangeCommits`. It also introduces general availability for Variant columns and adds support for Geospatial and Collations table features, alongside fixes for various data skipping and DML issues.
This release raises the minimum Go version to 1.24 and introduces new features like a host metadata resolver hook and a lazy iterator with item limits. It also includes numerous bug fixes for token acquisition and caching issues, alongside extensive API additions for various Databricks services.
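A "lazy iterator with item limits" over paginated list results can be sketched in a few lines. This Python fragment is a simplified illustration of the pattern (the release itself is in Go); page fetching is simulated rather than calling a real list API.

```python
from itertools import islice

def paginate(pages):
    """Yield items lazily, one page at a time (pages stand in for API calls)."""
    for page in pages:
        yield from page

def list_with_limit(pages, limit):
    """Stop consuming pages as soon as `limit` items have been produced."""
    return list(islice(paginate(pages), limit))
```

Because `islice` stops the generator early, pages past the limit are never fetched.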
TypeScript SDK 0.2.0 RC1
Release candidate for `@mlflow/vercel` TypeScript package with version 0.2.0: https://github.com/mlflow/mlflow/pull/22105
Week of Apr 6
2 releases
MLflow 3.11.1 introduces AI-powered issue detection for agent traces, budget alerts for AI Gateway spending, and a new interactive graph view for visualizing trace hierarchies. It also enhances security with pickle-free model serialization and improves dependency management with native UV support.
Model Catalog
Per-provider model catalog files. Updated weekly by CI.
Week of Mar 30
1 release
Week of Mar 16
14 releases
This release adds a new method to force-refresh cached U2M OAuth tokens, returning an error on failure instead of falling back. It also introduces an error for when a token refresh is requested but no refresh token is available.
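The force-refresh semantics described above (the release is the Go SDK) amount to: a forced refresh propagates the failure instead of silently returning the cached token. The Python sketch below is illustrative only; the class name and method shape are assumptions.

```python
# Hypothetical illustration of force-refresh semantics, not the SDK's code.
class CachedTokenSource:
    def __init__(self, fetch):
        self._fetch = fetch   # returns a fresh token, or raises on failure
        self._token = None

    def token(self, force_refresh: bool = False) -> str:
        if force_refresh or self._token is None:
            # On a forced refresh there is no fallback to the cached value:
            # any error from the fetch is surfaced to the caller.
            self._token = self._fetch()
        return self._token
```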
The Databricks SDK for Java now includes a `disableGovTagCreation` field in both v1 and v2 of the `RestrictWorkspaceAdminsMessage` settings. This allows programmatic control over the creation of governance tags within a workspace.
The SDK now automatically detects AI coding agents and appends `agent/<name>` to HTTP request headers. New `DisableGovTagCreation` fields were added to `settings.RestrictWorkspaceAdminsMessage` and `settingsv2.RestrictWorkspaceAdminsMessage`.
The SDK now automatically detects AI coding agents and appends agent information to HTTP request headers. Two new `disable_gov_tag_creation` fields were added to the v1 and v2 restrict-workspace-admins settings messages.
* Add resource and data sources for `databricks_postgres_role`.
* Add `parentPath` field for `com.databricks.sdk.service.dashboards.GenieSpace`.
OAuth token refreshing is now proactive for tokens expiring within five minutes, returning the existing token if refresh fails but it's still valid. The `dashboards.GenieSpace` struct now includes a `ParentPath` field.
This release introduces a new `environments` service for managing Databricks environments and adds a `parent_path` field to `GenieSpace` dashboards. It also includes a `can_create_app` permission level for IAM.
The Databricks SDK for Go now supports specifying a default profile within the `[__settings__]` section of your `.databrickscfg` file. This allows for easier configuration management when working with multiple Databricks profiles.
MLflow 3.11.0rc0 introduces AI-powered issue identification for agent traces, budget alerts for AI Gateway spending, and a new interactive graph view for trace hierarchies. It also adds pickle-free model serialization, UV package manager support, and native OpenTelemetry GenAI convention support for trace export.
Release: v2.10.6 (#1858)
This release fixes an issue where the VS Code extension would not correctly handle 404 errors from the Databricks SDK. This improves stability when interacting with Databricks resources.
Databricks Jobs now support new alert-related fields in run outputs and task definitions. The SDK also introduces a new `environments` package and service for managing workspace environments, alongside a new `CAN_CREATE_APP` permission level.
This release adds new fields for job alert configurations across various job-related structs, including `RunOutput`, `RunTask`, `SubmitTask`, and `Task`. It also introduces a new `environments` package and service for managing workspace environments, alongside a `CanCreateApp` permission level.
You can now specify a `default_profile` in the `[__settings__]` section of your `.databrickscfg` file for consistent profile resolution. Several job-related objects now include new `alert_output` and `alert_task` fields.
Week of Mar 9
11 releases
python-v1.5.0: faster writes, log compaction, spill config in MERGE
This release introduces faster Delta table writes through parallel partition writers and adds log compaction for improved performance. The `MERGE` operation now supports disk spilling for larger datasets, and `get_add_actions` returns an Arrow Table instead of a RecordBatch, which is a breaking change.
This release adds new fields for defining ingestion pipelines, including connector type, data staging options, and source catalog/schema/table details. It also introduces a `subDomain` field for external function requests in the serving API.
This release adds new fields and methods across several services, including updates for ML features, pipelines, and Postgres roles. It also introduces breaking changes by making previously required fields optional in ML-related DeltaTableSource, Feature, Function, and KafkaSource configurations.
This release adds new fields for defining ingestion pipelines, including connector type, data staging options, and detailed ingestion source information. It also introduces a `sub_domain` field for external function requests in the serving API.
This release enables concurrent microbatch execution and adds an optimize() call to snapshot materializations. It also fixes issues with quoting catalog names, applying column-level tags for V1 tables, and enforcing constraints.
This release adds new fields and methods for MLflow Feature Store and Postgres role management. Several fields in MLflow Feature Store related services are now optional, which is a breaking change for some existing integrations.
This release introduces new methods for the Genie workspace service and an update role method for the Postgres service. Several fields across ML and other services are now optional, with some of these changes being breaking.
This release introduces new resources for managing Postgres databases, data classification catalog configurations, and knowledge assistant features. It also renames the `databricks_apps_space` resource to `databricks_app_space`.
This release introduces new services for Data Classification and Knowledge Assistants, accessible via `workspaceClient`. It also adds several new methods for managing Genie evaluation runs and results.
Databricks SDK for Java now allows fine-grained control over HTTP request timeouts through a new `withRequestConfig` method on `CommonsHttpClient.Builder`. This enables practitioners to configure specific timeout settings for their HTTP client requests.
Databricks CLI authentication now correctly errors on token scope mismatches, prompting re-authentication instead of silently using incorrect permissions. New `dataclassification` and `knowledge_assistants` services and corresponding workspace-level APIs have been added.
Week of Mar 2
3 releases
This release adds a "try-it" page for Gateway usage examples and filters gateway experiments out of the experiment list in the UI. It also fixes numerous UI, artifact-download, and tracing issues, including artifact downloads when workspaces are enabled and copying Unity Catalog models across workspaces.
The SDK now dynamically adjusts authentication token stale periods, extending them up to 20 minutes for standard OAuth. Users can revert to a fixed stale duration by manually setting the period via the CachedTokenSource builder.
The SDK now dynamically adjusts authentication token refresh periods based on the token's initial lifetime, potentially increasing it to 20 minutes for standard OAuth. Users can still specify a fixed stale duration in the Refreshable class constructor, with timedelta(minutes=5) matching the previous default.
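The dynamic stale-period idea above, where the refresh window scales with the token's lifetime and is capped at 20 minutes, can be sketched as follows. The 10% fraction and the 5-minute floor (the previous default) are illustrative assumptions, not the SDK's actual formula.

```python
from datetime import datetime, timedelta

MAX_STALE = timedelta(minutes=20)     # cap mentioned in the release notes
DEFAULT_STALE = timedelta(minutes=5)  # previous fixed default

def stale_period(lifetime: timedelta) -> timedelta:
    """Window before expiry in which a token counts as stale.

    The 10% scaling factor is an assumption for illustration.
    """
    dynamic = lifetime * 0.10
    return min(max(dynamic, DEFAULT_STALE), MAX_STALE)

def needs_refresh(expiry: datetime, now: datetime, lifetime: timedelta) -> bool:
    """Refresh proactively once we enter the stale window."""
    return now >= expiry - stale_period(lifetime)
```

A 24-hour token gets the 20-minute cap, while a 30-minute token falls back to the 5-minute floor.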
Week of Feb 23
3 releases
This release introduces row filter functionality and support for metric views.
Delta Lake 4.1.0
Delta Lake 4.1.0 introduces enhanced support for Unity Catalog managed tables, including atomic CTAS and conflict-free feature enablement for Deletion Vectors and Column Mapping. It also requires Java 17 and Spark 4.0.1 or higher, dropping support for Spark 3.5.
Utility clusters auto-created by resources like `databricks_aws_s3_mount` now default to `SPOT_WITH_FALLBACK` for improved reliability. Plaintext credential fields in `databricks_model_serving` and `databricks_git_credential` are now marked sensitive to prevent display in plan/apply output.
Week of Feb 16
6 releases
MLflow 3.10.0 introduces multi-workspace support for organizing experiments and models, alongside new GenAI features like multi-turn evaluation, LLM cost tracking, and AI Gateway usage analytics. The UI has been redesigned for improved navigation, including a new workflow type selector and in-UI trace evaluation capabilities.
* Mark `effective_enable_file_events` as read-only in `databricks_external_location` to prevent Terraform drift.
This release fixes an issue where multiple foreign keys between tables were not retained after an incremental run. It also resolves a bug where changes to materialized view partition_by clauses failed to apply correctly.
You can now manage Lakebase database project permissions using `database_project_name` in `databricks_permissions` and configure `node_type_flexibility` for `databricks_instance_pool` resources. A bug was fixed that previously caused errors during WorkspaceClient() creation in `databricks_grant` and `databricks_grants` resources.
This release adds new Terraform resources and data sources for managing Databricks Apps Space and Endpoints. It also updates the underlying Go SDK to version 0.108.0.
This release provides a fix for a Rust-related issue, backporting a previous pull request. Databricks users will experience improved stability due to this underlying fix.
Week of Feb 9
5 releases
UnityCatalog 0.4.0
Unity Catalog 0.4.0 introduces full support for AWS Storage Credentials and External Locations, enabling secure, governed access to S3 data via temporary, scoped credentials. It also enables automatic credential renewal by default for long-running Spark jobs and adds atomic CTAS operations for Delta tables.
The `databricks_workspace_file` resource now supports payloads larger than 10MB, and `databricks_mws_storage_configurations` includes a `role_arn` field for S3 bucket sharing with Unity Catalog. Several bug fixes address issues with `databricks_mws_ncc_private_endpoint_rule` updates, `databricks_secret_acl` management, `databricks_app` resource reading after external deletion, and `databricks_users` data source `extra_attributes` parameter behavior.
MLflow now supports multi-workspace environments for organizing experiments and resources, alongside a new top-level navigation split for GenAI and Classical ML workflows. Key new features include multi-turn conversation simulation, automatic LLM trace cost tracking, AI Gateway usage analytics, and a CLI command to generate a demo MLflow environment.
This release introduces a session-first DataFusion integration and exposes deletion vector (DV) metadata as Arrow streams. It also fixes issues with schema merge appends for generated columns and improves parquet predicate pushdown.
This release fixes issues with file statistics alignment during Parquet reads and resolves DML predicates against the correct execution scan schema. It also updates the asserted nullability in the DataValidation output schema.
Week of Feb 2
1 release
Week of Jan 26
3 releases
Release: v2.10.5 (#1834)
- Update Databricks CLI to v0.286.0
v3.9.0
MLflow 3.9.0 introduces an in-product MLflow Assistant chatbot and a Trace Overview Dashboard for GenAI experiments, enhancing debugging and performance insights. The AI Gateway is revamped to be part of the tracking server, offering new features like passthrough endpoints and traffic splits, alongside new LLM judge capabilities for online monitoring and custom prompt building.
This release enables deletion vector features for Delta tables and improves logical planning for update and delete operations. It also fixes issues with reporting failed data in data checks and supporting user names in Azure URLs.
Week of Jan 19
2 releases
This release adds new resources for account user settings, default warehouse overrides, and fixes issues with importing databricks_share and creating databricks_dashboard resources. The exporter now supports additional network policy resources and rewrites cloud-specific attributes in cluster policies.
This release updates the dbt-core pin for the 1.10.latest version. No new user-facing features, fixes, or breaking changes are included.
Week of Jan 12
6 releases
MLflow 3.9.0rc0 introduces an in-product AI Assistant for debugging and a new Trace Overview Dashboard for GenAI experiments. The AI Gateway is now integrated into the tracking server, and users can configure LLM judges for online monitoring and build custom judges directly in the UI.
Delta Lake 4.0.1
The "managed table" feature is renamed to `catalogManaged` and its associated Unity Catalog table ID property is updated, which is a breaking change. Unity Catalog now supports OAuth authentication for catalogs and creating UC-managed Delta tables where UC is the source of truth for table properties.
This release enables new table provider integration for query building and data sinks, along with migrating table scans. It also fixes an issue where deleting from an empty table would fail and improves data validation for deletion vectors.
This release updates internal dbt-common and dbt-adapter dependency pins for the 1.10.x series. There are no new user-facing features, fixes, or breaking changes in this version.
This release updates the dbt-core dependency pin. No user-facing features, fixes, or breaking changes are included.
This release adds a query-id to SQLQueryStatus for improved tracking. It also fixes an issue where hard_deletes incorrectly invalidated active records in snapshots and addresses a serverless Python model environment version configuration bug.
Week of Jan 5
2 releases
python-v1.3.1: read support deletion vectors, column mapping
This release introduces read support for Delta tables utilizing deletion vectors and column mapping. It also includes performance improvements for table scans and predicate pushdown into Parquet.
This release adds support for multiple constraints at once, generates Symlink Manifests for external engines, and enables GCS auto-registration. It also includes fixes for schema evolution in merge operations, improved error reporting, and better handling of empty tables.
Week of Dec 29, 2025
1 release
Week of Dec 15, 2025
2 releases
This release updates the dbt-core upper bound, enabling compatibility with dbt-core version 1.10.16. This allows Databricks users to leverage the latest dbt-core 1.10.x features and fixes.
Release: v2.10.4 (#1821)
- Update Databricks CLI to v0.280.0
Week of Dec 8, 2025
2 releases
UnityCatalog 0.3.1
The Unity Catalog Spark connector now supports automatic credential renewal for S3, Azure, and GCS, and introduces OAuth authentication for seamless token management. This release also adds experimental support for UC-managed Delta tables, where Unity Catalog coordinates storage and commits.
This release fixes an issue where default query tag values were not properly escaped or truncated. Databricks practitioners will now see correct handling of these values in their dbt projects.
Week of Sep 29, 2025
1 release
Week of Aug 25, 2025
2 releases
The `create-missing-principals` functionality now handles exceptions when no UC roles are present, and you can optionally skip workflow assessment during installation. Group fetching and grant assertion retries have been improved for better consistency and reliability.
Release: v2.10.3 (#1772)
This release updates the Databricks CLI to v0.266.0, which includes breaking changes. Databricks practitioners should review the CLI release notes for details on these changes.
Week of Aug 4, 2025
1 release
Week of Jul 14, 2025
2 releases
UnityCatalog 0.3.0
Unity Catalog 0.3.0 now supports Spark 4.0 and Delta Lake 4.0, and introduces new API surfaces for credentials and external locations to enhance external storage handling. This release also includes a new Helm chart for Kubernetes deployments and fixes for time-travel on Delta tables.
Release: v2.10.2 (#1738)
- Update Databricks CLI to v0.259.0
Week of Jun 30, 2025
1 release
Week of Jun 23, 2025
1 release
Week of Jun 9, 2025
1 release
Week of May 26, 2025
1 release
Week of May 19, 2025
1 release
Week of May 5, 2025
2 releases
UCX now requires matching account groups to be created before assessment and clarifies Service Principal setup for installation. It also fixes table migration when a default catalog is set and pauses the migration progress workflow schedule by default.
Delta Lake 3.3.1
This release fixes an issue allowing user-specified schema on read if consistent with the table schema. It also includes a kernel fix for handling non-uniform value types in map[string, string] within Delta commit files.
Week of Apr 21, 2025
2 releases
Week of Apr 14, 2025
1 release
Week of Mar 31, 2025
1 release
Week of Mar 10, 2025
1 release
Week of Mar 3, 2025
1 release
Week of Feb 24, 2025
2 releases
This release introduces a new CLI command and documentation for migrating Delta Live Tables pipelines to Unity Catalog, including options to include or exclude specific pipelines. It also adds support for MSSQL and PostgreSQL databases to the Hive Metastore Federation feature, allowing these external metastores to be mirrored as Unity Catalog catalogs.
Unity Catalog AI 0.2.0
This release introduces new integrations for Gemini and LiteLLM, enabling Unity Catalog functions as tools for these models. The Databricks client now exclusively supports serverless endpoints, adds new APIs for function wrapping, and supports `requirements`, `environment_version`, and `Variant` types for improved dependency management and data handling.
Week of Jan 20, 2025
2 releases
This release introduces new documentation for UCX, accessible at databrickslabs.github.io/ucx/. Internal release process security was enhanced by moving the release job to a protected hosted runner group.
UCX now supports Databricks Runtime 16+ for Hive Metastore table conversions and introduces a new `query_statement_disposition` option for SQL backend exports to handle large workspaces. Pipeline migration workflows are enhanced with `include_pipeline_ids` for more granular control, and a daily scheduled migration progress workflow is added.
Week of Jan 6, 2025
2 releases
Delta Lake 3.3.0
Delta Lake 3.3.0 introduces Identity Columns, faster VACUUM LITE, and the ability to enable Row Tracking on existing tables for row-level lineage. It also allows enabling UniForm Iceberg on existing tables without data rewrite and supports reading tables with Type Widening in Delta Kernel.
Unity Catalog AI 0.1.0
This initial release introduces Unity Catalog AI, providing a core client for managing and executing Unity Catalog functions as GenAI tools. It includes integration packages for popular AI frameworks like LangChain, LlamaIndex, OpenAI, Anthropic, CrewAI, and AutoGen, enabling seamless use of UC functions within these applications.