Add support for configurable catalog/schema for dashboards #4130
Conversation
```
RecordRequests = true

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["direct"]
```
Can you add a comment re terraform, why is it not tested there? Is it not implemented there?
hey! the TF changes are in but have not been released yet. was chatting with @pietern who mentioned that we were unblocked for the direct deployment mode and we can implement the changes there first.
I'll add a comment in the test.
```go
WarehouseId:         dashboard.WarehouseId,
SerializedDashboard: dashboard.SerializedDashboard,
ParentPath:          ensureWorkspacePrefix(dashboard.ParentPath),
DatasetCatalog:      "",
```
Is it not possible to read these?
In general, we need to be able to read these fields to identify whether they were modified outside of DABs.
APIs that behave like this are likely not compliant with AIP.
I heard back from @pietern that we detect diff here via etags since the etag changes if these are modified. Can you please add a comment saying as much so future readers are also aware?
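The etag-based drift detection described above can be sketched roughly as follows. This is an illustrative sketch, not the actual CLI code; the type and function names are hypothetical:

```go
package main

import "fmt"

// Dashboard mirrors the subset of the API response relevant here. Fields like
// DatasetCatalog are write-only: the GET API returns them empty, so they cannot
// be compared directly. The Etag, however, changes whenever anything on the
// dashboard changes, including those write-only fields.
type Dashboard struct {
	Etag           string
	DatasetCatalog string // always "" when read back
}

// modifiedOutsideDABs reports whether the remote dashboard drifted from the
// state recorded at the last deploy, using only the etag.
func modifiedOutsideDABs(stateEtag string, remote Dashboard) bool {
	return remote.Etag != stateEtag
}

func main() {
	remote := Dashboard{Etag: "v2", DatasetCatalog: ""}
	fmt.Println(modifiedOutsideDABs("v1", remote)) // drifted: etags differ
	fmt.Println(modifiedOutsideDABs("v2", remote)) // unchanged
}
```

This is why the empty `DatasetCatalog` read above is tolerable: the etag carries the modification signal that the field itself cannot.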
good call, done 👍
- Add Terraform conversion test for dataset_catalog and dataset_schema
- Update acceptance tests to run for both direct and terraform engines
- Fix direct engine to skip write-only fields in remote diff computation
- Add engine-specific expected output files for requests and plans
- Update test script to generate engine-specific request files
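The write-only-field fix in that list can be sketched as below. This is an assumption-laden illustration (a map-based diff with hypothetical field names), not the actual direct-engine implementation:

```go
package main

import "fmt"

// writeOnlyFields lists dashboard fields that the GET API never echoes back
// (field names are illustrative).
var writeOnlyFields = map[string]bool{
	"dataset_catalog": true,
	"dataset_schema":  true,
}

// diffKeys returns the keys whose values differ between the local config and
// the remote object, skipping write-only fields: the remote always reports ""
// for those, so comparing them would produce a false diff on every deploy.
func diffKeys(local, remote map[string]string) []string {
	var changed []string
	for k, lv := range local {
		if writeOnlyFields[k] {
			continue
		}
		if rv, ok := remote[k]; !ok || rv != lv {
			changed = append(changed, k)
		}
	}
	return changed
}

func main() {
	local := map[string]string{"display_name": "Great Dashboard", "dataset_catalog": "catalog"}
	remote := map[string]string{"display_name": "Great Dashboard", "dataset_catalog": ""}
	fmt.Println(diffKeys(local, remote)) // prints [] - no spurious diff
}
```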
An authorized user can trigger integration tests manually by following the instructions below:
Trigger:
Inputs:
Checks will be approved automatically on success.
Commit: ca21bfe
23 interesting tests: 20 KNOWN, 2 FAIL, 1 SKIP
Top 38 slowest tests (at least 2 minutes):
Integration test failures are unrelated.
Commit: b6349a4
39 interesting tests: 20 KNOWN, 16 FAIL, 2 flaky, 1 SKIP
Top 50 slowest tests (at least 2 minutes):
## Release v0.281.0

### CLI
* Fix lakeview publish to default `embed_credentials` to false ([#4066](#4066))

### Bundles
* Add support for configurable catalog/schema for dashboards ([#4130](#4130))
* Pass SYSTEM_ACCESSTOKEN from env to the Terraform provider ([#4135](#4135))
* `bundle deployment migrate`: when running `bundle plan`, propagate `-var` arguments.
* engine/direct: New `--plan` option for `bundle deploy` to deploy a previously saved plan (saved with `bundle plan -o json`) ([#4134](#4134))
* engine/direct: Fix dependency-ordered deletion by persisting `depends_on` in state ([#4105](#4105))

### Dependency updates
* Upgrade Go SDK to 0.94.0 ([#4148](#4148))
* Upgrade Terraform provider to 1.100.0 ([#4150](#4150))
Hi @leonli33-databricks, nice feature! My problem is that it doesn't work using Databricks CLI v0.282.0 on Mac M4. After

My databricks.yml looks like this:

```yaml
resources:
  jobs:
    ...
    tasks:
      ...
  dashboards:
    dashboard:
      display_name: "Great Dashboard"
      dataset_catalog: "catalog"
      dataset_schema: "schema"
```

How to analyze/fix? Any kind of support is highly appreciated! Best regards!
hey @hafeja! the error is saying that the default catalog and schema you used do not have the asset.

also, it may be quicker to do this on Slack; feel free to message me (Leon Li).
Hi @leonli33-databricks, due to data protection, I cannot share the exact values.

Dataset query excerpt:

As soon as

Is there any way to debug the issue? Any other ideas?

In the output of

Another strange behavior is that some VS Code linter complains (red line):


Changes

Support passing the `dataset_catalog` and `dataset_schema` fields for dashboards.

Example:
Tests
New acceptance tests.