### Configuration
```hcl
resource "databricks_sql_endpoint" "custom" {
  name           = "custom_endpoint"
  cluster_size   = "2X-Small"
  auto_stop_mins = 20
  warehouse_type = "PRO"
}
```

### Expected Behavior
One of our team members manually changed the maximum and minimum number of clusters to 4 through the UI (both had been left at their default of 1 when the endpoint was created by Terraform).
On the next run of Terraform, I would expect both values to be reset to their defaults.
### Actual Behavior
On the next run of Terraform, `max_num_clusters` is reset to its default of 1, but `min_num_clusters` is not:
```
# databricks_sql_endpoint.custom will be updated in-place
~ resource "databricks_sql_endpoint" "custom" {
      id               = "{endpoint_id}"
    ~ max_num_clusters = 4 -> 1
      name             = "custom_endpoint"
      # (15 unchanged attributes hidden)
      # (1 unchanged block hidden)
  }
```

Therefore, the subsequent `terraform apply` fails, presumably because `max_num_clusters` is not allowed to be set below the current `min_num_clusters`:

```
Error: cannot update sql endpoint: 1 is not a valid value for max_num_clusters. The value must be greater than or equal to 4, and less than or equal to 40.
```
When I add `min_num_clusters` to the config explicitly, everything works fine:

```hcl
resource "databricks_sql_endpoint" "custom" {
  name             = "custom_endpoint"
  cluster_size     = "2X-Small"
  min_num_clusters = 1
  auto_stop_mins   = 20
  warehouse_type   = "PRO"
}
```

But I think it's unexpected that `max_num_clusters` and `min_num_clusters` behave differently here: the former drifts back to its default when unset, while the latter silently keeps the value set through the UI.
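As a side note, a sketch of an alternative approach (untested, and not what this issue is about): if the manual UI changes should be tolerated rather than reverted, both attributes could be excluded from drift detection with Terraform's standard `lifecycle.ignore_changes` argument instead of being pinned in the config.

```hcl
resource "databricks_sql_endpoint" "custom" {
  name           = "custom_endpoint"
  cluster_size   = "2X-Small"
  auto_stop_mins = 20
  warehouse_type = "PRO"

  lifecycle {
    # Let values changed through the UI stand; Terraform will not
    # plan an update for these two attributes.
    ignore_changes = [min_num_clusters, max_num_clusters]
  }
}
```

This does not fix the asymmetry reported here; it only avoids triggering it when out-of-band edits are expected.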
### Terraform and provider versions

```
$ terraform version
Terraform v1.14.3
on linux_amd64
```

Provider version is v1.100.0.