Commit

Merge branch 'main' into resourceGrantsHyphens

ledbutter authored Feb 22, 2024
2 parents 49e2d1e + 9ec61f2 commit 88d1245
Showing 81 changed files with 305 additions and 466 deletions.
6 changes: 4 additions & 2 deletions .github/ISSUE_TEMPLATE/docs-issue.md
@@ -16,10 +16,12 @@ This template is for both adding enhancement as well as pointing out issues with
### Expected Details
<!-- What are you expecting from the section that has an issue. If the section is missing anything that should be expected please point that out. -->

### List of things to potentially add/remove:
### List of things to potentially add/remove

This is a list of things to manipulate in the docs:

- [ ] First item to change
- [ ] Second item to change
- [ ] Second item to change

### Important Factoids
<!-- Any links to external documentation that may prove your case, i.e Databricks public docs or Terraform public docs. -->
1 change: 0 additions & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -11,4 +11,3 @@ How is this tested? Please see the checklist below and also describe any other r
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK

11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,16 @@
# Version changelog

## 1.37.1

### New Features and Improvements
* Removed `CustomizeDiff` and client-side validation for [databricks_grants](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grants) ([#3290](https://github.com/databricks/terraform-provider-databricks/pull/3290)).
* Added Terraform support for the restrict workspace admins setting ([#3243](https://github.com/databricks/terraform-provider-databricks/pull/3243)).

### Internal Changes
* Migrated [databricks_global_init_script](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/global_init_script) to Go SDK ([#2036](https://github.com/databricks/terraform-provider-databricks/pull/2036)).
* Bump github.com/hashicorp/terraform-plugin-sdk/v2 from 2.31.0 to 2.32.0 ([#3177](https://github.com/databricks/terraform-provider-databricks/pull/3177)).


## 1.37.0

### New Features and Improvements
4 changes: 2 additions & 2 deletions README.md
@@ -167,7 +167,7 @@ To make Databricks Terraform Provider generally available, we've moved it from [

You should have [`.terraform.lock.hcl`](https://github.com/databrickslabs/terraform-provider-databricks/blob/v0.6.2/scripts/versions-lock.hcl) file in your state directory that is checked into source control. terraform init will give you the following warning.

```
```text
Warning: Additional provider information from registry
The remote registry returned warnings for registry.terraform.io/databrickslabs/databricks:
@@ -178,6 +178,6 @@ After you replace `databrickslabs/databricks` with `databricks/databricks` in the

If you didn't check in [`.terraform.lock.hcl`](https://www.terraform.io/language/files/dependency-lock#lock-file-location) to source code version control, you may see a `Failed to install provider` error. Please follow the simple steps described in the [troubleshooting guide](docs/guides/troubleshooting.md).

```
```text
Warning: Exporter is experimental and provided as is. It has an evolving interface, which may change or be removed in future versions of the provider.
```
1 change: 1 addition & 0 deletions catalog/permissions/permissions.go
@@ -92,6 +92,7 @@ func (sm SecurableMapping) KeyValue(d attributeGetter) (string, string) {
}
return field, v
}
log.Printf("[WARN] Unexpected resource or permissions. Please proceed at your own risk.")
return "unknown", "unknown"
}
func (sm SecurableMapping) Id(d *schema.ResourceData) string {
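The new `[WARN]` log makes the fallback in `SecurableMapping.KeyValue` visible when none of the mapped securable fields is set on the resource. A minimal, self-contained sketch of that lookup-with-fallback pattern (the `fieldGetter` interface, `resourceData` type, and `securables` map below are illustrative stand-ins, not the provider's actual types):

```go
package main

import (
	"fmt"
	"log"
)

// fieldGetter stands in for the provider's attributeGetter: anything that
// exposes schema values by key, such as *schema.ResourceData.
type fieldGetter interface {
	Get(key string) any
}

type resourceData map[string]any

func (r resourceData) Get(key string) any { return r[key] }

// securables mirrors the shape of the mapping: securable kind -> allowed privileges.
var securables = map[string]map[string]bool{
	"table":  {"SELECT": true, "MODIFY": true},
	"schema": {"USE_SCHEMA": true, "CREATE_TABLE": true},
}

// keyValue returns the first securable field that is set on the resource,
// logging a warning and returning sentinels when nothing matches.
func keyValue(d fieldGetter) (string, string) {
	for field := range securables {
		v, _ := d.Get(field).(string)
		if v == "" {
			continue
		}
		return field, v
	}
	log.Printf("[WARN] Unexpected resource or permissions. Please proceed at your own risk.")
	return "unknown", "unknown"
}

func main() {
	fmt.Println(keyValue(resourceData{"table": "main.default.events"})) // table main.default.events
	fmt.Println(keyValue(resourceData{"pipeline": "not-mapped"}))       // unknown unknown, plus a warning
}
```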
184 changes: 5 additions & 179 deletions catalog/resource_grants.go
@@ -89,171 +89,6 @@ func replaceAllPermissions(a permissions.UnityCatalogPermissionsAPI, securable s
})
}

type securableMapping map[string]map[string]bool

// reuse ResourceDiff and ResourceData
type attributeGetter interface {
Get(key string) any
}

func (sm securableMapping) kv(d attributeGetter) (string, string) {
for field := range sm {
v := d.Get(field).(string)
if v == "" {
continue
}
return field, v
}
return "unknown", "unknown"
}

func (sm securableMapping) id(d *schema.ResourceData) string {
securable, name := sm.kv(d)
return fmt.Sprintf("%s/%s", securable, name)
}

func (sm securableMapping) validate(d attributeGetter, pl PermissionsList) error {
securable, _ := sm.kv(d)
allowed, ok := sm[securable]
if !ok {
return fmt.Errorf(`%s is not fully supported yet`, securable)
}
for _, v := range pl.Assignments {
for _, priv := range v.Privileges {
if !allowed[strings.ToUpper(priv)] {
// check if user uses spaces instead of underscores
if allowed[strings.ReplaceAll(priv, " ", "_")] {
return fmt.Errorf(`%s is not allowed on %s. Did you mean %s?`, priv, securable, strings.ReplaceAll(priv, " ", "_"))
}
return fmt.Errorf(`%s is not allowed on %s`, priv, securable)
}
}
}
return nil
}

var mapping = securableMapping{
// add other securable mappings once needed
"table": {
"MODIFY": true,
"SELECT": true,

// v1.0
"ALL_PRIVILEGES": true,
"APPLY_TAG": true,
"BROWSE": true,
},
"catalog": {
"CREATE": true,
"USAGE": true,

// v1.0
"ALL_PRIVILEGES": true,
"APPLY_TAG": true,
"USE_CATALOG": true,
"USE_SCHEMA": true,
"CREATE_SCHEMA": true,
"CREATE_TABLE": true,
"CREATE_FUNCTION": true,
"CREATE_MATERIALIZED_VIEW": true,
"CREATE_MODEL": true,
"CREATE_VOLUME": true,
"READ_VOLUME": true,
"WRITE_VOLUME": true,
"EXECUTE": true,
"MODIFY": true,
"SELECT": true,
"REFRESH": true,
"BROWSE": true,
},
"schema": {
"CREATE": true,
"USAGE": true,

// v1.0
"ALL_PRIVILEGES": true,
"APPLY_TAG": true,
"USE_SCHEMA": true,
"CREATE_TABLE": true,
"CREATE_FUNCTION": true,
"CREATE_MATERIALIZED_VIEW": true,
"CREATE_MODEL": true,
"CREATE_VOLUME": true,
"READ_VOLUME": true,
"WRITE_VOLUME": true,
"EXECUTE": true,
"MODIFY": true,
"SELECT": true,
"REFRESH": true,
"BROWSE": true,
},
"storage_credential": {
"CREATE_TABLE": true,
"READ_FILES": true,
"WRITE_FILES": true,
"CREATE_EXTERNAL_LOCATION": true,

// v1.0
"ALL_PRIVILEGES": true,
"CREATE_EXTERNAL_TABLE": true,
},
"external_location": {
"CREATE_TABLE": true,
"READ_FILES": true,
"WRITE_FILES": true,

// v1.0
"ALL_PRIVILEGES": true,
"CREATE_EXTERNAL_TABLE": true,
"CREATE_MANAGED_STORAGE": true,
"CREATE_EXTERNAL_VOLUME": true,
"BROWSE": true,
},
"metastore": {
// v1.0
"CREATE_CATALOG": true,
"CREATE_CLEAN_ROOM": true,
"CREATE_CONNECTION": true,
"CREATE_EXTERNAL_LOCATION": true,
"CREATE_STORAGE_CREDENTIAL": true,
"CREATE_SHARE": true,
"CREATE_RECIPIENT": true,
"CREATE_PROVIDER": true,
"MANAGE_ALLOWLIST": true,
"USE_CONNECTION": true,
"USE_PROVIDER": true,
"USE_SHARE": true,
"USE_RECIPIENT": true,
"USE_MARKETPLACE_ASSETS": true,
"SET_SHARE_PERMISSION": true,
},
"function": {
"ALL_PRIVILEGES": true,
"EXECUTE": true,
},
"model": {
"ALL_PRIVILEGES": true,
"APPLY_TAG": true,
"EXECUTE": true,
},
"share": {
"SELECT": true,
},
"volume": {
"ALL_PRIVILEGES": true,
"READ_VOLUME": true,
"WRITE_VOLUME": true,
},
// avoid reserved field
"foreign_connection": {
"ALL_PRIVILEGES": true,
"CREATE_FOREIGN_CATALOG": true,
"CREATE_FOREIGN_SCHEMA": true,
"CREATE_FOREIGN_TABLE": true,
"USE_CONNECTION": true,
},
}

func (pl PermissionsList) toSdkPermissionsList() (out catalog.PermissionsList) {
for _, v := range pl.Assignments {
privileges := []catalog.Privilege{}
@@ -294,30 +129,21 @@ func ResourceGrants() common.Resource {
s := common.StructToSchema(PermissionsList{},
func(s map[string]*schema.Schema) map[string]*schema.Schema {
alof := []string{}
for field := range mapping {
for field := range permissions.Mappings {
s[field] = &schema.Schema{
Type: schema.TypeString,
ForceNew: true,
Optional: true,
}
alof = append(alof, field)
}
for field := range mapping {
for field := range permissions.Mappings {
s[field].AtLeastOneOf = alof
}
return s
})
return common.Resource{
Schema: s,
CustomizeDiff: func(ctx context.Context, d *schema.ResourceDiff) error {
if d.Id() == "" {
// unfortunately we cannot do validation before dependent resources exist with tfsdkv2
return nil
}
var grants PermissionsList
common.DiffToStructPointer(d, s, &grants)
return mapping.validate(d, grants)
},
Create: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
w, err := c.WorkspaceClient()
if err != nil {
Expand All @@ -329,17 +155,17 @@ func ResourceGrants() common.Resource {
}
var grants PermissionsList
common.DataToStructPointer(d, s, &grants)
securable, name := mapping.kv(d)
err = mapping.validate(d, grants)
err = mapping.validate(d, grants)

Check failure on line 158 in catalog/resource_grants.go (GitHub Actions / compute_diff, tests): undefined: mapping
if err != nil {
return err
}
securable, name := permissions.Mappings.KeyValue(d)
unityCatalogPermissionsAPI := permissions.NewUnityCatalogPermissionsAPI(ctx, c)
err = replaceAllPermissions(unityCatalogPermissionsAPI, securable, name, grants.toSdkPermissionsList())
if err != nil {
return err
}
d.SetId(mapping.id(d))
d.SetId(permissions.Mappings.Id(d))
return nil
},
Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
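With the securable mapping and its `validate` helper removed from this file, the client-side privilege check, including the hint that suggests the underscore form when a privilege is written with spaces, now sits alongside the shared `permissions` mapping. A standalone sketch of that check, reconstructed from the deleted code above purely for illustration (`allowedPrivileges` is a made-up subset, not the full allow-list):

```go
package main

import (
	"fmt"
	"strings"
)

// allowedPrivileges is an illustrative subset of the per-securable allow-list.
var allowedPrivileges = map[string]map[string]bool{
	"table":             {"SELECT": true, "MODIFY": true, "ALL_PRIVILEGES": true},
	"external_location": {"CREATE_EXTERNAL_TABLE": true, "READ_FILES": true, "WRITE_FILES": true},
}

// validatePrivileges mirrors the removed validate helper: every privilege must be
// on the allow-list for the securable, and a space-separated spelling gets a
// "did you mean" hint pointing at the underscore form.
func validatePrivileges(securable string, privileges []string) error {
	allowed, ok := allowedPrivileges[securable]
	if !ok {
		return fmt.Errorf("%s is not fully supported yet", securable)
	}
	for _, priv := range privileges {
		if allowed[strings.ToUpper(priv)] {
			continue
		}
		if allowed[strings.ReplaceAll(priv, " ", "_")] {
			return fmt.Errorf("%s is not allowed on %s. Did you mean %s?",
				priv, securable, strings.ReplaceAll(priv, " ", "_"))
		}
		return fmt.Errorf("%s is not allowed on %s", priv, securable)
	}
	return nil
}

func main() {
	fmt.Println(validatePrivileges("table", []string{"SELECT"}))         // <nil>
	fmt.Println(validatePrivileges("table", []string{"ALL PRIVILEGES"})) // did-you-mean hint
	fmt.Println(validatePrivileges("share", []string{"SELECT"}))         // not fully supported yet
}
```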
49 changes: 0 additions & 49 deletions catalog/resource_grants_test.go
@@ -358,31 +358,6 @@ func TestGrantReadMalformedId(t *testing.T) {
}.ExpectError(t, "ID must be two elements split by `/`: foo.bar")
}

type data map[string]string

func (a data) Get(k string) any {
return a[k]
}

func TestMappingUnsupported(t *testing.T) {
d := data{"nothing": "here"}
err := mapping.validate(d, PermissionsList{})
assert.EqualError(t, err, "unknown is not fully supported yet")
}

func TestInvalidPrivilege(t *testing.T) {
d := data{"table": "me"}
err := mapping.validate(d, PermissionsList{
Assignments: []PrivilegeAssignment{
{
Principal: "me",
Privileges: []string{"EVERYTHING"},
},
},
})
assert.EqualError(t, err, "EVERYTHING is not allowed on table")
}

func TestPermissionsList_Diff_ExternallyAddedPrincipal(t *testing.T) {
diff := diffPermissions(
catalog.PermissionsList{ // config
@@ -600,30 +575,6 @@ func TestShareGrantUpdate(t *testing.T) {
}.ApplyNoError(t)
}

func TestPrivilegeWithSpace(t *testing.T) {
d := data{"table": "me"}
err := mapping.validate(d, PermissionsList{
Assignments: []PrivilegeAssignment{
{
Principal: "me",
Privileges: []string{"ALL PRIVILEGES"},
},
},
})
assert.EqualError(t, err, "ALL PRIVILEGES is not allowed on table. Did you mean ALL_PRIVILEGES?")

d = data{"external_location": "me"}
err = mapping.validate(d, PermissionsList{
Assignments: []PrivilegeAssignment{
{
Principal: "me",
Privileges: []string{"CREATE TABLE"},
},
},
})
assert.EqualError(t, err, "CREATE TABLE is not allowed on external_location. Did you mean CREATE_TABLE?")
}

func TestConnectionGrantCreate(t *testing.T) {
qa.ResourceFixture{
Fixtures: []qa.HTTPFixture{
2 changes: 1 addition & 1 deletion common/version.go
@@ -3,7 +3,7 @@ package common
import "context"

var (
version = "1.37.0"
version = "1.37.1"
// ResourceName is resource name without databricks_ prefix
ResourceName contextKey = 1
// Provider is the current instance of provider
2 changes: 1 addition & 1 deletion docs/data-sources/aws_bucket_policy.md
@@ -91,6 +91,6 @@ In addition to all arguments above, the following attributes are exported:
The following resources are used in the same context:

* [Provisioning AWS Databricks E2 with a Hub & Spoke firewall for data exfiltration protection](../guides/aws-e2-firewall-hub-and-spoke.md) guide.
* [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide
* [End to end workspace management](../guides/workspace-management.md) guide
* [databricks_instance_profile](../resources/instance_profile.md) to manage AWS EC2 instance profiles that users can launch [databricks_cluster](../resources/cluster.md) and access data, like [databricks_mount](../resources/mount.md).
* [databricks_mount](../resources/mount.md) to [mount your cloud storage](https://docs.databricks.com/data/databricks-file-system.html#mount-object-storage-to-dbfs) on `dbfs:/mnt/name`.
2 changes: 1 addition & 1 deletion docs/data-sources/cluster.md
@@ -56,7 +56,7 @@ This data source exports the following attributes:

The following resources are often used in the same context:

* [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide.
* [End to end workspace management](../guides/workspace-management.md) guide.
* [databricks_cluster](../resources/cluster.md) to create [Databricks Clusters](https://docs.databricks.com/clusters/index.html).
* [databricks_cluster_policy](../resources/cluster_policy.md) to create a [databricks_cluster](../resources/cluster.md) policy, which limits the ability to create clusters based on a set of rules.
* [databricks_instance_pool](../resources/instance_pool.md) to manage [instance pools](https://docs.databricks.com/clusters/instance-pools/index.html) to reduce [cluster](../resources/cluster.md) start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
4 changes: 2 additions & 2 deletions docs/data-sources/clusters.md
@@ -40,10 +40,10 @@ This data source exports the following attributes:

The following resources are used in the same context:

* [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide.
* [End to end workspace management](../guides/workspace-management.md) guide.
* [databricks_cluster](../resources/cluster.md) to create [Databricks Clusters](https://docs.databricks.com/clusters/index.html).
* [databricks_cluster_policy](../resources/cluster_policy.md) to create a [databricks_cluster](../resources/cluster.md) policy, which limits the ability to create clusters based on a set of rules.
* [databricks_instance_pool](../resources/instance_pool.md) to manage [instance pools](https://docs.databricks.com/clusters/instance-pools/index.html) to reduce [cluster](../resources/cluster.md) start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
* [databricks_job](../resources/job.md) to manage [Databricks Jobs](https://docs.databricks.com/jobs.html) to run non-interactive code in a [databricks_cluster](../resources/cluster.md).
* [databricks_library](../resources/library.md) to install a [library](https://docs.databricks.com/libraries/index.html) on [databricks_cluster](../resources/cluster.md).
* [databricks_pipeline](../resources/pipeline.md) to deploy [Delta Live Tables](https://docs.databricks.com/data-engineering/delta-live-tables/index.html).
* [databricks_pipeline](../resources/pipeline.md) to deploy [Delta Live Tables](https://docs.databricks.com/data-engineering/delta-live-tables/index.html).
2 changes: 1 addition & 1 deletion docs/data-sources/current_config.md
@@ -51,7 +51,7 @@ Data source exposes the following attributes:

The following resources are used in the same context:

* [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide
* [End to end workspace management](../guides/workspace-management.md) guide
* [databricks_directory](../resources/directory.md) to manage directories in [Databricks Workpace](https://docs.databricks.com/workspace/workspace-objects.html).
* [databricks_notebook](../resources/notebook.md) to manage [Databricks Notebooks](https://docs.databricks.com/notebooks/index.html).
* [databricks_repo](../resources/repo.md) to manage [Databricks Repos](https://docs.databricks.com/repos.html).