»Validation

The framework can return diagnostics feedback for values in provider, resource, and data source configurations. This allows you to write validations that give users feedback about required syntax, types, and acceptable values.

Note: When implementing validation logic, configuration values may be unknown based on the source of the value. Implementations must account for this case, typically by returning early without returning new diagnostics.

»Default Terraform CLI Validation

The Terraform configuration language is declarative and an implementation of HashiCorp Configuration Language (HCL). The Terraform CLI is responsible for reading and parsing configurations for validity, based on Terraform's concepts such as resource blocks and associated syntax. The Terraform CLI automatically handles basic validation of value type and behavior information based on the provider, resource, or data source schema. For example, the Terraform CLI returns an error when a string value is given where a list value is expected and also when a required attribute is missing from a configuration.

Terraform CLI syntax and basic schema checks occur during the terraform apply, terraform destroy, terraform plan, and terraform validate commands. Any additional validation you define with the framework occurs directly after these checks are complete.
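
For example, given a schema such as the following sketch (the attribute name is illustrative), Terraform CLI alone rejects a configuration that omits the required attribute or assigns it a non-string value, before any framework-defined validation runs:

// A minimal sketch of a schema with a hypothetical "name" attribute.
// Terraform CLI enforces the Required and Type information below without
// any additional validation code in the provider.
tfsdk.Schema{
    Attributes: map[string]tfsdk.Attribute{
        "name": {
            Type:     types.StringType,
            Required: true,
        },
    },
}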

»Attribute Validation

You can introduce validation on attributes using the generic framework-defined types such as types.String. To do this, supply the tfsdk.Attribute type Validators field with a list of validations, and the framework will return diagnostics from all validators. For example:

// Typically within the tfsdk.Schema returned by GetSchema() for a provider,
// resource, or data source.
tfsdk.Attribute{
    // ... other Attribute configuration ...

    Validators: []tfsdk.AttributeValidator{
        // These are example validators from terraform-plugin-framework-validators
        stringvalidator.LengthBetween(10, 256),
        stringvalidator.RegexMatches(
            regexp.MustCompile(`^[a-z0-9]+$`),
            "must contain only lowercase alphanumeric characters",
        ),
    },
}

All validators in the slice will always be run, regardless of whether previous validators returned an error or not.

»Common Use Case Attribute Validators

You can implement attribute validators from the terraform-plugin-framework-validators Go module, which contains validation handling for many common use cases such as string contents and integer ranges.
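
For example, the stringvalidator validators shown earlier come from that module and can be imported alongside the standard library regexp package used by RegexMatches:

import (
    "regexp"

    "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator"
)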

»Creating Attribute Validators

If terraform-plugin-framework-validators does not include an attribute validator that meets a specific use case, you can create a provider-defined attribute validator.

To create an attribute validator, you must implement the tfsdk.AttributeValidator interface. For example:

type stringLengthBetweenValidator struct {
    Max int
    Min int
}

// Description returns a plain text description of the validator's behavior, suitable for a practitioner to understand its impact.
func (v stringLengthBetweenValidator) Description(ctx context.Context) string {
    return fmt.Sprintf("string length must be between %d and %d", v.Min, v.Max)
}

// MarkdownDescription returns a markdown formatted description of the validator's behavior, suitable for a practitioner to understand its impact.
func (v stringLengthBetweenValidator) MarkdownDescription(ctx context.Context) string {
    return fmt.Sprintf("string length must be between `%d` and `%d`", v.Min, v.Max)
}

// Validate runs the main validation logic of the validator, reading configuration data out of `req` and updating `resp` with diagnostics.
func (v stringLengthBetweenValidator) Validate(ctx context.Context, req tfsdk.ValidateAttributeRequest, resp *tfsdk.ValidateAttributeResponse) {
    // types.String must be the attr.Value produced by the attr.Type in the schema for this attribute
    // for generic validators, use
    // https://pkg.go.dev/github.com/hashicorp/terraform-plugin-framework/tfsdk#ConvertValue
    // to convert into a known type.
    var str types.String
    diags := tfsdk.ValueAs(ctx, req.AttributeConfig, &str)
    resp.Diagnostics.Append(diags...)
    if diags.HasError() {
        return
    }

    if str.Unknown || str.Null {
        return
    }

    strLen := len(str.Value)

    if strLen < v.Min || strLen > v.Max {
        resp.Diagnostics.AddAttributeError(
            req.AttributePath,
            "Invalid String Length",
            fmt.Sprintf("String length must be between %d and %d, got: %d.", v.Min, v.Max, strLen),
        )

        return
    }
}
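
You can also add a compile-time assertion to verify that the type satisfies the tfsdk.AttributeValidator interface, as the path-based example later on this page does:

// Ensure our implementation satisfies the tfsdk.AttributeValidator interface.
var _ tfsdk.AttributeValidator = stringLengthBetweenValidator{}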

Optionally, and depending on the complexity, you may also want to create a helper function to instantiate the validator. For example:

func stringLengthBetween(minLength int, maxLength int) stringLengthBetweenValidator {
    return stringLengthBetweenValidator{
        Max: maxLength,
        Min: minLength,
    }
}
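
The helper can then be used in an attribute definition in the same way as the earlier examples, for instance:

// Typically within the tfsdk.Schema returned by GetSchema() for a provider,
// resource, or data source.
tfsdk.Attribute{
    // ... other Attribute configuration ...

    Validators: []tfsdk.AttributeValidator{
        stringLengthBetween(10, 256),
    },
}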

»Path Based Attribute Validators

Attribute validators that need to reference other attribute data should accept path expressions rather than paths. This allows consumers to use either absolute paths starting at the root of a schema, or relative paths based on the current attribute path where the validator is called.
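
For example, using the framework path package, an absolute expression and an equivalent relative expression referring to a sibling attribute might look like this sketch (the attribute name is illustrative):

// Absolute expression, starting at the root of the schema.
path.MatchRoot("other_attribute")

// Relative expression, starting at the current attribute and stepping up to
// a sibling attribute named "other_attribute".
path.MatchRelative().AtParent().AtName("other_attribute")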

Path expressions may represent one or more actual paths in the data. The process of finding those paths is called path matching. Depending on the actual data, a path match may return a parent path for null or unknown values, since any underlying paths within a null or unknown value would represent the same value. The framework behaves this way so that null or unknown parent values do not silently result in no matched paths.

The general structure for working with path expressions in an attribute validator is:

  • Merge the given path expression(s) with the current attribute path expression, such as calling the tfsdk.ValidateAttributeRequest type AttributePathExpression field MergeExpressions() method.
  • Loop through each merged path expression to get the matching paths within the data, such as calling the tfsdk.ValidateAttributeRequest type Config field PathMatches() method.
  • Loop through each matched path to get the generic attr.Value value, such as calling the tfsdk.ValidateAttributeRequest type Config field GetAttribute() method.
  • Perform null and unknown value checks on the attr.Value, such as the IsNull() method and IsUnknown() method.
  • If the attr.Value is not null and not unknown, then use tfsdk.ValueAs() using the expected value implementation as the target.

The following example shows a generic path based attribute validator that returns an error if types.Int64 values at the given path expressions are less than the current attribute types.Int64 value.

// Ensure our implementation satisfies the tfsdk.AttributeValidator interface.
var _ tfsdk.AttributeValidator = &int64IsGreaterThanValidator{}

// int64IsGreaterThanValidator is the underlying type implementing Int64IsGreaterThan.
type int64IsGreaterThanValidator struct {
    expressions path.Expressions
}

// Description returns a plaintext string describing the validator.
func (v int64IsGreaterThanValidator) Description(_ context.Context) string {
    return fmt.Sprintf("If configured, must be greater than %s attributes", v.expressions)
}

// MarkdownDescription returns a Markdown formatted string describing the validator.
func (v int64IsGreaterThanValidator) MarkdownDescription(ctx context.Context) string {
    return v.Description(ctx)
}

// Validate performs the validation logic for the validator.
func (v int64IsGreaterThanValidator) Validate(ctx context.Context, req tfsdk.ValidateAttributeRequest, resp *tfsdk.ValidateAttributeResponse) {
    // If the current attribute configuration is null or unknown, there
    // cannot be any value comparisons, so exit early without error.
    if req.AttributeConfig.IsNull() || req.AttributeConfig.IsUnknown() {
        return
    }

    // Convert the current attribute configuration to our expected attr.Value
    // implementation. This will raise an error if the validator is not put
    // on an attribute that is types.Int64Type.
    var attributeConfig types.Int64

    diags := tfsdk.ValueAs(ctx, req.AttributeConfig, &attributeConfig)

    resp.Diagnostics.Append(diags...)

    // If the current attribute configuration is not valid, there cannot be
    // any value comparisons, so exit early.
    if resp.Diagnostics.HasError() {
        return
    }

    // Combine the given path expressions with the current attribute path
    // expression. This call automatically handles relative and absolute
    // expressions.
    expressions := req.AttributePathExpression.MergeExpressions(v.expressions...)

    // For each expression, find matching paths.
    for _, expression := range expressions {
        // Find paths matching the expression in the configuration data.
        matchedPaths, diags := req.Config.PathMatches(ctx, expression)

        resp.Diagnostics.Append(diags...)

        // Collect all errors
        if diags.HasError() {
            continue
        }

        // For each matched path, get the data and compare.
        for _, matchedPath := range matchedPaths {
            // Fetch the generic attr.Value at the given path. This ensures any
            // potential parent value of a different type, which can be a null
            // or unknown value, can be safely checked without raising a type
            // conversion error.
            var matchedPathValue attr.Value

            diags := req.Config.GetAttribute(ctx, matchedPath, &matchedPathValue)

            resp.Diagnostics.Append(diags...)

            // Collect all errors
            if diags.HasError() {
                continue
            }

            // If the matched path value is null or unknown, we cannot compare
            // values, so continue to other matched paths.
            if matchedPathValue.IsNull() || matchedPathValue.IsUnknown() {
                continue
            }

            // Now that we know the matched path value is not null or unknown,
            // it is safe to attempt converting it to the intended attr.Value
            // implementation, in this case a types.Int64 value.
            var matchedPathConfig types.Int64

            diags = tfsdk.ValueAs(ctx, matchedPathValue, &matchedPathConfig)

            resp.Diagnostics.Append(diags...)

            // If the matched path value was not able to be converted from
            // attr.Value to the intended types.Int64 implementation, it most
            // likely means that the path expression was not pointing at a
            // types.Int64Type attribute. Collect the error and continue to
            // other matched paths.
            if diags.HasError() {
                continue
            }

            if matchedPathConfig.Value >= attributeConfig.Value {
                resp.Diagnostics.AddAttributeError(
                    matchedPath,
                    "Invalid Attribute Value",
                    fmt.Sprintf("Must be less than %s value: %d", req.AttributePath, attributeConfig.Value),
                )
            }
        }
    }
}

// Int64IsGreaterThan checks that any Int64 values in the paths described by the
// path.Expression are less than the current attribute value.
func Int64IsGreaterThan(expressions ...path.Expression) tfsdk.AttributeValidator {
    return &int64IsGreaterThanValidator{
        expressions: expressions,
    }
}
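
A sketch of wiring this validator into a schema, assuming hypothetical maximum_items and minimum_items attributes, might look like:

// Typically within the tfsdk.Schema returned by GetSchema() for a provider,
// resource, or data source. Attribute names are illustrative.
"maximum_items": {
    Type:     types.Int64Type,
    Optional: true,

    Validators: []tfsdk.AttributeValidator{
        // maximum_items must be greater than minimum_items.
        Int64IsGreaterThan(path.MatchRoot("minimum_items")),
    },
},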

»Type Validation

You may want to create a custom type to simplify schemas if your provider contains common attribute values with consistent validation rules. When you implement validation on a type, you do not need to declare the same validation on the attribute, but you can supply additional validations in that manner. For example:

// Typically within the tfsdk.Schema returned by GetSchema() for a provider,
// resource, or data source.
tfsdk.Attribute{
    // ... other Attribute configuration ...

    // This is an example type which implements its own validation
    Type: computeInstanceIdentifierType,

    // This is optional, example validation that is checked in addition
    // to any validation performed by the type
    Validators: []tfsdk.AttributeValidator{
        stringvalidator.LengthBetween(10, 256),
    },
}

»Defining Type Validation

To support validation within a type, you must implement the xattr.TypeWithValidate interface. For example:

// Ensure type satisfies xattr.TypeWithValidate interface
var _ xattr.TypeWithValidate = computeInstanceIdentifierType{}

// Other methods to implement the attr.Type interface are omitted for brevity
type computeInstanceIdentifierType struct {}

func (t computeInstanceIdentifierType) Validate(ctx context.Context, tfValue tftypes.Value, path path.Path) diag.Diagnostics {
    var diags diag.Diagnostics

    if !tfValue.Type().Equal(tftypes.String) {
        diags.AddAttributeError(
            path,
            "Compute Instance Type Validation Error",
            fmt.Sprintf("Expected String value, received %T with value: %v", in, in),
        )
        return diags
    }

    if !tfValue.IsKnown() || tfValue.IsNull() {
        return diags
    }

    var value string
    err := tfValue.As(&value)

    if err != nil {
        diags.AddAttributeError(
            path,
            "Compute Instance Type Validation Error",
            fmt.Sprintf("Cannot convert value to string: %s", err),
        )
        return diags
    }

    if !strings.HasPrefix(value, "instance-") {
        diags.AddAttributeError(
            path,
            "Compute Instance Type Validation Error",
            fmt.Sprintf("Missing `instance-` prefix, got: %s", value),
        )
        return diags
    }

    return diags
}

»Schema Validation

Provider, resource, and data source schemas also support validation across all attributes. This is helpful when checking values in multiple attributes, such as ensuring the values are compatible with each other.

»Creating Provider Schema Validation

The framework performs provider validation in addition to attribute and type validation. You can implement either or both of the following interfaces.

The provider.ProviderWithConfigValidators interface follows a similar pattern to attribute validation and allows for a more declarative approach. This lets you write consistent validators across multiple providers. You must implement the provider.ConfigValidator interface for each validator. For example:

// Other methods to implement the provider.Provider interface are omitted for brevity
type exampleProvider struct {}

func (p exampleProvider) ConfigValidators(ctx context.Context) []provider.ConfigValidator {
    return []provider.ConfigValidator{
        /* ... */
    }
}
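
The terraform-plugin-framework-validators module also offers ready-made config validators. Assuming its providervalidator package and a Conflicting validator are available, a declarative sketch might look like:

// A sketch assuming the providervalidator package from
// terraform-plugin-framework-validators; attribute names are illustrative.
func (p exampleProvider) ConfigValidators(ctx context.Context) []provider.ConfigValidator {
    return []provider.ConfigValidator{
        // Ensure "api_token" and "username" are not both configured.
        providervalidator.Conflicting(
            path.MatchRoot("api_token"),
            path.MatchRoot("username"),
        ),
    }
}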

The provider.ProviderWithValidateConfig interface is more imperative in design and is useful for validating unique functionality that typically applies to a single provider. For example:

// Other methods to implement the provider.Provider interface are omitted for brevity
type exampleProvider struct {}

func (p exampleProvider) ValidateConfig(ctx context.Context, req provider.ValidateConfigRequest, resp *provider.ValidateConfigResponse) {
    // Retrieve values via req.Config.Get() or req.Config.GetAttribute(),
    // then return any warnings or errors via resp.Diagnostics.
}
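
Expanding the example above, a sketch of an imperative validation that reads the configuration into a model and returns a warning based on a hypothetical insecure attribute might be:

// A sketch with hypothetical provider attributes "endpoint" and "insecure".
type exampleProviderData struct {
    Endpoint types.String `tfsdk:"endpoint"`
    Insecure types.Bool   `tfsdk:"insecure"`
}

func (p exampleProvider) ValidateConfig(ctx context.Context, req provider.ValidateConfigRequest, resp *provider.ValidateConfigResponse) {
    var data exampleProviderData

    resp.Diagnostics.Append(req.Config.Get(ctx, &data)...)

    if resp.Diagnostics.HasError() {
        return
    }

    // Return a warning if the hypothetical insecure attribute is enabled.
    if !data.Insecure.Null && !data.Insecure.Unknown && data.Insecure.Value {
        resp.Diagnostics.AddAttributeWarning(
            path.Root("insecure"),
            "Insecure Connection Enabled",
            "Certificate verification is disabled; this is not recommended for production use.",
        )
    }
}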

»Creating Resource Schema Validation

The framework performs resource schema validation in addition to any attribute and type validation. You can implement either or both of the following interfaces.

The resource.ResourceWithConfigValidators interface follows a similar pattern to attribute validation and allows for a more declarative approach. This lets you create consistent validators across multiple resources. You must implement the resource.ConfigValidator interface for each validator. For example:

// Other methods to implement the resource.Resource interface are omitted for brevity
type exampleResource struct {}

func (r exampleResource) ConfigValidators(ctx context.Context) []resource.ConfigValidator {
    return []resource.ConfigValidator{
        /* ... */
    }
}

The resource.ResourceWithValidateConfig interface is more imperative in design and is useful for validating unique functionality that typically applies to a single resource. For example:

// Other methods to implement the resource.Resource interface are omitted for brevity
type exampleResource struct {}

func (r exampleResource) ValidateConfig(ctx context.Context, req resource.ValidateConfigRequest, resp *resource.ValidateConfigResponse) {
    // Retrieve values via req.Config.Get() or req.Config.GetAttribute(),
    // then return any warnings or errors via resp.Diagnostics.
}

»Creating Data Source Schema Validation

The framework performs data source schema validation in addition to any attribute and type validation. You can implement either or both of the following interfaces.

The datasource.DataSourceWithConfigValidators interface follows a similar pattern to attribute validation and allows for a more declarative approach. This lets you write consistent validators across multiple data sources. You must implement the datasource.ConfigValidator interface for each validator. For example:

// Other methods to implement the datasource.DataSource interface are omitted for brevity
type exampleDataSource struct {}

func (d exampleDataSource) ConfigValidators(ctx context.Context) []datasource.ConfigValidator {
    return []datasource.ConfigValidator{
        /* ... */
    }
}

The datasource.DataSourceWithValidateConfig interface is more imperative in design and is useful for validating unique functionality that typically applies to a single data source. For example:

// Other methods to implement the datasource.DataSource interface are omitted for brevity
type exampleDataSource struct {}

func (d exampleDataSource) ValidateConfig(ctx context.Context, req datasource.ValidateConfigRequest, resp *datasource.ValidateConfigResponse) {
    // Retrieve values via req.Config.Get() or req.Config.GetAttribute(),
    // then return any warnings or errors via resp.Diagnostics.
}