feat: Add stream on view (#3150)
- add `stream_on_view` resource
- add a new method to the view and table helper clients and use it in streams on views and streams on tables
- add a new doc helper for deprecating resources
- deprecate `stream` resource

## Test Plan
* [x] acceptance tests

## References
https://docs.snowflake.com/en/sql-reference/sql/create-stream

## TODO
- rework stream data source
- add tests for detecting external change on type
sfc-gh-jmichalak authored Oct 25, 2024
1 parent 649b839 commit 494af6d
Showing 25 changed files with 1,821 additions and 15 deletions.
29 changes: 28 additions & 1 deletion MIGRATION_GUIDE.md
@@ -16,7 +16,7 @@ Currently, resources like `snowflake_view`, `snowflake_stream_on_table`, `snowfl
Starting from this version, the provider detects stale streams for `snowflake_stream_on_table`, `snowflake_stream_on_external_table` and `snowflake_stream_on_directory_table` and recreates them (optionally with `copy_grants`) to recover them. To handle this correctly, a new computed-only field `stale` has been added to these resources, indicating whether a stream is stale.
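
For illustration, a minimal sketch (argument values are placeholders) using the new `snowflake_stream_on_view` resource from this release, which exposes the same computed `stale` field; setting `copy_grants` lets the `CREATE OR REPLACE` issued on recreation keep the existing grants:

```terraform
# Hedged sketch: when the provider detects the stream as stale, it recreates it
# with CREATE OR REPLACE; copy_grants retains the original access permissions.
resource "snowflake_stream_on_view" "stream" {
  name        = "stream"
  schema      = "schema"
  database    = "database"
  copy_grants = true

  view = snowflake_view.view.fully_qualified_name
}
```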

### *(new feature)* snowflake_stream_on_directory_table resource
Continuing changes made in [v0.97](#v0960--v0970), the new resource `snowflake_stream_on_directory_table` has been introduced to replace the previous `snowflake_stream` for streams on directory tables.
Continuing changes made in [v0.97](#v0960--v0970), the new resources `snowflake_stream_on_directory_table` and `snowflake_stream_on_view` have been introduced to replace the previous `snowflake_stream` for streams on directory tables and streams on views.

To use the new `stream_on_directory_table`, change the old `stream` from
```terraform
@@ -45,6 +45,33 @@ resource "snowflake_stream_on_directory_table" "stream" {
}
```

To use the new `stream_on_view`, change the old `stream` from
```terraform
resource "snowflake_stream" "stream" {
name = "stream"
schema = "schema"
database = "database"
on_view = snowflake_view.view.fully_qualified_name
comment = "A stream."
}
```

to

```terraform
resource "snowflake_stream_on_view" "stream" {
name = "stream"
schema = "schema"
database = "database"
view = snowflake_view.view.fully_qualified_name
comment = "A stream."
}
```

Then, follow our [Resource migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md).
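
As an additional sketch (not taken from the migration guide), on Terraform 1.7 or newer the same move can be expressed declaratively with `removed` and `import` blocks; the resource addresses and the identifier below are placeholders matching the example above:

```terraform
# Hedged sketch: drop the old resource from state without destroying the stream,
# then import the existing stream into the new resource address.
removed {
  from = snowflake_stream.stream

  lifecycle {
    destroy = false
  }
}

import {
  to = snowflake_stream_on_view.stream
  id = "\"database\".\"schema\".\"stream\""
}
```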

### *(new feature)* Secret resources
1 change: 1 addition & 0 deletions docs/index.md
@@ -240,6 +240,7 @@ The Snowflake provider will use the following order of precedence when determini
- [snowflake_oauth_integration](./docs/resources/oauth_integration)
- [snowflake_role](./docs/resources/role) - use [snowflake_account_role](./docs/resources/account_role) instead
- [snowflake_saml_integration](./docs/resources/saml_integration) - use [snowflake_saml2_integration](./docs/resources/saml2_integration) instead
- [snowflake_stream](./docs/resources/stream)

## Currently deprecated datasources

2 changes: 1 addition & 1 deletion docs/resources/stream.md
@@ -7,7 +7,7 @@ description: |-

# snowflake_stream (Resource)


~> **Deprecation** This resource is deprecated and will be removed in a future major version release. Please use one of the new resources instead: `snowflake_stream_on_directory_table` | `snowflake_stream_on_external_table` | `snowflake_stream_on_table` | `snowflake_stream_on_view`

## Example Usage

154 changes: 154 additions & 0 deletions docs/resources/stream_on_view.md
@@ -0,0 +1,154 @@
---
page_title: "snowflake_stream_on_view Resource - terraform-provider-snowflake"
subcategory: ""
description: |-
Resource used to manage streams on views. For more information, check stream documentation https://docs.snowflake.com/en/sql-reference/sql/create-stream.
---

!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it.

# snowflake_stream_on_view (Resource)

Resource used to manage streams on views. For more information, check [stream documentation](https://docs.snowflake.com/en/sql-reference/sql/create-stream).

## Example Usage

```terraform
resource "snowflake_view" "view" {
database = "database"
schema = "schema"
name = "view"
statement = <<-SQL
select * from foo;
SQL
}
# basic resource
resource "snowflake_stream_on_view" "stream" {
name = "stream"
schema = "schema"
database = "database"
view = snowflake_view.view.fully_qualified_name
}
# resource with additional fields
resource "snowflake_stream_on_view" "stream" {
name = "stream"
schema = "schema"
database = "database"
copy_grants = true
view = snowflake_view.view.fully_qualified_name
append_only = "true"
show_initial_rows = "true"
at {
statement = "8e5d0ca9-005e-44e6-b858-a8f5b37c5726"
}
comment = "A stream."
}
```
-> **Note** Instead of using fully_qualified_name, you can reference objects managed outside Terraform by constructing a correct ID; consult the [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
<!-- TODO(SNOW-1634854): include an example showing both methods-->
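
As a supplementary sketch (not part of the generated examples), the `before` block from the schema below can be used analogously to `at`, anchoring the stream just before a given statement; the query ID shown is the same placeholder as in the example above:

```terraform
# Hedged sketch: return change data starting from just before the point
# at which the statement with the given query ID completed.
resource "snowflake_stream_on_view" "stream_before" {
  name     = "stream_before"
  schema   = "schema"
  database = "database"
  view     = snowflake_view.view.fully_qualified_name

  before {
    statement = "8e5d0ca9-005e-44e6-b858-a8f5b37c5726"
  }
}
```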

<!-- schema generated by tfplugindocs -->
## Schema

### Required

- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `view` (String) Specifies an identifier for the view the stream will monitor. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`

### Optional

- `append_only` (String) Specifies whether this is an append-only stream. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
- `at` (Block List, Max: 1) This field specifies that the request is inclusive of any changes made by a statement or transaction with a timestamp equal to the specified parameter. Due to Snowflake limitations, the provider does not detect external changes on this field. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--at))
- `before` (Block List, Max: 1) This field specifies that the request refers to a point immediately preceding the specified parameter. This point in time is just before the statement, identified by its query ID, is completed. Due to Snowflake limitations, the provider does not detect external changes on this field. In case you want to apply external changes, you can re-create the resource manually using "terraform taint". (see [below for nested schema](#nestedblock--before))
- `comment` (String) Specifies a comment for the stream.
- `copy_grants` (Boolean) Retains the access permissions from the original stream when a stream is recreated using the OR REPLACE clause. That is sometimes used when the provider detects changes for fields that can not be changed by ALTER. This value will not have any effect when creating a new stream.
- `show_initial_rows` (String) Specifies whether to return all existing rows in the source table as row inserts the first time the stream is consumed. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value. External changes for this field won't be detected. In case you want to apply external changes, you can re-create the resource manually using "terraform taint".

### Read-Only

- `describe_output` (List of Object) Outputs the result of `DESCRIBE STREAM` for the given stream. (see [below for nested schema](#nestedatt--describe_output))
- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution).
- `id` (String) The ID of this resource.
- `show_output` (List of Object) Outputs the result of `SHOW STREAMS` for the given stream. (see [below for nested schema](#nestedatt--show_output))
- `stale` (Boolean) Indicates whether the stream is stale. When Terraform detects that the stream is stale, the stream is recreated with `CREATE OR REPLACE`. Read more on stream staleness in Snowflake [docs](https://docs.snowflake.com/en/user-guide/streams-intro#data-retention-period-and-staleness).

<a id="nestedblock--at"></a>
### Nested Schema for `at`

Optional:

- `offset` (String) Specifies the difference in seconds from the current time to use for Time Travel, in the form -N where N can be an integer or arithmetic expression (e.g. -120 is 120 seconds, -30*60 is 1800 seconds or 30 minutes).
- `statement` (String) Specifies the query ID of a statement to use as the reference point for Time Travel. This parameter supports any statement of one of the following types: DML (e.g. INSERT, UPDATE, DELETE), TCL (BEGIN, COMMIT transaction), SELECT.
- `stream` (String) Specifies the identifier (i.e. name) for an existing stream on the queried table or view. The current offset in the stream is used as the AT point in time for returning change data for the source object.
- `timestamp` (String) Specifies an exact date and time to use for Time Travel. The value must be explicitly cast to a TIMESTAMP, TIMESTAMP_LTZ, TIMESTAMP_NTZ, or TIMESTAMP_TZ data type.


<a id="nestedblock--before"></a>
### Nested Schema for `before`

Optional:

- `offset` (String) Specifies the difference in seconds from the current time to use for Time Travel, in the form -N where N can be an integer or arithmetic expression (e.g. -120 is 120 seconds, -30*60 is 1800 seconds or 30 minutes).
- `statement` (String) Specifies the query ID of a statement to use as the reference point for Time Travel. This parameter supports any statement of one of the following types: DML (e.g. INSERT, UPDATE, DELETE), TCL (BEGIN, COMMIT transaction), SELECT.
- `stream` (String) Specifies the identifier (i.e. name) for an existing stream on the queried table or view. The current offset in the stream is used as the AT point in time for returning change data for the source object.
- `timestamp` (String) Specifies an exact date and time to use for Time Travel. The value must be explicitly cast to a TIMESTAMP, TIMESTAMP_LTZ, TIMESTAMP_NTZ, or TIMESTAMP_TZ data type.


<a id="nestedatt--describe_output"></a>
### Nested Schema for `describe_output`

Read-Only:

- `base_tables` (List of String)
- `comment` (String)
- `created_on` (String)
- `database_name` (String)
- `invalid_reason` (String)
- `mode` (String)
- `name` (String)
- `owner` (String)
- `owner_role_type` (String)
- `schema_name` (String)
- `source_type` (String)
- `stale` (Boolean)
- `stale_after` (String)
- `table_name` (String)
- `type` (String)


<a id="nestedatt--show_output"></a>
### Nested Schema for `show_output`

Read-Only:

- `base_tables` (List of String)
- `comment` (String)
- `created_on` (String)
- `database_name` (String)
- `invalid_reason` (String)
- `mode` (String)
- `name` (String)
- `owner` (String)
- `owner_role_type` (String)
- `schema_name` (String)
- `source_type` (String)
- `stale` (Boolean)
- `stale_after` (String)
- `table_name` (String)
- `type` (String)

## Import

Import is supported using the following syntax:

```shell
terraform import snowflake_stream_on_view.example '"<database_name>"."<schema_name>"."<stream_name>"'
```
1 change: 1 addition & 0 deletions examples/additional/deprecated_resources.MD
@@ -4,3 +4,4 @@
- [snowflake_oauth_integration](./docs/resources/oauth_integration)
- [snowflake_role](./docs/resources/role) - use [snowflake_account_role](./docs/resources/account_role) instead
- [snowflake_saml_integration](./docs/resources/saml_integration) - use [snowflake_saml2_integration](./docs/resources/saml2_integration) instead
- [snowflake_stream](./docs/resources/stream)
1 change: 1 addition & 0 deletions examples/resources/snowflake_stream_on_view/import.sh
@@ -0,0 +1 @@
terraform import snowflake_stream_on_view.example '"<database_name>"."<schema_name>"."<stream_name>"'
35 changes: 35 additions & 0 deletions examples/resources/snowflake_stream_on_view/resource.tf
@@ -0,0 +1,35 @@
resource "snowflake_view" "view" {
database = "database"
schema = "schema"
name = "view"
statement = <<-SQL
select * from foo;
SQL
}

# basic resource
resource "snowflake_stream_on_view" "stream" {
name = "stream"
schema = "schema"
database = "database"

view = snowflake_view.view.fully_qualified_name
}

# resource with additional fields
resource "snowflake_stream_on_view" "stream" {
name = "stream"
schema = "schema"
database = "database"

copy_grants = true
view = snowflake_view.view.fully_qualified_name
append_only = "true"
show_initial_rows = "true"

at {
statement = "8e5d0ca9-005e-44e6-b858-a8f5b37c5726"
}

comment = "A stream."
}
@@ -93,4 +93,8 @@ var allResourceSchemaDefs = []ResourceSchemaDef{
name: "StreamOnDirectoryTable",
schema: resources.StreamOnDirectoryTable().Schema,
},
{
name: "StreamOnView",
schema: resources.StreamOnView().Schema,
},
}
