hclwrite: add support for tuple, nested object and function call #502

Merged
apparentlymart merged 1 commit into hashicorp:main on Jan 4, 2022

Conversation

incubator4
Contributor

This adds several helper functions for use with SetAttributeRaw(), providing ways to turn traversals into Tokens.

TokensForListTraversal adds support for building a list of traversals, as mentioned in #347. For example:

// Assuming the proposed TokensForListTraversal is exported from hclwrite:
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
)

func main() {
	file := hclwrite.NewEmptyFile()
	body := file.Body()
	block := body.AppendNewBlock("resource", []string{"foo"})
	tokens := hclwrite.TokensForListTraversal([]hcl.Traversal{
		{
			hcl.TraverseRoot{Name: "root"},
			hcl.TraverseAttr{Name: "attr"},
		},
		{
			hcl.TraverseRoot{Name: "second"},
			hcl.TraverseAttr{Name: "attr"},
		},
	})
	block.Body().SetAttributeRaw("bar", tokens)
	fmt.Printf("%s", file.Bytes())
}

which produces:

resource "foo" {
  bar = [root.attr,second.attr]
}

TokensForListTokens handles mixing traversals and cty values: each traversal or cty value is converted into Tokens first, and then a list is built from those Tokens. For example:

// Assuming the proposed TokensForListTokens is exported from hclwrite:
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
	"github.com/zclconf/go-cty/cty"
)

func main() {
	file := hclwrite.NewEmptyFile()
	body := file.Body()
	block := body.AppendNewBlock("resource", []string{"foo"})
	tokens := hclwrite.TokensForListTokens([]hclwrite.Tokens{
		hclwrite.TokensForTraversal(hcl.Traversal{
			hcl.TraverseRoot{Name: "root"},
			hcl.TraverseAttr{Name: "attr"},
		}),
		hclwrite.TokensForValue(cty.StringVal("baz")),
	})
	block.Body().SetAttributeRaw("bar", tokens)
	fmt.Printf("%s", file.Bytes())
}

which produces:

resource "foo" {
  bar = [root.attr,"baz"]
}

TokensForObject builds a nested structure from any values, including cty values, traversals, lists of traversals, or even another object. For example:

// Assuming the proposed TokensForObject is exported from hclwrite:
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
	"github.com/zclconf/go-cty/cty"
)

func main() {
	file := hclwrite.NewEmptyFile()
	body := file.Body()
	block := body.AppendNewBlock("resource", []string{"foo"})
	tokens := hclwrite.TokensForObject(map[string]hclwrite.Tokens{
		"x": hclwrite.TokensForTraversal(hcl.Traversal{
			hcl.TraverseRoot{Name: "root"},
			hcl.TraverseAttr{Name: "attr"},
		}),
		"y": hclwrite.TokensForValue(cty.StringVal("abc")),
		"z": hclwrite.TokensForListTraversal([]hcl.Traversal{
			{
				hcl.TraverseRoot{Name: "env"},
				hcl.TraverseAttr{Name: "PATH"},
			},
		}),
	})
	block.Body().SetAttributeRaw("bar", tokens)
	fmt.Printf("%s", file.Bytes())
}

which produces:

resource "foo" {
  bar = {
    x = root.attr
    y = "abc"
    z = [env.PATH]
  }
}

@hashicorp-cla

hashicorp-cla commented Dec 22, 2021

CLA assistant check
All committers have signed the CLA.

incubator4 changed the title from "hclwrite add support for list of traversals and object" to "hclwrite: add support for list of traversals and nested object" on Dec 22, 2021
@apparentlymart
Contributor

apparentlymart commented Dec 22, 2021

Hi @incubator4! Thanks for working on this.

It does seem reasonable to me to expand the set of functions we have to help construct common patterns of tokens. An interesting thing about tuple constructor and object constructor expressions (which is what we call the [ ... ] and { ... } syntaxes in the spec and elsewhere in the implementation) is that the items inside can, in principle, be any valid expression tokens. So I like the design of your TokensForListTokens and TokensForObject, where the elements are given as Tokens values themselves, making the functions fully general.

With those in place, I think TokensForListTraversal feels redundant, given that you can get the same result by passing TokensForTraversal results into TokensForListTokens anyway.

Given that, I'd like to simplify this a little and also refine some of the terminology to match how we talk about these constructs in the relevant section of the native syntax spec:

  • func TokensForTuple(elems []Tokens) Tokens
  • func TokensForObject(attrs []ObjectAttrTokens) Tokens
  • type ObjectAttrTokens struct {
        Name  Tokens
        Value Tokens
    }
  • func TokensForIdentifier(name string) Tokens

With the above signatures I think this can cover anything that could be achieved using the Static List and Static Map operations, which calling applications can use in order to inspect the static syntax directly rather than producing a real value.
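
For illustration, here is a minimal sketch of how those constructors could combine to reproduce the list-of-traversals example from the original writeup (assuming the functions are exported from hclwrite with the signatures above):

package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclwrite"
)

func main() {
	file := hclwrite.NewEmptyFile()
	block := file.Body().AppendNewBlock("resource", []string{"foo"})

	// Each element is built as its own Tokens value and then wrapped
	// in a tuple constructor.
	tokens := hclwrite.TokensForTuple([]hclwrite.Tokens{
		hclwrite.TokensForTraversal(hcl.Traversal{
			hcl.TraverseRoot{Name: "root"},
			hcl.TraverseAttr{Name: "attr"},
		}),
		hclwrite.TokensForTraversal(hcl.Traversal{
			hcl.TraverseRoot{Name: "second"},
			hcl.TraverseAttr{Name: "attr"},
		}),
	})
	block.Body().SetAttributeRaw("bar", tokens)

	// Should print roughly: bar = [root.attr, second.attr]
	fmt.Printf("%s", file.Bytes())
}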

Terraform's providers argument inside module blocks uses the "Static Map" parsing mode to allow a mapping from one provider configuration traversal to another, avoiding the usual requirement that map "keys" be strings. The generalized TokensForObject API I described above therefore allows generating that syntax:

// Might generate the following, assuming particular values for
// callerProviderConfig and calleeProviderConfig:
//
//    providers = {
//      aws.foo = aws.bar
//    }
body.SetAttributeRaw("providers", hclwrite.TokensForObject([]hclwrite.ObjectAttrTokens{
    {
        Name:  hclwrite.TokensForTraversal(callerProviderConfig),
        Value: hclwrite.TokensForTraversal(calleeProviderConfig),
    },
}))

I added TokensForIdentifier into the mix above as just a shorthand for using TokensForTraversal with a single-step traversal. That's symmetrical with the corresponding decoding API, where we have the more general AbsTraversalForExpr and also the specific ExprAsKeyword as an easier API for the common case of only a single identifier. It's an optional thing we could add later, so it's not super urgent to include now, but I think it could help with populating the Name field of ObjectAttrTokens in simpler cases like the one you showed in your original writeup, where the keys are all just simple identifiers.
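
For example, a hedged sketch of that simpler case, mirroring the object from the original writeup (body is assumed to already exist, as in the providers example above):

// Might generate:
//
//    bar = {
//      x = root.attr
//      y = "abc"
//    }
body.SetAttributeRaw("bar", hclwrite.TokensForObject([]hclwrite.ObjectAttrTokens{
    {
        Name: hclwrite.TokensForIdentifier("x"),
        Value: hclwrite.TokensForTraversal(hcl.Traversal{
            hcl.TraverseRoot{Name: "root"},
            hcl.TraverseAttr{Name: "attr"},
        }),
    },
    {
        Name:  hclwrite.TokensForIdentifier("y"),
        Value: hclwrite.TokensForValue(cty.StringVal("abc")),
    },
}))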

(The object vs. map terminology and identifier vs. keyword terminology is a bit funny here in the spec because there's a blend of syntax-specific terminology vs. the syntax-agnostic infoset terms which the native syntax spec is refining. That inconsistency unfortunately does end up showing up in the API here too, but I think it's better for the API to be consistent with the spec even if the spec itself is a little inconsistent in its terminology. hclsyntax is specifically implementing the Native Syntax specification, so I'd like it to follow that specification's terminology in particular.)

I hope all of that seems reasonable! Aside from the API feedback, my only other main request is to make sure that the tokens generated for tuple and object constructors follow the same newline-inclusion conventions as the corresponding behavior for tuples and objects in TokensForValue, so that calling TokensForValue with a tuple value is exactly equivalent to calling TokensForTuple with a slice of TokensForValue calls representing the tuple elements. I think that's already true in what you implemented, so this isn't a critique; I just want to be clear about the design intent, and maybe we could include a test for that specifically to make sure it stays true under future maintenance.
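
For instance, a sketch of such a test, assuming the equivalence holds byte-for-byte as described (the element values here are arbitrary):

package hclwrite_test

import (
	"bytes"
	"testing"

	"github.com/hashicorp/hcl/v2/hclwrite"
	"github.com/zclconf/go-cty/cty"
)

// Building a tuple via TokensForValue should serialize the same way as
// building it via TokensForTuple over per-element TokensForValue calls.
func TestTupleTokensMatchValueTokens(t *testing.T) {
	viaValue := hclwrite.TokensForValue(cty.TupleVal([]cty.Value{
		cty.StringVal("a"),
		cty.NumberIntVal(1),
	}))
	viaTuple := hclwrite.TokensForTuple([]hclwrite.Tokens{
		hclwrite.TokensForValue(cty.StringVal("a")),
		hclwrite.TokensForValue(cty.NumberIntVal(1)),
	})

	if !bytes.Equal(viaValue.Bytes(), viaTuple.Bytes()) {
		t.Errorf("tuple tokens don't match:\nvia value: %s\nvia tuple: %s",
			viaValue.Bytes(), viaTuple.Bytes())
	}
}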

Thanks again!


If you feel adventurous, it might be nice to complete the set of static-analysis-supporting constructs (which are the situations where TokensForValue is least likely to be appropriate) by adding one more pair of functions:

  • func TokensForFunctionCall(name string, args ...Tokens) Tokens
  • func TokensForFunctionCallExpandArg(arg Tokens) Tokens

I'm imagining TokensForFunctionCallExpandArg essentially just adding the ellipsis token ... to the end of the given tokens, to create the optional "expansion" syntax; in the normal case it'd be fine to just pass expression-generating tokens directly as the args. That allows reproducing the input variable type constraint syntax that Terraform uses, which is based on function calls and identifiers:

// Should generate:
//     type = list(string)
body.SetAttributeRaw("type", hclwrite.TokensForFunctionCall(
    "list",
    hclwrite.TokensForIdentifier("string"),
))

I don't mean to suggest that you must add this in order for this PR to be accepted, since of course we can always add new functionality like this later, but I wanted to mention it because it seems like you have an underlying Terraform-specific use case, and so maybe it would help you to have support for the full repertoire of HCL static analysis features that the Terraform language relies on.


One day I would like to also have a similar set of functions in the family that currently includes NewExpressionLiteral and NewExpressionAbsTraversal, providing similar combinators for constructing tuple, object, and call syntax in terms of *Expression rather than raw Tokens. That's definitely considerably more work than just the token-generation functions, so best to save it for another PR, I think. I mention it only because it was part of my thinking in proposing to keep only the general combinators and exclude TokensForListTraversal: that way we minimize the number of functions that would need to be symmetrical across both the TokensFor... and NewExpression... constructor families.

@incubator4
Contributor Author

@apparentlymart I finished most of the changes, except TokensForFunctionCallExpandArg, which I think I may have misunderstood, so I left it out for now.

incubator4 changed the title from "hclwrite: add support for list of traversals and nested object" to "hclwrite: add support for tuple, nested object and function call" on Dec 23, 2021
The "TokensFor..." family of functions all aim to construct valid
raw token sequences representing particular syntax constructs.

Previously we had only "leaf" functions TokensForValue and
TokensForTraversal, but nothing to help with constructing compound
structures.

Here we add TokensForTuple, TokensForObject, and
TokensForFunctionCall which together cover all of the constructs
that HCL allows static analysis of, and thus constructs where it's
likely that someone would want to generate an expression that
is interpreted purely by its syntax and not resolved into a value.

What all of these have in common is that they take other Tokens
values as arguments and include them verbatim as part of their
result, with the caller being responsible for making sure these
smaller units are themselves valid expression tokens.

This also adds TokensForIdentifier as a convenient shorthand for
generating single-identifier tokens, which is particularly useful
for populating the attribute names in TokensForObject.
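
A rough sketch exercising the constructors described above together, assuming the signatures agreed in the review discussion:

package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2/hclwrite"
	"github.com/zclconf/go-cty/cty"
)

func main() {
	file := hclwrite.NewEmptyFile()
	body := file.Body()

	// Tuple constructor: each element is an arbitrary Tokens value.
	body.SetAttributeRaw("bar", hclwrite.TokensForTuple([]hclwrite.Tokens{
		hclwrite.TokensForValue(cty.StringVal("a")),
		hclwrite.TokensForIdentifier("b"),
	}))

	// Object constructor: both attribute names and values are Tokens.
	body.SetAttributeRaw("obj", hclwrite.TokensForObject([]hclwrite.ObjectAttrTokens{
		{
			Name:  hclwrite.TokensForIdentifier("x"),
			Value: hclwrite.TokensForValue(cty.NumberIntVal(1)),
		},
	}))

	// Function call with expression tokens as its arguments.
	body.SetAttributeRaw("type", hclwrite.TokensForFunctionCall(
		"list",
		hclwrite.TokensForIdentifier("string"),
	))

	fmt.Printf("%s", file.Bytes())
}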
Contributor

@apparentlymart apparentlymart left a comment


Thanks for working on this, @incubator4!

I made some small amendments to your commits to add a few more test cases and some longer docstring comments to the new functions, just because it seemed more efficient than doing another code review round-trip, but this looks great otherwise, and I'm going to merge it now for inclusion in the next HCL release.

(We don't really have a regular release schedule for HCL, so I can't promise it'll be tagged at any particular time in the future, but we typically create a new tag after there are a few new features accumulated, or when there's an important bug to fix. Until then, you should be able to use HCL from the main branch to get this new functionality immediately.)

@apparentlymart apparentlymart merged commit ff690b9 into hashicorp:main Jan 4, 2022