Documentation for linking Stacks #36408

Open · wants to merge 2 commits into main
3 changes: 2 additions & 1 deletion website/data/language-nav-data.json
@@ -33,7 +33,8 @@
"routes": [
{ "title": "Define configuration", "path": "stacks/deploy/config" },
{ "title": "Set conditions for deployment plans", "path": "stacks/deploy/conditions" },
{ "title": "Authenticate a Stack", "path": "stacks/deploy/authenticate" }
{ "title": "Authenticate a Stack", "path": "stacks/deploy/authenticate" },
{ "title": "Pass data from one Stack to another", "path": "stacks/deploy/link-stacks" }
]
},
{
114 changes: 114 additions & 0 deletions website/docs/language/stacks/deploy/link-stacks.mdx
@@ -0,0 +1,114 @@
---
page_title: Pass data from one Stack to another
description: Learn how to pass data from one Stack to another using `publish_output` blocks to output data from one Stack, and `upstream_input` blocks to input that data into another Stack.
---

# Pass data from one Stack to another

If you have multiple Stacks that do not share a provisioning lifecycle, you may still need to pass information between them. To export data from one Stack to another, use a `publish_output` block to output data from one Stack, and use an `upstream_input` block in another Stack to consume that output.

If a Stack's output values change after a run, HCP Terraform automatically triggers runs for any Stacks that depend on those outputs.

## Background

To output information from a Stack, declare a `publish_output` block in the deployment configuration of the Stack exporting data. We refer to the Stack that declares a `publish_output` block as the upstream Stack.

To use another Stack's output, declare an `upstream_input` block in the deployment configuration of a different Stack in the same project. We refer to the Stack that declares an `upstream_input` block as the downstream Stack. For example, if Stack A produces outputs that Stack B depends on, Stack A is the upstream Stack, and Stack B is the downstream Stack.

For example, you could have a Stack for shared services, such as networking infrastructure, and a separate Stack for application components. This separation lets you manage each Stack independently: you can export data from your networking Stack with the `publish_output` block and consume that data in your application Stack with the `upstream_input` block.
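
For a compact picture of this relationship, the following is a minimal sketch of the two halves in their respective deployment configurations; the Stack, deployment, and output names are illustrative and match the examples later on this page:

<CodeBlockConfig hideClipboard>

```hcl
# Upstream (networking) Stack deployment configuration
publish_output "vpc_id" {
  value = deployment.network.vpc_id
}

# Downstream (application) Stack deployment configuration
upstream_input "network_stack" {
  type   = "stack"
  source = "app.terraform.io/hashicorp/Default Project/networking-stack"
}

deployment "application" {
  inputs = {
    vpc_id = upstream_input.network_stack.vpc_id
  }
}
```

</CodeBlockConfig>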

## Requirements

The `publish_output` and `upstream_input` blocks require Terraform version `terraform_1.10.0-alpha20241009` or higher. We recommend downloading the [latest version of Terraform](https://releases.hashicorp.com/terraform/) to use the most up-to-date functionality.

Downstream Stacks must also reside in the same project as their upstream Stacks.

## Declare outputs

You must declare a `publish_output` block in your deployment configuration for each value you want to output from your current Stack.

Once you apply a Stack configuration version that includes your `publish_output` block, HCP Terraform publishes a snapshot of those values, which allows HCP Terraform to resolve them. This means you must apply your Stack's deployment configuration before any downstream Stacks can reference its outputs.

For example, you can add a `publish_output` block for the `vpc_id` in your upstream Stack’s deployment configuration.

<CodeBlockConfig filename="network.tfdeploy.hcl">

```hcl
# Networking Stack deployment configuration

publish_output "vpc_id" {
description = "The networking Stack's VPC's ID."
# You can directly reference a deployment's values with the
# deployment.deployment_name syntax
value = deployment.network.vpc_id
}
```

</CodeBlockConfig>

After applying this configuration, any Stack in the same project can now reference this `vpc_id` output by declaring an `upstream_input` block. Learn more about the [`publish_output` block](/terraform/language/stacks/reference/tfdeploy#publish_output-block-configuration).

## Use an upstream Stack’s inputs

Declare an `upstream_input` block in your Stack’s deployment configuration to read values from another Stack's `publish_output` block. Adding an `upstream_input` block creates a dependency on the upstream Stack.

For example, if you want to use the output `vpc_id` from an upstream Stack in the same project, declare an `upstream_input` block in your deployment configuration.

<CodeBlockConfig filename="application.tfdeploy.hcl">

```hcl
# Application Stack deployment configuration

upstream_input "networking_stack" {
type = "Stack"
source = "app.terraform.io/hashicorp/Default Project/networking-stack"
}

deployment "application" {
inputs = {
# This Stack depends on the networking Stack for this value
vpc_id = upstream_input.network_stack.vpc_id
}
}
```

</CodeBlockConfig>

After you push your Stack's configuration to HCP Terraform, HCP Terraform searches for the most recently published snapshot of the upstream Stack your configuration references. If no snapshot exists, the downstream Stack's run fails.

If HCP Terraform finds a published snapshot for your referenced upstream Stack, then all of that Stack's outputs are available to this downstream Stack. Add `upstream_input` blocks for every upstream Stack you want to reference. Learn more about the [`upstream_input` block](/terraform/language/stacks/reference/tfdeploy#upstream_input-block-configuration).
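
For instance, a downstream Stack that reads values from two upstream Stacks declares one `upstream_input` block per Stack. The following is a minimal sketch, assuming a second, hypothetical `dns-stack` upstream Stack that publishes a `zone_id` output:

<CodeBlockConfig hideClipboard>

```hcl
# Application Stack deployment configuration

upstream_input "network_stack" {
  type   = "stack"
  source = "app.terraform.io/hashicorp/Default Project/networking-stack"
}

upstream_input "dns_stack" {
  type   = "stack"
  source = "app.terraform.io/hashicorp/Default Project/dns-stack"
}

deployment "application" {
  inputs = {
    # Each value comes from the upstream Stack that published it.
    vpc_id  = upstream_input.network_stack.vpc_id
    zone_id = upstream_input.dns_stack.zone_id
  }
}
```

</CodeBlockConfig>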

To stop depending on an upstream Stack’s outputs, do the following in your downstream Stack's deployment configuration (a sketch of the resulting configuration follows this list):

- Remove the upstream Stack's `upstream_input` block
- Remove any references to the upstream Stack's outputs
- Push your configuration changes to HCP Terraform and apply the new configuration
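
A minimal sketch of what the downstream configuration might look like after removing the dependency, assuming you replace the upstream value with a hard-coded placeholder (the `vpc_id` value shown here is hypothetical):

<CodeBlockConfig hideClipboard>

```hcl
# Application Stack deployment configuration, after removing the
# upstream_input "network_stack" block and any references to it.

deployment "application" {
  inputs = {
    # Hypothetical replacement value; this previously read
    # upstream_input.network_stack.vpc_id.
    vpc_id = "vpc-0a1b2c3d4e5f67890"
  }
}
```

</CodeBlockConfig>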

## Trigger runs when output values change

If an upstream Stack's published output values change, HCP Terraform automatically triggers runs for any downstream Stacks that rely on those outputs.

For example, if your upstream networking Stack’s output changes, HCP Terraform triggers a new plan for the downstream Stacks that reference that output.

<CodeBlockConfig filename="application.tfdeploy.hcl">

```hcl
# Application Stack deployment configuration

upstream_input "network_stack" {
type = "Stack"
source = "app.terraform.io/hashicorp/Default Project/networking-stack"
}

deployment "application" {
inputs = {
# This Stack depends on the networking Stack’s output, so if
# the vpc_id changes then HCP Terraform triggers a new run for this Stack.
vpc_id = upstream_input.network_stack.vpc_id
}
}
```

</CodeBlockConfig>

This approach allows you to decouple Stacks that don’t share a lifecycle, while also ensuring that updates in an upstream Stack ripple out to any downstream Stacks.
4 changes: 3 additions & 1 deletion website/docs/language/stacks/design.mdx
@@ -27,11 +27,13 @@ Before writing your Stack configuration, we recommend assessing your current inf

Each Stack should represent a single system or application with a shared lifecycle. Start by analyzing your current infrastructure and identifying the components HCP Terraform should manage together. Components are typically groups of related resources, such as an application’s backend, frontend, or database layer, deployed and scaled together.

We recommend structuring your Stacks along technical boundaries to keep them modular and manageable. For example, you can create a dedicated Stack for shared services, such as networking infrastructure for VPCs, subnets, or routing tables, and separate Stacks for application components that consume those shared services. This separation allows you to manage shared services independently while passing information between Stacks. For more details, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).

### Sketch out your configuration

We recommend sticking to technical boundaries when structuring a Stack configuration. A single Stack should represent a single system with a shared lifecycle.

We recommend keeping a Stack as self-contained as possible. However, there are valid cases where outputs from one Stack, like a shared VPC networking service Stack, may need to pass inputs into another Stack, like an application Stack. If there’s a well-defined interface between two parts of your infrastructure, it makes sense to model them as separate Stacks.
While keeping a Stack as self-contained as possible is ideal, there are valid cases where a Stack must consume outputs from another Stack. For example, a shared networking Stack can publish outputs like `vpc_id` or subnet IDs, which downstream application Stacks can consume as inputs. This approach ensures modularity and allows you to manage dependencies between Stacks using well-defined interfaces. For more details, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).

Plan to add a component block to your configuration for every top-level module you want to include in your Stack. After establishing your top-level modules, you can use child modules without adding additional `component` blocks.

108 changes: 108 additions & 0 deletions website/docs/language/stacks/reference/tfdeploy.mdx
@@ -312,3 +312,111 @@ You can access specific environment variables by key from the `store.varset.avai
## `locals` block configuration

A local value assigns a name to an expression, so you can use the name multiple times within your Stack configuration instead of repeating the expression. The `locals` block works exactly as it does in traditional Terraform configurations. Learn more about [the `locals` block](/terraform/language/values/locals).
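
As a brief sketch of how this looks in a deployment configuration, assuming a hypothetical `name_prefix` value and `production` deployment:

<CodeBlockConfig hideClipboard>

```hcl
# Assign a name to a repeated value so you only define it once.
locals {
  name_prefix = "acme-prod"
}

deployment "production" {
  inputs = {
    # Reuse the local value anywhere in this deployment configuration.
    prefix = local.name_prefix
  }
}
```

</CodeBlockConfig>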

## `publish_output` block configuration

-> **Note**: The `publish_output` block requires Terraform version `terraform_1.10.0-alpha20241009` or higher. We recommend downloading the [latest version of Terraform](https://releases.hashicorp.com/terraform/) to use the most up-to-date functionality.

To export information from one Stack to another, declare a `publish_output` block in your deployment configuration for the data you want another Stack to consume.

The `publish_output` block is similar to the `output` block in the Terraform configuration language, but the `publish_output` block declares values available at the Stack level. Add a `publish_output` block for each value you want to output from your current Stack.

### Complete configuration

When every field is defined, a `publish_output` block has the following form:

<CodeBlockConfig hideClipboard>

```hcl
publish_output "output_name" {
description = "Description of the purpose of this output"
value = deployment.deployment_name.some_value
}
```

</CodeBlockConfig>

### Specification

This section provides details about the fields you can configure in the `publish_output` block.

| Field | Description | Type | Required |
| :---- | :---- | :---- | :---- |
| `value` | The value to output. | any | Required |
| `description` | A human-friendly description for the output. | string | Optional |

For example, you could output the VPC ID from your networking deployment, making it available for other Stacks to consume as input.

<CodeBlockConfig filename="network.tfdeploy.hcl">

```hcl
# Network Stack's deployment configuration

publish_output "vpc_id" {
description = "The networking Stack's VPC's ID."
# Referencing a particular deployment can be helpful for sharing information
# across your Stacks.
value = deployment.network.vpc_id
}
```

</CodeBlockConfig>

This example uses the `publish_output` block to expose the `vpc_id` output from the `network` deployment of this Stack. Another Stack in the same project could then consume your `vpc_id` output by declaring an `upstream_input` block.

To learn more about passing information between Stacks, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).

## `upstream_input` block configuration

-> **Note**: The `upstream_input` block requires Terraform version `terraform_1.10.0-alpha20241009` or higher. We recommend downloading the [latest version of Terraform](https://releases.hashicorp.com/terraform/) to use the most up-to-date functionality.

To read outputs from other Stacks in your project, declare an `upstream_input` block for each Stack you want to reference. If the Stack's output values change, HCP Terraform automatically triggers runs for any Stacks that depend on those outputs.

We refer to the Stack that declares an `upstream_input` block as the downstream Stack. An upstream Stack publishes outputs using the `publish_output` block, and a downstream Stack consumes those outputs. For example, if Stack B relies on outputs from Stack A, Stack B is the downstream Stack.

### Complete configuration

When every field is defined, an `upstream_input` block has the following form:

<CodeBlockConfig hideClipboard>

```hcl
upstream_input "upstream_stack_name" {
type = "stack"
source = "app.terraform.io/{organization_name}/{project_name}/{upstream_stack_name}"
}
```

</CodeBlockConfig>

### Specification

This section provides details about the fields you can configure in the `upstream_input` block.

| Field | Description | Type | Required |
| :---- | :---- | :---- | :---- |
| `type` | The only supported type is `"stack"`. | string | Required |
| `source` | The upstream Stack’s URL, in the format: `"app.terraform.io/{organization_name}/{project_name}/{upstream_stack_name}"` | string | Required |

For example, you could input a VPC ID from an upstream Stack that manages your shared networking service. You can use the `upstream_input` block to pass information from your network Stack into your application Stack.

<CodeBlockConfig filename="application.tfdeploy.hcl">

```hcl
# Application Stack's deployment configuration

upstream_input "network_stack" {
type = "stack"
source = "app.terraform.io/hashicorp/Default Project/networking-stack"
}

deployment "application" {
inputs = {
vpc_id = upstream_input.network_stack.vpc_id
}
}
```

</CodeBlockConfig>

Your application Stack can now securely consume and use outputs from your network Stack. To learn more about passing information between Stacks, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).