
Packer 1.7: dynamic data devotion


Packer is a wonderful little secret developed by HashiCorp. At its core, it is an image builder. Of late, however, it has been the neglected stepchild of the company, overshadowed by its more glamorous siblings Terraform, Vault and Consul. In this post, we'll dive into the Data Sources feature in Packer 1.7.


For a good overview of the capabilities of Packer and where it can fit into your DevOps strategy, have a read of these earlier articles on Amazic World.

Introduction to HashiCorp Packer
Building images and VMs in Azure with Terraform

As you can see from those, it is a powerful tool for building consistent images for your Terraform-deployed infrastructure across multiple platforms, including Azure, AWS, GCP and even on-premises installations on traditional virtualization platforms like VMware vSphere.

It has limitations, and there are issues to be sure. For example, there is currently a problem with Packer-created images that have been automatically converted to catalog templates failing to deploy to vSphere when the EFI boot option is used; for a deeper understanding, follow the links in the articles above. That said, although this is a major issue if you are attempting to deploy EFI-based virtual machines into a vSphere environment, it is an exception rather than the rule for Packer's capabilities.

To me personally, one of Packer's major weaknesses is that it is a very flat environment: the ability to dynamically bring custom data into a build is limited. In Terraform we can use a construct known as a data source to dynamically populate values such as the name of a to-be-created resource, or to obtain a username and password from a secrets manager like Vault.
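To make the Terraform analogy concrete, here is a sketch of what a Terraform data source looks like; the AMI filter values are illustrative examples, not prescriptions:

```hcl
# Terraform: look up an existing AWS AMI at plan time
data "aws_ami" "ubuntu" {
  most_recent = true
  owners      = ["099720109477"] # Canonical's AWS account ID

  filter {
    name   = "name"
    values = ["ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*"]
  }
}

# The looked-up value can then feed a resource, e.g.:
# ami = data.aws_ami.ubuntu.id
```

Nothing in this configuration is hard-coded at authoring time; Terraform resolves the AMI ID when the plan runs. Packer 1.7 brings the same pattern to image builds.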

Well, guess what: HashiCorp has started to move in that direction too with Packer 1.7. This is a massive step forward in both the functionality and the security posture of the product. It also marks a significant move in the evolution of Packer to an HCL2-based product, as these new functions are only enabled in HCL2-based templates and will not be back-ported to the now-legacy JSON template format, making this the first of what we expect to be many HCL2-exclusive features.

So, what can you do with this in Packer?

As already mentioned, the Packer 1.7 data source function is remarkably similar to the data source function in Terraform. Data source components fetch data from sources outside of the Packer environment and make that information available to Packer HCL configuration blocks. The data source runs before any build, thus gathering the necessary information to allow the subsequent build sources in the configuration file to access the result of the data source.

Where can this be used?

Currently, HashiCorp has only released the Amazon Web Services Data Source package, and this is only available for use with Packer 1.7 or above. This additional function pack includes two data sources: the Amazon AMI data source and the Amazon Secrets Manager data source.

For a full understanding of the capabilities of these two new functions, click the links above; at a high level, the Amazon AMI data source filters images from the marketplace and is similar to the source_ami_filter configuration, while the Amazon Secrets Manager data source retrieves secrets for the build configuration, similar to the aws_secretsmanager configuration. Both legacy configuration parameters will remain available for use in Packer builds, but it is clear that they are slated for deprecation, and it is suggested that any configuration using them be updated to the new functionality. This will aid the future stability of your builds.
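As an illustration of the first of these, here is a sketch of the Amazon AMI data source; the filter values shown are examples (a recent Ubuntu image owned by Canonical), not requirements:

```hcl
# Packer HCL2: find the most recent matching AMI before the build runs
data "amazon-ami" "ubuntu" {
  filters = {
    name                = "ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*"
    root-device-type    = "ebs"
    virtualization-type = "hvm"
  }
  most_recent = true
  owners      = ["099720109477"] # Canonical's AWS account ID
}

# The result is available to subsequent blocks as, e.g.:
# data.amazon-ami.ubuntu.id
```

This replaces the older source_ami_filter block, with the advantage that the result can be referenced from anywhere in the template rather than being tied to a single source.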

This will require moving your configuration files from JSON to HCL2 at some point in the future, and HashiCorp has provided a nifty little upgrade command. What is even better is that the command will do the legwork for you and upgrade the source_ami_filter and aws_secretsmanager options to their respective data sources.
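The upgrade command in question is packer hcl2_upgrade, which reads a legacy JSON template and writes out an HCL2 equivalent alongside it; the template filename below is just a placeholder:

```shell
# Convert a legacy JSON template to HCL2
# (produces my-build.json.pkr.hcl next to the original)
packer hcl2_upgrade my-build.json

# Optionally annotate the generated file with explanatory comments
packer hcl2_upgrade -with-annotations my-build.json
```

The original JSON file is left untouched, so you can compare the two and migrate at your own pace.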

So how do we use Data Sources?

You simply create a data block referencing your data source; its output can then be referenced in locals and sources in the form data.<TYPE>.<NAME>.<ATTRIBUTE>. The example below shows how to use the Amazon Secrets Manager data source as a local variable to store the value and version of a secret.

data "amazon-secretsmanager" "basic-example" {
  name = "my_super_secret"
  key  = "my_secret_key"
}

# usage example of the data source output
locals {
  secret_value      = data.amazon-secretsmanager.basic-example.value
  secret_version_id = data.amazon-secretsmanager.basic-example.version_id
}

Finally, use them in your code by referencing local.secret_value and local.secret_version_id anywhere in your configuration file. If you need further information, have a read of HashiCorp's documentation on using data sources.
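For instance, the fetched secret could feed a build source like the sketch below; the source type, region, AMI ID and other field values here are illustrative assumptions, not part of the original example:

```hcl
source "amazon-ebs" "example" {
  ami_name      = "my-app-{{timestamp}}"
  instance_type = "t2.micro"
  region        = "us-east-1"
  source_ami    = "ami-0123456789abcdef0" # placeholder AMI ID
  ssh_username  = "ubuntu"
  # Inject the secret fetched by the data source at build time
  ssh_password  = local.secret_value
}

build {
  sources = ["source.amazon-ebs.example"]
}
```

Because the secret is resolved by the data source at run time, it never needs to be written into the template or passed in on the command line.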

If you are feeling adventurous you can write your own custom data source. For more information on how to do that follow the instructions for Custom Data Sources.

Currently, as stated, there are only two data sources, but the roadmap indicates that the Consul and Vault functions are next in line for replacement.


Packer has been missing the love of the mothership for quite a while, with all the attention going to Terraform, Vault and their cloud-based offerings. This package is interesting for multiple reasons: it brings the ability to dynamically change configurations between builds, and it is available only for HCL2-based configurations.

The movement to HCL2 across the product set means that a single language is going to be used across multiple products, and HashiCorp is putting in the work to move older products in that direction, too. This specific upgrade shows HashiCorp hasn't forgotten about Packer, and it makes the tool all the more useful for secure image builds.

