Mastering archive_file in Terraform Like a Pro
Hey friends! 💅 If you’re as obsessed with Terraform as I am (because who doesn’t love automating infrastructure the chic way?!), then you NEED to know about the archive_file data source. It’s basically your bestie for packaging files into ZIP or TAR archives, making deployments effortless. No more manual zipping—Terraform does it for you. 😍
Today, I’m breaking down everything you need to know about archive_file: how to create archives from single or multiple files, plus troubleshooting tips, because we all know tech can be a little moody sometimes. 🙄
🌟 What is archive_file in Terraform?
Think of archive_file as the glam squad for your code—it takes your files and bundles them into neat little ZIP or TAR packages. Super useful for:
✨ Deploying AWS Lambda functions (which require zipped code!)
✨ Packaging Kubernetes manifests or Helm charts
✨ Bundling files before uploading to S3, Azure Storage, or Google Cloud Storage
Here’s how you set it up:
data "archive_file" "example" {
type = "zip" # Options: zip, tar, tgz
source_dir = "path/to/source" # Directory to compress
output_path = "path/to/output.zip" # Destination of the archive
}
Or, if you’re working with just one file:
data "archive_file" "example_file" {
type = "zip"
source_file = "path/to/file.txt"
output_path = "path/to/output.zip"
}
💡 Pro Tip: Use source_dir for entire folders and source_file for single files—but NOT both in the same block. Terraform’s picky like that. 😉
📂 Creating an Archive from a Single File
Let’s say you need to ZIP up a single file (hello, AWS Lambda deployments!). It’s super simple:
data "archive_file" "example" {
type = "zip"
source_file = "example.txt"
output_path = "example.zip"
}
output "archive_checksum" {
value = data.archive_file.example.output_base64sha256
}
This output_base64sha256 is the base64-encoded SHA-256 checksum of the archive. You don’t strictly need it, but it’s perfect for change detection; AWS Lambda’s source_code_hash expects exactly this format. ✅
📌 Example: Zipping a Python Script for AWS Lambda
Say you have lambda_function.py and need to deploy it to AWS Lambda:
data "archive_file" "lambda_zip" {
type = "zip"
source_file = "lambda_function.py"
output_path = "lambda_function.zip"
}
Then use it in your Lambda function:
resource "aws_lambda_function" "my_lambda" {
function_name = "my_lambda_function"
role = aws_iam_role.lambda_role.arn
runtime = "python3.8"
handler = "lambda_function.lambda_handler"
filename = data.archive_file.lambda_zip.output_path
source_code_hash = data.archive_file.lambda_zip.output_base64sha256
}
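One thing the snippet glosses over: aws_iam_role.lambda_role isn’t defined above. A minimal execution role might look like this (a sketch; logging permissions and policy attachments are omitted):
resource "aws_iam_role" "lambda_role" {
  name = "my-lambda-role"

  # Allow the Lambda service to assume this role
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}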
✨ Heads up! If your function has dependencies, you need to zip the whole folder, not just the script.
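One common pattern is to install everything into a build folder and archive that directory instead. A sketch, where the build/ path and the requirements.txt are assumptions about your project layout:
resource "terraform_data" "build" {
  # Rebuild when the handler source changes
  triggers_replace = [filesha256("${path.module}/lambda_function.py")]

  provisioner "local-exec" {
    command = <<-EOT
      mkdir -p ${path.module}/build
      cp ${path.module}/lambda_function.py ${path.module}/build/
      pip install -r ${path.module}/requirements.txt -t ${path.module}/build/
    EOT
  }
}

data "archive_file" "lambda_zip_with_deps" {
  type        = "zip"
  source_dir  = "${path.module}/build"
  output_path = "${path.module}/lambda_with_deps.zip"
  depends_on  = [terraform_data.build]
}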
🎀 Creating Archives from Multiple Files
Ugh, I wish archive_file took a simple list of files, but no, it’s a diva. 👀 (There is a source-block trick for text files, more on that below.) The classic workaround? Gather files in a temp folder first:
resource "terraform_data" "prepare_files" {
provisioner "local-exec" {
command = <<EOT
mkdir -p temp_folder
cp ${path.module}/file1.txt temp_folder/
cp ${path.module}/file2.txt temp_folder/
cp ${path.module}/file3.txt temp_folder/
EOT
}
}
data "archive_file" "multiple_files" {
type = "zip"
source_dir = "${path.module}/temp_folder"
output_path = "${path.module}/multiple_files.zip"
depends_on = [terraform_data.prepare_files]
}
If you just want to archive an entire directory:
data "archive_file" "example" {
type = "zip"
source_dir = "${path.module}/my_folder"
output_path = "${path.module}/example.zip"
}
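And if some files shouldn’t make it into the archive, archive_file takes an excludes list with paths relative to source_dir (recent provider versions also accept glob patterns here, if memory serves):
data "archive_file" "filtered" {
  type        = "zip"
  source_dir  = "${path.module}/my_folder"
  output_path = "${path.module}/filtered.zip"

  # Paths are relative to source_dir
  excludes = [
    "secrets.txt",
    ".terraform",
  ]
}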
🚀 Uploading an Archive to Azure Storage
For my Azure girls 💙, here’s how to compress a folder and upload it to an Azure Storage Blob:
data "archive_file" "app_package" {
type = "zip"
source_dir = "${path.module}/my_app_folder"
output_path = "${path.module}/app_package.zip"
}
Now, assuming you already have a storage account and container, upload the archive as a blob:
resource "azurerm_storage_blob" "example" {
name = "app_package.zip"
storage_account_name = azurerm_storage_account.example.name
storage_container_name = azurerm_storage_container.example.name
type = "Block"
source = data.archive_file.app_package.output_path
depends_on = [data.archive_file.app_package]
}
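Don’t have the account and container yet? Here’s a minimal sketch using azurerm 3.x-style arguments (all names are placeholders; storage account names must be globally unique):
resource "azurerm_resource_group" "example" {
  name     = "rg-archive-demo"
  location = "westeurope"
}

resource "azurerm_storage_account" "example" {
  name                     = "archivedemostorage"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_storage_container" "example" {
  name                  = "packages"
  storage_account_name  = azurerm_storage_account.example.name
  container_access_type = "private"
}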
😵‍💫 Troubleshooting archive_file Issues
Terraform acting up? Here’s how to fix common archive_file headaches:
1️⃣ Incorrect Source Path → Check your file paths! Build them with ${path.module} so they resolve no matter which directory you run terraform from.
2️⃣ Missing zip Utility → Good news first: archive_file doesn’t shell out to a zip binary at all; the archive provider builds archives in pure Go. But if your own local-exec steps (like the temp-folder workaround above) call zip, make sure it’s installed:
- Linux: sudo apt install zip
- macOS: brew install zip
- Windows: ensure zip.exe is in your PATH
3️⃣ Files Not Updating? → The archive itself is rebuilt on every plan, but a downstream resource that only references the file path won’t notice content changes. Feed the archive’s checksum into an attribute the resource tracks (like Lambda’s source_code_hash above), or surface it as an output:
output "archive_hash" {
  value = data.archive_file.example.output_base64sha256
}
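The same trick works elsewhere. For example, an S3 upload can key off the archive’s MD5 so changed contents force a re-upload (the bucket name is a placeholder):
resource "aws_s3_object" "package" {
  bucket = "my-deploy-bucket"
  key    = "example.zip"
  source = data.archive_file.example.output_path

  # Changes whenever the archive contents change, forcing a re-upload
  etag = data.archive_file.example.output_md5
}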
4️⃣ Permission Issues → If AWS Lambda or Docker throws permission errors, chmod your files before zipping:
chmod +x my_script.sh
You can also set output_file_mode = "0755" on the archive_file block to normalize permissions inside the archive.
5️⃣ Wrong Paths in Modules → Inside a reusable module, always use ${path.module} (the module’s own directory), not ${path.root} (the root configuration’s directory)!
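Inside a module, that looks like this (a hypothetical layout with the sources under modules/packager/src):
# modules/packager/main.tf
data "archive_file" "pkg" {
  type        = "zip"
  source_dir  = "${path.module}/src"    # always modules/packager/src, wherever you run terraform
  output_path = "${path.module}/pkg.zip"
}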
💡 Key Takeaways
💖 archive_file is a Terraform essential for packaging files before deployment.
💖 Use source_file for one file and source_dir for entire folders.
💖 Troubleshoot common issues with paths, permissions, and missing utilities.
Terraform is fun (and powerful), so go forth and automate like the IT queen you are! 👩💻✨
P.S. Terraform 1.5.x and earlier is open-source, but newer versions are under the BUSL license. If you want a fully open-source alternative, check out OpenTofu—it’s like Terraform but community-driven. 💜