Infrastructure as Code for Data AI Prompt
Copy this prompt template, run it in your AI tool, and use related prompts to continue the workflow.
Implement Infrastructure as Code (IaC) for this cloud data platform.
Cloud provider: {{provider}}
IaC tool: {{iac_tool}} (Terraform, Pulumi, CDK, Bicep)
Components to provision: {{components}}
Team: {{team}}
1. Why IaC for data infrastructure:
- Reproducible: dev, staging, and prod environments are identical
- Version-controlled: infrastructure changes are reviewed like code
- Self-documenting: the Terraform / Pulumi code IS the documentation
- Auditable: every change is in git history with the author
2. Terraform for cloud data resources:
S3 bucket with lifecycle and logging:
resource "aws_s3_bucket" "data_lake" {
bucket = "${var.env}-data-lake-${var.account_id}"
tags = { Environment = var.env, Team = "data-engineering" }
}
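The snippet above only creates the bucket; in recent AWS provider versions the lifecycle and logging pieces mentioned in the heading live in separate resources. A minimal sketch, where the retention days, prefixes, and log bucket name are placeholder assumptions:
resource "aws_s3_bucket_lifecycle_configuration" "data_lake" {
  bucket = aws_s3_bucket.data_lake.id
  rule {
    id     = "expire-raw-zone"
    status = "Enabled"
    filter {
      prefix = "raw/"
    }
    expiration {
      days = 365    # placeholder retention period
    }
    abort_incomplete_multipart_upload {
      days_after_initiation = 7
    }
  }
}
resource "aws_s3_bucket_logging" "data_lake" {
  bucket        = aws_s3_bucket.data_lake.id
  target_bucket = "${var.env}-data-lake-access-logs"  # assumes a pre-existing log bucket
  target_prefix = "s3-access/"
}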
Snowflake warehouse:
resource "snowflake_warehouse" "analytics" {
name = "ANALYTICS_WH"
warehouse_size = "SMALL"
auto_suspend = 60
auto_resume = true
}
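The S3 example references var.env and var.account_id without declaring them. A minimal variables sketch so the snippet is self-contained (names and descriptions are assumptions):
variable "env" {
  description = "Deployment environment, e.g. dev, staging, prod"
  type        = string
}
variable "account_id" {
  description = "AWS account ID, used to keep the bucket name globally unique"
  type        = string
}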
3. Module structure:
modules/
data_lake/ # S3 bucket + lifecycle + IAM
snowflake_env/ # databases, warehouses, roles
airflow_mwaa/ # MWAA environment + networking
monitoring/ # CloudWatch dashboards + alarms
environments/
dev/main.tf # calls modules with dev variables
prod/main.tf # calls modules with prod variables
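As a sketch of how an environment root wires these modules together (the module inputs shown are assumptions; adjust to whatever variables your modules actually expose), environments/dev/main.tf might look like:
# environments/dev/main.tf
module "data_lake" {
  source     = "../../modules/data_lake"
  env        = "dev"
  account_id = var.account_id
}
module "snowflake_env" {
  source         = "../../modules/snowflake_env"
  env            = "dev"
  warehouse_size = "XSMALL"  # smaller warehouse for dev workloads
}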
4. State management:
- Remote state: store in S3 + DynamoDB (AWS) or GCS (GCP) with locking
- State locking: prevents concurrent runs from corrupting state
- Separate state per environment: dev and prod should never share state
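A minimal backend sketch for the AWS case above, with one state file per environment (bucket, table, region, and key names are placeholders):
# environments/dev/backend.tf
terraform {
  backend "s3" {
    bucket         = "my-org-terraform-state"              # placeholder state bucket
    key            = "data-platform/dev/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-state-locks"               # DynamoDB table used for locking
    encrypt        = true
  }
}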
5. CI/CD for IaC:
PR: terraform plan → post plan output as PR comment
Merge to main: terraform apply (with approval gate for prod)
Tool: Atlantis (open-source) or Terraform Cloud for automated plan/apply
Return: Terraform module structure, resource examples, state management configuration, and CI/CD pipeline for IaC.
When to use this prompt
Use it when you want to begin orchestration work without writing the first draft from scratch.
Use it when you want a more consistent structure for AI output across projects or datasets.
Use it when you want prompt-driven work to turn into a reusable notebook or repeatable workflow later.
Use it when you want a clear next step into adjacent prompts in Orchestration or the wider Cloud Data Engineer library.
What the AI should return
The AI should return a structured result that covers the main requested outputs: the rationale for IaC, example Terraform resources, a module structure, state management configuration, and a CI/CD pipeline for IaC. The final answer should stay clear, actionable, and easy to review inside an orchestration workflow for cloud data engineering work.
How to use this prompt
Open your data context
Load your dataset, notebook, or working environment so the AI can operate on the actual project context.
Copy the prompt text
Use the copy button above and paste the prompt into the AI assistant or prompt input area.
Review the output critically
Check whether the result matches your data, assumptions, and desired format before moving on.
Chain into the next prompt
Once you have the first result, continue deeper with related prompts in Orchestration.
Frequently asked questions
What does the Infrastructure as Code for Data prompt do?
It gives you a structured orchestration starting point for cloud data engineer work and helps you move faster without starting from a blank page.
Who is this prompt for?
It is designed for cloud data engineer workflows and marked as intermediate, so it works well as a guided starting point for that level of experience.
What type of prompt is this?
Infrastructure as Code for Data is a single prompt. You can copy it as-is, adapt it, or use it as one step inside a larger workflow.
Can I use this outside MLJAR Studio?
Yes. The prompt text works in other AI tools too, but MLJAR Studio is the best fit when you want local execution, visible Python code, and reusable notebooks.
What should I open next?
Natural next steps from here are Cloud Orchestration with Airflow, Data Contracts and SLA Management, and Pipeline Observability and Monitoring.