Senior GCP Data Platform Engineer
We are looking for a Data Platform Engineer to review and help us improve the architecture and delivery of our data environment on Google Cloud Platform. Your primary focus is Infrastructure as Code (IaC) and automation. You do not need to be a data analyst or visualization expert. Your job is to set up the environment (Terraform), the pipelines (GitHub Actions), and the security (IAM/Secret Manager) so that our data team can immediately start using Cloud Run and Dataform without worrying about the underlying plumbing.

The Tech Stack:
• Infrastructure: Terraform (modular, reusable, and parameterized)
• CI/CD: GitHub Actions (automated pipelines for infrastructure and data code)
• Data Ingestion: Cloud Run / Python + dlthub (you do not need to write the ingestion pipelines)
• Data Warehouse: BigQuery (dataset provisioning)
• Data Modeling: Dataform (configuration and connectivity only, no data modeling required)
• Security: Cloud Secret Manager & IAM

Key Responsibilities:

1. Automated Platform Delivery (The "Push-Button" Solution)
• Develop a Terraform codebase that allows us to spin up a complete environment for a new client from scratch by simply passing a configuration file (e.g., client name, region); a configuration-rendering sketch appears below.
• Ensure all necessary APIs, service accounts, and storage buckets are automatically provisioned.
• Implement Cloud Secret Manager to handle all keys and secrets securely (see the Secret Manager access sketch below).

2. CI/CD & Integration:
• Build GitHub Actions pipelines that handle deployment of the infrastructure.
• Configure the pipelines so that deployments flow naturally into Dataform as the base transformation layer.

3. Ingestion Framework:
• Create a streamlined pattern for data ingestion. The system should allow users to easily add data sources and dlt ingestion scripts (staged to GCS, with a BigQuery landing zone as the final load destination); a dlt ingestion sketch appears below.
• Define the "handshake" protocol: how the platform accepts data and how it hands it off to the data modeling layer.

4. Documentation & Handover:
• Deployment Guide: a strict step-by-step guide on how to deploy the platform for a new client.
• User Instructions: how analysts can access the console and how to add ingestion pipelines; Dataform itself is self-explanatory, I think.
• Final Validation: present the solution and demonstrate a test run: deploy the infrastructure, ingest dummy data, verify Dataform can see and access it, then add a second dummy ingestion source to trigger the CI/CD pipeline and demonstrate the deployment strategy (see the validation sketch below).

Candidate Requirements:
• Expert in Terraform on GCP: you must be able to write clean, modular Terraform that abstracts away complexity.
• CI/CD Specialist: strong experience creating GitHub Actions workflows for deploying cloud resources and application code.
• Security Mindset: proficiency with IAM roles and Secret Manager.
• Note: you do NOT need to be an expert in writing data ingestion pipelines (Python/dlt), SQLX (Dataform), or designing dashboards (Looker Studio). We require the code to be modular; you just need to ensure these tools are installed, connected, fully documented, and ready for the analysts to use.

Process:
1. Initial chat: discuss background and initial thoughts on the project (10-15 mins).
2. Discuss and/or present a plan of action/delivery/architecture (10-15 mins).
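To make the scope above more concrete, a few hedged sketches follow; they are illustrations under stated assumptions, not deliverables or prescribed implementations. First, the "push-button" idea: one way to drive a parameterized Terraform root module is to render a per-client configuration file into a .tfvars.json file before terraform apply. All file names, paths, and fields (client_name, region) in this sketch are assumptions.

```python
# Hypothetical helper: render a per-client JSON config into a .tfvars.json file
# that a parameterized Terraform root module could consume. All file names,
# paths, and fields are assumptions for illustration.
import json
import pathlib


def render_tfvars(config_path: str, out_dir: str = "environments") -> pathlib.Path:
    """Read a client config (client_name, region, ...) and emit Terraform variables."""
    cfg = json.loads(pathlib.Path(config_path).read_text())

    tfvars = {
        "client_name": cfg["client_name"],            # e.g. "acme"
        "region": cfg.get("region", "europe-west1"),  # assumed default region
        "labels": {"client": cfg["client_name"], "managed_by": "terraform"},
    }

    out_path = pathlib.Path(out_dir) / f"{cfg['client_name']}.tfvars.json"
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_text(json.dumps(tfvars, indent=2))
    return out_path


if __name__ == "__main__":
    # Downstream, e.g.: terraform apply -var-file=environments/acme.tfvars.json
    print(render_tfvars("clients/acme.json"))
```

Whether the rendering happens in a script like this, in a GitHub Actions step, or purely via Terraform variables is a design choice left to the engineer.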
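The Secret Manager work itself is provisioning and IAM, but the user instructions will likely show analysts how an ingestion script reads a secret at runtime. A minimal sketch using the google-cloud-secret-manager client library; the project ID and secret ID are placeholders, and the pattern assumes the Cloud Run service account has been granted accessor rights by the Terraform code.

```python
# Minimal runtime access pattern for a secret, assuming the workload's service
# account holds roles/secretmanager.secretAccessor on the secret.
from google.cloud import secretmanager


def access_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Return the payload of a Secret Manager secret version as a string."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")


if __name__ == "__main__":
    # Placeholder IDs for illustration only.
    api_key = access_secret("my-client-project", "source-api-key")
```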
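For the ingestion framework, this is a sketch of what an analyst-authored dlt script might look like once the platform is in place: a resource loaded into a BigQuery landing dataset, with GCS used as a filesystem staging destination. The source, dataset name, and staging configuration are assumptions; the real pattern is exactly the "handshake" protocol the engineer will define, and credentials/bucket settings would come from the provisioned environment rather than being hard-coded.

```python
# Sketch of an analyst-authored dlt ingestion script targeting the BigQuery
# landing zone, with GCS staging. Names and configuration are illustrative
# assumptions; credentials and bucket_url would be supplied by the platform
# (environment variables or Secret Manager), not written in the script.
import dlt


@dlt.resource(name="orders", write_disposition="append")
def orders():
    # Placeholder source: a real script would pull from an API or database.
    yield [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]


def run() -> None:
    pipeline = dlt.pipeline(
        pipeline_name="orders_ingestion",
        destination="bigquery",       # final load destination: BigQuery landing zone
        staging="filesystem",         # stage files to GCS via the configured bucket_url
        dataset_name="landing_zone",  # assumed landing dataset name
    )
    load_info = pipeline.run(orders())
    print(load_info)


if __name__ == "__main__":
    run()
```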
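For the final validation step, a small sketch of the kind of smoke check that could confirm the dummy data landed and is visible to the Dataform layer. The dataset and table names are placeholders, and the full test run would also exercise the CI/CD trigger with the second ingestion source.

```python
# Smoke check after ingestion: confirm the landing dataset exists and the
# ingested table has rows. Dataset and table names are placeholders.
from google.cloud import bigquery


def verify_landing(project_id: str, dataset: str = "landing_zone", table: str = "orders") -> None:
    client = bigquery.Client(project=project_id)

    # Raises NotFound if the dataset was not provisioned.
    client.get_dataset(f"{project_id}.{dataset}")

    # Confirm the ingested table is present and non-empty.
    tbl = client.get_table(f"{project_id}.{dataset}.{table}")
    assert tbl.num_rows > 0, "landing table exists but is empty"
    print(f"OK: {dataset}.{table} has {tbl.num_rows} rows")


if __name__ == "__main__":
    verify_landing("my-client-project")
```

Verifying that the Dataform service account can read the same dataset is an IAM check on top of this, which the Terraform code should make deterministic.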