Data Migration Engineer, Snowflake, dbt

Remote Full-time
Job Description:
• Designing, implementing, and maintaining data pipelines using dbt and Snowflake
• Developing and automating Python scripts for data transformation, validation, and delivery
• Managing data workflows and deployments across the AWS ecosystem (S3, Lambda, ECS, IAM, etc.)
• Collaborating with internal and external teams to deliver efficient, secure data integrations
• Troubleshooting and resolving data pipeline and performance issues
• Applying best practices for CI/CD, testing, and version control in data workflows
• Contributing to ETL orchestration and scheduling using Matillion

Requirements:
• Current experience with dbt and Snowflake (required). Please do not apply without this experience.
• Experience with Matillion ETL or similar data orchestration tools
• Familiarity with Airflow, Dagster, or other workflow orchestration frameworks
• Current and extensive Python development skills for automation and data processing
• Solid understanding of AWS services related to data engineering
• Experience with SQL, schema design, and performance optimization
• Familiarity with Git and collaborative development practices

Benefits:
• Health Benefits: Comprehensive, multi-carrier program for medical, dental, and vision benefits
• Retirement Benefits: 401(k) with match and an Employee Share Purchase Plan
• Wellbeing: Wellness platform with incentives, Headspace app subscription, Employee Assistance and Time-off Programs
• Short- and Long-Term Disability, Life and Accidental Death Insurance, Critical Illness, and Hospital Indemnity
• Family Benefits, including bonding and family care leaves, adoption and surrogacy benefits
• Health Savings, Health Care, Dependent Care, and Commuter Spending Accounts
• In addition to annual Paid Time Off, we offer up to two days of paid leave each to participate in Employee Resource Groups and to volunteer with your charity of choice