Snowflake Data Engineering Lead

Remote | Full-time
Take your career to the next level with the only consulting firm born in AI and delivering with AI. At Atrium, we're not simply adapting to an AI-driven world; we've helped define it since we were founded. Our clients partner with us because we turn potential into measurable impact, reshaping industries, realizing exponential value, and empowering organizations to thrive in an era of unprecedented technological advancement. As pioneers in AI-assisted delivery, we're constantly optimizing how we deliver services for greater speed, accuracy, and efficiency. This commitment allows us to repeatedly deliver outcomes that other Salesforce and Snowflake partners merely promise. Care to join us?

Who are you?

You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?

In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
In this role, you will:

• Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform and incorporating best practices for scalability, performance, security, and optimization
• Assemble large, complex data sets that meet functional and non-functional business requirements
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Lead and mentor both onshore and offshore development teams, creating a collaborative environment
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
• Develop ELT processes to ensure timely delivery of required data to customers
• Implement data quality measures to ensure the accuracy, consistency, and integrity of data
• Design, implement, and maintain data models that support the organization's data storage and analysis needs
• Deliver technical and functional specifications to support data governance and knowledge sharing

In this role, you will have:

• Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
• 6+ years of experience delivering consulting services to medium and large enterprises; implementations must have included a combination of the following experiences:
• Data Warehousing or Big Data consulting for mid-to-large-sized organizations
• 3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
• Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate them into a data architecture
• SnowPro Core certification is highly desired
• Hands-on experience with Python (Pandas, DataFrames, functions)
• Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
• Strong experience with Apache Airflow and API integrations
• Solid experience with at least one ETL/ELT tool (DBT, Coalesce, WhereScape, MuleSoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
• Nice to have: experience with Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
• Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
• Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
• Strong presentation and communication skills

Next

Recruiting at Atrium is highly personalized. While some candidates may complete the hiring process quickly, others may take a bit longer, depending on the role and its requirements. We're excited to get to know you and ensure you get to know our team along the way. At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.