Sr. Data Engineer (GCP, Databricks, SQL & ETL)

Remote Full-time
DESCRIPTION
We’re looking for a skilled Data Engineer with strong expertise in Databricks and SQL to join our data analytics team. You will work as part of a cross-functional team to design, build, and optimize data pipelines, frameworks, and warehouses that support business-critical analytics and reporting. The role requires deep hands-on experience with SQL-based transformations, Databricks, and modern data engineering practices.

Roles & Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Databricks and SQL.
- Write optimized SQL queries for data extraction, transformation, and loading across large-scale datasets.
- Monitor, validate, and optimize data movement, cleansing, normalization, and updating processes to ensure data quality, consistency, and reliability.
- Collaborate with business and analytics teams to define data models, schemas, and frameworks within the data warehouse.
- Document source-to-target mapping and transformation logic.
- Build data frameworks and visualizations to support analytics and reporting.
- Ensure compliance with data governance, security, and regulatory standards.
- Communicate effectively with internal and external stakeholders to understand and deliver on data needs.

REQUIREMENTS
Qualifications & Experience:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- 6+ years of hands-on experience in ETL/data engineering.
- Proficiency in the Python programming language.
- Strong SQL development experience (query optimization, indexing strategies, stored procedures).
- 3+ years of experience with Spark.
- 3+ years of Databricks experience with Python/Scala.
- Experience with cloud platforms; GCP preferred.
- Databricks or cloud certifications are a strong plus.
Apply Now →