Data Operations and Engineering
Data Ops Engineer - JD

Qualifications:
- 5+ years of overall software engineering experience, including hands-on software development and data engineering
- 3+ years of hands-on coding experience in SQL, Python, and PySpark
- 3+ years of experience with advanced orchestration tools such as Apache Airflow
- 2+ years of experience with at least one cloud platform (Azure, AWS, GCP), preferably GCP
- Experience building CI/CD processes and pipelines

Roles and Responsibilities:
- Work in an agile environment
- Proactively identify and assist in solving recurring data quality or data availability issues
- Monitor, support, and triage data pipelines that ingest, move, transform, and integrate data as it moves from acquisition to consumption layers
- Apply exceptional problem-solving and troubleshooting skills, analyzing data to identify issues and patterns
- Communicate effectively with technical and business teams
- Aspire to be efficient, thorough, and proactive
- Develop queries and metrics for data platform ad hoc reporting and/or ETL batch triage
- Maintain knowledge base and FAQ documentation providing instructions for resolving problems that jobs commonly run into