GCP Data Engineer (Onsite) - Santa Clara, CA

Full-time
Hope you are doing well! I have an urgent position. Kindly go through the job description and let me know if it would be of interest to you.

Title: GCP Data Engineer (Onsite)
Duration: 6-12 Months
Location: Onsite in Santa Clara, CA

About the job
Strong hands-on experience with Python, SQL, and the GCP platform is a must. We are looking for a highly skilled and motivated Data Engineer. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Key Responsibilities
- Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
- Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
- Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
- Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

Required Skills and Experience
- Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
  - BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
  - Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
- Programming & Querying:
  - Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
  - SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
- Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).
- DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

If you are interested, please share your updated resume and suggest the best number and time to connect with you.

Himanshu Gupta
US IT Recruiter, DMS VISIONS INC
Ext-104 | LinkedIn:
4645 Avon Lane, Suite 210, Frisco, TX 75033