[Remote] ETL Informatica Developer (100% REMOTE/NO C2C)

Remote Full-time
Note: This is a remote position open to candidates in the USA. Amerit Consulting is a fast-growing staffing and consulting firm that provides services to Fortune 500 companies and small to mid-sized organizations. They are seeking an accomplished ETL Informatica Developer responsible for administering and implementing enterprise data process automation, managing data pipelines, and providing technical support to ensure effective data utilization.

Responsibilities

• 5+ years of experience administering, testing, and implementing enterprise data process automation and orchestration.
• Experience with data processing platforms and technologies such as Microsoft SSIS, Informatica, ActiveBatch, Power Apps, Apache Airflow, Apache NiFi, job schedulers, file transfer tools, etc.
• Knowledge of or experience with data virtualization technology (Denodo).
• Provide full lifecycle administration of data platform tools (patches/updates, AD security, account management, capacity management, process documentation).
• Primary platform support revolves around Power BI and user security, while also serving as backup for other department software platforms (Denodo, Informatica, ActiveBatch, etc.).
• Data management & modelling: Connect and manage data pipeline tools across various data sources, including on-premises and cloud-based sources.
• Implement and maintain semantic models to ensure data integrity and performance optimization. Perform data cleansing and transformation (ETL) to prepare data for analysis.
• Technical support: Provide technical guidance and support to consumers of data services, ensuring effective adoption and utilization of enterprise data and the fabric/virtual layer.
• Performance monitoring: Monitor and optimize data pipeline (ETL) performance, including capacity planning and server performance.
• User management: Manage user access and permissions to enterprise data platforms and resources, ensuring compliance with security policies.
• Troubleshooting: Conduct thorough testing, debugging, and troubleshooting of data pipeline (ETL) tools and solutions.
• Governance: Maintain governance policies, best practices, and security standards for the enterprise data platforms.
• Training and knowledge sharing: Provide training and share knowledge with colleagues to enable the delivery of data for enterprise needs.
• Roadmap building and prioritization: Support the data architecture team with the data pipeline (ETL) roadmap, prioritizing initiatives based on business needs and strategic goals.
• Skilled in analyzing and automating manual processes to reduce manual interaction.
• Experience with data virtualization/fabric platforms such as Denodo, CData, Talend, and Data Virtuality.
• Development skills such as SQL, PL/SQL, T-SQL, and shell scripting (PowerShell, Unix shell, etc.).
• Able to analyze, troubleshoot, and tune SQL queries and recommend enhancements.
• Analyze and monitor server resources and implement proactive alerts and notifications based on SLAs.
• Performance tuning and analysis of SQL code and logic in data transformations and queries.
• Relevant certifications related to data platforms and related technologies.
• Experience in the healthcare claims processing industry and an understanding of the associated data security and privacy concerns.

Company Overview

• Amerit Consulting is a staffing and recruiting company that offers temporary staffing and payrolling services. It was founded in 2002 and is headquartered in San Ramon, California, USA, with a workforce of 1001-5000 employees.