We are looking for a skilled Data Warehousing Support individual to join a highly integrated, cross-disciplinary team responsible for supporting modern ETL services across systems, applications, and Business Intelligence, leveraging cloud data warehouses, SAP, SFDC, and other enterprise databases.
The Job
Analyze, develop and support ETL and CDC pipelines in and out of the data warehouse with a focus on automation, performance, reliability, durability, data quality, security and SLA expectations
Provide production support for data warehouse issues such as data load problems and transformation/translation problems
Perform data mapping and system design for ETL/CDC workflow solutions
Adhere to and promote CI/CD best practices, processes, and deliverables in accordance with modern standards in a DataOps environment
Work closely with business analysts to implement integration technology solutions that meet the specifications of a project or service request
Develop automated data audit and validation processes (see the sketch after this list)
Participate in architectural design review sessions and ensure all solutions are aligned to pre-defined specifications
Ensure accurate and timely data availability to meet business SLAs, especially around critical time periods
Document ETL and data warehouse processes and flows
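To illustrate the kind of automated data audit and validation work described above, here is a minimal Python sketch of a row-count reconciliation check between a source system and its warehouse target. It assumes generic Python DB-API connections; the table names and the sqlite3 stand-in are hypothetical examples, not the specific tools named in this posting.

```python
# Minimal sketch of an automated row-count reconciliation check between a
# source table and its warehouse target. Connections follow the Python
# DB-API; table names are hypothetical examples.
import sqlite3  # stand-in for any DB-API driver (e.g. a warehouse connector)

def row_count(conn, table):
    """Return the number of rows in `table`."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def audit_row_counts(source_conn, target_conn, table):
    """Compare source vs. target row counts and report any mismatch."""
    src, tgt = row_count(source_conn, table), row_count(target_conn, table)
    if src != tgt:
        print(f"AUDIT FAIL: {table} source={src} target={tgt}")
        return False
    print(f"AUDIT OK: {table} rows={src}")
    return True

if __name__ == "__main__":
    # Tiny in-memory demo standing in for real source/target systems.
    src = sqlite3.connect(":memory:")
    tgt = sqlite3.connect(":memory:")
    for conn, rows in ((src, 3), (tgt, 2)):
        conn.execute("CREATE TABLE orders (id INTEGER)")
        conn.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(rows)])
    audit_row_counts(src, tgt, "orders")  # prints an AUDIT FAIL line
```

In practice the same pattern extends to checksum or column-level comparisons and can be scheduled alongside the batch jobs it audits.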
Qualifications & Skills Required
Ability to support and/or develop ETL pipelines and scripts in and out of the data warehouse using a combination of HVR, GoldenGate, SnapLogic, Python and Snowflake's SnowSQL
Experience with different source systems such as SAP, HANA, OLFM and Salesforce preferred
Strong SQL skills including stored procedures
Extensive experience in performance tuning, troubleshooting and debugging solutions
Experience supporting data warehouse and data transformation issues
Demonstrated experience providing production support during time-sensitive situations
Experience integrating on-premises infrastructure with public cloud (AWS, Azure, Snowflake)
Experience with AIX/Linux shell scripting
Experience performing detailed data analysis, i.e., determining the structure, content, and quality of the data through examination of source systems and data samples (see the profiling sketch after this list)
Ability to translate requirements for BI, reporting and data transformations
Ability to design and outline solutions using cloud-based and on-premises technologies
Ability to understand data pipelines and modern approaches to data pipeline automation
Actively test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions
Expertise in source and target system analysis, with prior experience in project analysis and cost estimation
Ability to work effectively in a remote team environment
Excellent written and verbal communication, time management, and presentation skills
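As a concrete illustration of the detailed data analysis called out above, the following is a minimal Python sketch that profiles one column of a sampled table for row count, null percentage, and distinct values. The connection, table, and column names are hypothetical, and sqlite3 stands in for whichever warehouse driver is actually in use.

```python
# Minimal sketch of profiling a column's structure, content, and quality
# from a data sample. Table and column names are hypothetical examples.
import sqlite3  # stand-in for a real warehouse driver

def profile_column(conn, table, column):
    """Return basic quality metrics for one column of a table."""
    total, nulls, distinct = conn.execute(
        f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {column})
        FROM {table}
        """
    ).fetchone()
    return {
        "rows": total,
        "null_pct": round(100.0 * (nulls or 0) / total, 2) if total else 0.0,
        "distinct_values": distinct,
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (region TEXT)")
    conn.executemany(
        "INSERT INTO customers VALUES (?)",
        [("EMEA",), ("APAC",), (None,), ("EMEA",)],
    )
    print(profile_column(conn, "customers", "region"))
    # e.g. {'rows': 4, 'null_pct': 25.0, 'distinct_values': 2}
```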
Additional Skills
Problem-Solving: Ability to troubleshoot and resolve technical issues efficiently.
Communication: Effective communication skills to interact with stakeholders and team members.
Analytical Thinking: Ability to analyze data and identify patterns and trends.
Attention to Detail: Meticulous approach to ensure data accuracy and integrity.
Adaptability: Willingness to learn new technologies and adapt to changing requirements.
Time Management: Effective time management skills to meet project deadlines.
Stakeholder Management: Ability to manage expectations and build relationships with stakeholders.
Experience
More than 1 year's experience with the Snowflake cloud-based data warehouse
Hands-on experience with at least one Snowflake project
6-8 years' experience developing ETL, ELT and Data Warehousing solutions
6-8 years' experience with HVR or SnapLogic (preferred)
More than 3 years' experience with CDC technologies (preferred)
More than 3 years' experience with AWS, Azure or Google Cloud
3-5 years' experience developing Python-based code that reads/writes data into databases (see the sketch below)
3-5 years' experience developing SQL scripts and stored procedures that process data from databases
3-5 years' experience with batch job scheduling and identifying data/job dependencies
Strong understanding of common data formats such as CSV, XML and JSON
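As a small illustration of the Python-based database work listed above, here is a minimal sketch that reads a CSV extract and bulk-inserts the rows into a database table. The file, table, and column names are hypothetical, and sqlite3 again stands in for a real DB-API driver such as a warehouse connector.

```python
# Minimal sketch of reading a CSV extract and writing its rows into a
# database table. File, table, and column names are hypothetical examples.
import csv
import sqlite3
import tempfile

def load_csv(conn, csv_path, table):
    """Bulk-insert rows from a CSV extract into `table`; returns row count."""
    with open(csv_path, newline="") as fh:
        rows = [(r["order_id"], r["amount"]) for r in csv.DictReader(fh)]
    conn.executemany(f"INSERT INTO {table} (order_id, amount) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    # Write a tiny example extract, then load it into an in-memory table.
    with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as tmp:
        tmp.write("order_id,amount\n1001,250.00\n1002,99.50\n")
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id TEXT, amount REAL)")
    print(load_csv(conn, tmp.name, "orders"))  # prints 2
```

Batching the inserts with executemany keeps the load a single round trip per chunk, which is the same pattern most warehouse drivers expose for larger extracts.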