Job Description
- Design and develop robust, scalable, and efficient data architectures and pipelines using IDMC, IICS, DataStage, and other ETL tools to support the business's enterprise data management capabilities.
- Develop and maintain data models and schemas to support data integration and analysis.
- Build and maintain data engineering solutions on cloud platforms using AWS or Azure services.
- Develop and debug code in SQL/Python and apply query optimization techniques.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.
- Stay updated with the latest technologies and trends in ETL, Data Integration, Data Warehousing, Data Management, Big Data, and Data Engineering, and provide recommendations for process improvements.
- 8-10 years of experience with Informatica (IDMC/IICS) application and data integration services and IBM DataStage
- Proficient in analyzing business process requirements and translating them into technical requirements
- Design, develop, and maintain integration solutions using IDMC/IICS and manage data flow between various internal and external systems
- Extensive experience working with various RDBMS and data sources (SQL Server, Postgres, Oracle, flat files)
- Experience in implementing performance optimization techniques within IDMC/IICS or other ETL tools
- Proficient in Data Modeling, writing complex SQL queries, and SQL query tuning
- Knowledge of database objects such as schemas, tables, views, stored procedures, and functions
- Exposure to CI/CD processes and tools (e.g., Ansible, Jenkins)
- Experience working with cloud databases such as Amazon Redshift and Snowflake
- Experience in scripting (using Python or any other scripting language)
- Familiarity with Agile methodologies (e.g., Scrum, Kanban)
- Hands-on experience with Data Warehousing, ETL, and Business Intelligence/Analysis
- Solid understanding of relational and dimensional data models.
- Experience working with BFSI clients
- Experience in Data Migration projects
- Experience working in AWS environment using AWS services (S3, Redshift, EC2, Lambda, Glue)
- Familiarity with data visualization tools such as Tableau and Power BI
- Understanding of scheduling applications such as Stonebranch and Control-M
- Experience with, or conceptual understanding of, Big Data technologies (e.g., Hadoop, Spark)
- Experience in designing solutions using Web APIs
- Open across 7 USI Deloitte office locations. Candidates should reside within commutable distance of their assigned Deloitte location.
Qualifications
Bachelor's or master's degree