The Data Engineer collaborates with Group Risk to build and support data pipelines for IFRS9 reporting, using PySpark and Informatica BDM, ensuring data quality, and overseeing ETL processes.
We are seeking a skilled Data Engineer to join the Group Risk team, responsible for building and managing robust data pipelines to support IFRS9 reporting. The role involves close collaboration with business stakeholders to understand data requirements, perform impact analysis, and deliver high-quality data solutions using modern data engineering technologies such as PySpark and Informatica BDM.
Key Responsibilities
- Collaborate with the Group Risk team to gather and understand business and data requirements
- Perform impact assessment and technical data mapping for new and existing data sources
- Conduct data profiling to ensure data quality, consistency, and completeness
- Design, develop, and maintain ETL pipelines using PySpark and Informatica BDM
- Build scalable data transformation workflows aligned with IFRS9 data models
- Ensure accurate data extraction, transformation, and loading (ETL) into reporting systems
- Participate in unit testing, validation, and deployment of data pipelines
- Optimize data processing performance and troubleshoot production issues
- Adopt modern tools (e.g., AI-assisted tools like Claude) to improve productivity, reduce errors, and enhance development workflows
- Maintain proper documentation for data flows, mappings, and processes
Required Skills & Experience
- Strong experience with PySpark for large-scale data processing
- Hands-on experience with Informatica BDM (Big Data Management)
- Solid understanding of ETL concepts, data warehousing, and data modeling
- Experience with data profiling, data mapping, and impact analysis
- Knowledge of IFRS9 or Risk/Banking domain is highly preferred
- Familiarity with distributed data processing frameworks and big data ecosystems
- Strong SQL skills and experience working with relational databases
- Good understanding of data quality and governance principles
- Exposure to cloud platforms (AWS / Azure / GCP)
- Experience with AI-assisted development tools (e.g., Claude, GitHub Copilot)
- Knowledge of CI/CD pipelines in data engineering workflows
- Strong analytical and problem-solving skills
- Excellent communication and stakeholder management abilities
- Ability to work in a fast-paced, collaborative environment

