Design, develop, and manage data integration workflows and ETL pipelines using various tools and cloud platforms. Collaborate with teams to enhance data quality and optimize data flows through AI/ML technologies.
Huron is a global consultancy that collaborates with clients to drive strategic growth, ignite innovation and navigate constant change. Through a combination of strategy, expertise and creativity, we help clients accelerate operational, digital and cultural transformation, enabling the change they need to own their future.
Join our team as the expert you are now and create your future.
- 4-7 years of experience in data integration and ETL/ELT design; strong skills in Informatica or similar tools; proficiency in SQL and Python; experience with cloud data platforms; and familiarity with AI-driven data quality solutions.
- Design and optimize data integration workflows, ETL/ELT pipelines, and APIs using Informatica IICS and other iPaaS tools.
- Develop scalable pipelines across cloud platforms (AWS, Azure, GCP) and modern data warehouses (Snowflake, Databricks, BigQuery, Redshift).
- Implement data governance frameworks including data quality, lineage, and cataloging to ensure trusted and compliant data flows.
- Leverage AI/ML techniques for anomaly detection, predictive quality checks, and self-healing pipeline automation.
- Collaborate with cross-functional teams (architects, analysts, business stakeholders) to integrate structured, semi-structured, and unstructured data sources.
- Ensure robust deployment practices by integrating DevOps/CI-CD principles into data integration workflows.
- Document and standardize integration patterns, best practices, and reusable frameworks for enterprise-wide adoption.
Preferences:
- Hands-on experience with Kafka, Spark, Airflow, or event-driven architectures.
- Knowledge of REST APIs, microservices, and real-time data integration.
- Conceptual understanding or hands-on exposure to ML frameworks (Scikit-learn, TensorFlow, PyTorch).
- Experience contributing to AI-augmented/self-healing pipelines.
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
Top Skills
Airflow
AWS
Azure
BigQuery
Databricks
ETL
GCP
IICS
Informatica
Kafka
Python
Redshift
REST APIs
Snowflake
Spark
SQL