Emergent builds autonomous coding agents that replace traditional software development by generating, testing, and deploying production applications directly from plain-language intent. Our systems run in production at global scale and are used to build millions of real applications.
Since public launch, Emergent has reached $50M ARR in 7 months. 5M+ users across 190+ countries have built 6M+ applications on Emergent. We’ve raised $100M, backed by Khosla Ventures, SoftBank, Google, Lightspeed, Prosus, Together, and Y Combinator.
We’re solving the hard part of AI-driven software creation: correctness, reliability, security, and scale in real production systems. The team is built by repeat founders, Olympiad medalists, IIT & IIM alumni, and leaders from Google, Amazon, and Dropbox.
We’re hiring builders who want ownership, speed, and impact at global scale.
What You’ll Do
- Design, build, and maintain robust, scalable data pipelines and infrastructure.
- Develop ETL/ELT processes to collect, process, and curate structured and unstructured data from various sources.
- Collaborate with data scientists, analysts, and product teams to understand data needs and ensure data reliability and accessibility.
- Optimize data systems for performance, cost-efficiency, and scalability.
- Ensure data quality, security, and compliance best practices are followed.
- Implement and maintain data models, data lakes, warehouses (e.g., Redshift, BigQuery, Snowflake), and real-time streaming platforms (e.g., Kafka, Spark).
- Document systems and contribute to internal data engineering standards and playbooks.
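To give a flavor of the pipeline work above, here is a toy extract-transform-load sketch in plain Python with SQLite standing in for a warehouse. All names and data are hypothetical; a production pipeline would pull from real sources and be scheduled by an orchestrator such as Airflow, Dagster, or Prefect.

```python
import sqlite3

def extract():
    # Hypothetical raw records; in practice these would come from an API,
    # object store, or source database.
    return [
        {"user_id": 1, "country": "IN", "apps_built": 3},
        {"user_id": 2, "country": "US", "apps_built": 0},
        {"user_id": 3, "country": "in", "apps_built": 5},
    ]

def transform(rows):
    # Normalize country codes and drop inactive users.
    return [
        {**r, "country": r["country"].upper()}
        for r in rows
        if r["apps_built"] > 0
    ]

def load(rows, conn):
    # Load curated rows into a warehouse-style table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_activity "
        "(user_id INTEGER, country TEXT, apps_built INTEGER)"
    )
    conn.executemany(
        "INSERT INTO user_activity VALUES (:user_id, :country, :apps_built)",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT COUNT(*) FROM user_activity").fetchone()[0])
```

In a real deployment, each of these three functions would typically become a separate, retryable task in the orchestrator's DAG rather than a single script.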
What We’re Looking For
- 3+ years of experience in a data engineering or similar backend role.
- Proficiency in Python, SQL, and one or more data orchestration tools (e.g., Airflow, Dagster, Prefect).
- Hands-on experience with cloud data platforms (AWS, GCP, or Azure).
- Familiarity with data modeling, warehousing concepts, and distributed systems.
- Strong understanding of data governance, privacy, and security principles.
- Experience with tools like dbt, Kafka, Spark, or similar technologies is a plus.
- Bonus: Background in ML pipelines, MLOps, or DevOps practices.
Why Join Us?
This isn't a traditional data engineering position. You'll be pioneering data infrastructure in the emerging field of agentic AI, where your work directly shapes how we understand and optimize AI agent performance. You'll have the opportunity to:
- Be among the first engineers to define the metrics and data frameworks for agentic AI systems
- Work with cutting-edge technology while solving novel analytical challenges
- Have direct impact on product decisions with your insights driving feature development
- Shape the future of how businesses measure and optimize AI agent effectiveness
Let’s build the future of software together.
Top Skills
Airflow
AWS
Azure
BigQuery
Dagster
dbt
GCP
Kafka
Prefect
Python
Redshift
Snowflake
Spark
SQL