
Delta Exchange

Senior Data Engineer

Sorry, this job was removed at 02:11 p.m. (IST) on Monday, Mar 23, 2026
Remote
Hiring Remotely in India


About the Company

At Delta, we are reimagining and rebuilding the financial system. Join our team to make a positive impact on the future of finance.

🎯 Mission Driven: Reimagine and rebuild the future of finance.

💡 The most innovative cryptocurrency derivatives exchange, with a daily traded volume of ~$3.5 billion and growing. Delta is bigger than all Indian crypto exchanges combined.

📈 We offer the widest range of derivative products and have been serving traders across the globe since 2018, growing fast.

💪🏻 The founding team comprises IIT and ISB graduates. The business co-founders previously worked at Citibank, UBS, and GIC; our tech co-founder is a serial entrepreneur who co-founded TinyOwl and Housing.com.

💰 Funded by top crypto funds (Sino Global Capital, CoinFund, Gumi Cryptos) and crypto projects (Aave and Kyber Network).

Role Summary:

Support our analytics team by owning the full ETL lifecycle—from master data to analytics-ready datasets. You will build and maintain daily batch pipelines that process 1–10 million master-data rows per run (and scale up to tens or hundreds of millions of rows), all within sub-hourly SLAs. Extract from OLTP and time-series sources, apply SQL/stored-procedure logic or Python transformations, then load into partitioned, indexed analytics tables. Reads run exclusively on read-only replicas to guarantee zero impact on the production master DB. You’ll also implement monitoring, alerting, retries, and robust error handling to ensure near-real-time dashboard refreshes.
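The extract-transform-load cycle described above could be sketched, in minimal form, as below. This is an illustrative sketch, not Delta's actual pipeline: the table names (`trades`, `daily_volume`) and columns are hypothetical, and sqlite3 stands in for the Postgres/MySQL replica and analytics store named in the posting.

```python
import sqlite3


def run_batch(replica_conn, analytics_conn, batch_date):
    """One daily ETL run: extract from a read-only replica connection,
    transform in Python, load idempotently into an analytics table.
    All table and column names here are illustrative."""
    # Extract: reads hit the replica only, never the production master.
    rows = replica_conn.execute(
        "SELECT user_id, amount FROM trades WHERE trade_date = ?",
        (batch_date,),
    ).fetchall()

    # Transform: aggregate per user (a stand-in for heavier analytics logic).
    totals = {}
    for user_id, amount in rows:
        totals[user_id] = totals.get(user_id, 0.0) + amount

    # Load: upsert keyed on (trade_date, user_id) so reruns are idempotent.
    analytics_conn.executemany(
        "INSERT OR REPLACE INTO daily_volume (trade_date, user_id, total) "
        "VALUES (?, ?, ?)",
        [(batch_date, u, t) for u, t in totals.items()],
    )
    analytics_conn.commit()
    return len(totals)
```

The idempotent upsert matters at this scale: a retried run after a partial failure overwrites rather than duplicates, which keeps dashboard numbers correct without manual cleanup.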


Requirements

Required Skills & Experience: 

* 4+ years in data engineering or analytics roles, building daily batch ETL pipelines at 1–10 M rows/run scale (and up to 100 M+).

* Expert SQL skills, including stored procedures and query optimisation on PostgreSQL, MySQL, or similar RDBMS.

* Proficient in Python for data transformation (pandas, NumPy, SQLAlchemy, psycopg2).

* Hands-on with CDC/incremental load patterns and batch schedulers (Airflow, cron).

* Deep understanding of replicas, partitioning, and indexing strategies.

* Strong computer-science fundamentals and deep knowledge of database internals—including storage engines, indexing mechanisms, query execution plans and optimisers for MySQL and time-series DBs like TimescaleDB.

* Experience setting up monitoring and alerting (Prometheus, Grafana, etc.).
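The CDC/incremental-load pattern mentioned in the requirements is commonly implemented with a high-water mark: each run pulls only rows changed since the last successful run. A minimal sketch follows; the `orders` table, `updated_at` column, and sqlite3 backend are illustrative assumptions, not part of the posting.

```python
import sqlite3


def incremental_extract(conn, watermark):
    """Watermark-based incremental load: fetch only rows whose
    updated_at is newer than the last run's high-water mark,
    and return the new watermark for the next run.
    Table and column names are illustrative."""
    rows = conn.execute(
        "SELECT id, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Advance the watermark only if anything new was seen.
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark
```

In production the watermark would be persisted (e.g. in a scheduler variable or a state table) so that a restarted job resumes from the last committed position instead of re-reading millions of rows.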

Key Responsibilities:

1. Nightly Batch Jobs: Schedule and execute ETL runs.

2. In-Database Transformations: Write optimised SQL and stored procedures.

3. Python Orchestration: Develop Python scripts for more complex analytics transformations.

4. Data Loading & Modelling: Load cleansed data into partitioned, indexed analytics schemas designed for fast querying.

5. Performance SLAs: Deliver end-to-end sub-hourly runtimes.

6. Monitoring & Resilience: Implement pipeline health checks, metrics, alerting, automatic retries, and robust error handling.

7. Stakeholder Collaboration: Work closely with analysts to validate data quality and ensure timely delivery of analytics-ready datasets.
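The resilience responsibilities above (automatic retries, alerting) often reduce to a wrapper like the following sketch. The `alert` hook is a hypothetical placeholder for a real integration (PagerDuty, Slack, a Prometheus counter); nothing here is specified by the posting.

```python
import time


def run_with_retries(job, max_attempts=3, backoff_s=0.0, alert=print):
    """Run a batch job with retries, exponential backoff, and an alert
    hook. `alert` is a stand-in for a real pager/metrics integration."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except Exception as exc:
            # Fire an alert on every failure so on-call sees flapping jobs.
            alert(f"attempt {attempt}/{max_attempts} failed: {exc}")
            if attempt == max_attempts:
                raise  # exhausted: surface the failure to the scheduler
            time.sleep(backoff_s * 2 ** (attempt - 1))
```

A scheduler such as Airflow provides retries natively, but an in-process wrapper like this lets a pipeline distinguish transient faults (replica lag, lock timeouts) from hard failures before burning a whole task retry.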

