Similar Jobs
Consumer Web • Digital Media • News + Entertainment
As a Software Development Engineer II, you'll build and maintain data infrastructure, develop data pipelines for AI initiatives, and optimize data processing systems.
Top Skills:
Apache Airflow, Apache Flink, Apache Kafka, Spark, AWS, Azure, BigQuery, Cassandra, CloudFormation, Delta Lake, DynamoDB, GCP, Kinesis, MySQL, Postgres, Redshift, Snowflake, Terraform
Aerospace • Information Technology • Cybersecurity • Defense • Manufacturing
The Senior Architect - BMS will design and implement software for battery management systems, lead cross-functional teams, manage design patterns, mentor junior engineers, and ensure compliance with industry standards.
Top Skills:
Azure DevOps, Battery Management Systems, Cloud, DevOps, Embedded C, Matlab, Real-Time Operating Systems, Simulink
Aerospace • Information Technology • Cybersecurity • Defense • Manufacturing
Lead the development and verification of control laws for aerospace systems using Model-Based Development, collaborating with cross-functional teams and ensuring compliance with certifications.
Top Skills:
C, Matlab, RTOS, Simulink, Simulink Check, Simulink Test, Stateflow
At SAFE Security, our mission is bold and ambitious: We Will Build CyberAGI — a super-specialized system of intelligence that autonomously predicts, detects, and remediates threats. This isn’t just a vision—it’s the future we’re building every day, with the best minds in AI, cybersecurity, and risk. At SAFE, we empower individuals and teams with the freedom and responsibility to align their goals, ensuring we all move towards this goal together.
We operate with radical transparency, autonomy, and accountability—there’s no room for brilliant jerks. We embrace a culture-first approach, offering an unlimited vacation policy, a high-trust work environment, and a commitment to continuous learning. For us, Culture is Our Strategy—check out our Culture Memo to dive deeper into what makes SAFE unique.
We’re looking for a seasoned Software Development Engineer (SDE II) – Data Engineering with deep expertise in building large-scale data ingestion, processing, and analytics platforms. In this role, you’ll collaborate closely with Product, Design, and cross-functional stakeholders to design and implement high-throughput data pipelines and lakehouse solutions that power analytics, AI, and real-time decision-making for predicting and preventing cyber breaches.
What You’ll Do:
- Design and Develop: Architect and implement high-scale data pipelines leveraging Apache Spark, Flink, and Airflow to process streaming and batch data efficiently.
- Data Lakehouse and Storage Optimization: Build and maintain data lakes and ingestion frameworks using Snowflake, Apache Iceberg, and Parquet, ensuring scalability, cost efficiency, and optimal query performance.
- Data Modeling and System Design: Design robust, maintainable data models to handle structured and semi-structured datasets for analytical and operational use cases.
- Real-time and Batch Processing: Develop low-latency pipelines using Kafka and Spark Structured Streaming, supporting billions of events per day (a streaming sketch follows this list).
- Workflow Orchestration: Automate and orchestrate end-to-end ELT processes with Airflow, ensuring reliability, observability, and recovery from failures (an orchestration sketch also follows this list).
- Cloud Infrastructure: Build scalable, secure, and cost-effective data solutions leveraging AWS native services (S3, Lambda, ECS, etc.).
- Monitoring and Optimization: Implement strong observability, data quality checks, and performance tuning to maintain high data reliability and pipeline efficiency.
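To make the streaming bullets concrete, here is a minimal sketch of a Kafka-to-lakehouse pipeline of the kind described above, using PySpark Structured Streaming with an Iceberg sink. The broker address, topic, event schema, and table names are illustrative assumptions, not details of SAFE's actual pipelines.

```python
# A minimal sketch, assuming a Spark session with the Kafka and Iceberg
# connectors on the classpath; broker, topic, schema, and table names
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Example event schema; a real pipeline would load this from a schema registry.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("tenant_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

# Read the raw byte stream from Kafka and parse the JSON value column.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "security-events")            # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append micro-batches into an Iceberg table; the checkpoint records
# consumed offsets so the stream can recover cleanly after a restart.
query = (
    events.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3://bucket/checkpoints/security-events")
    .toTable("lakehouse.raw.security_events")
)
query.awaitTermination()
```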
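Similarly, the orchestration bullet could translate into an Airflow DAG along these lines, with retries and a failure-alert hook standing in for the reliability and recovery requirements. The task bodies, DAG id, schedule, and alerting settings are placeholders, sketched against Airflow 2.x.

```python
# A minimal ELT DAG sketch for Airflow 2.x; callables and names are
# illustrative assumptions, not a real workflow.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    ...  # pull a batch from the source system (placeholder)

def load(**context):
    ...  # stage the batch into the warehouse (placeholder)

def transform(**context):
    ...  # run SQL transforms on the staged data (placeholder)

default_args = {
    "retries": 3,                        # automatic recovery from transient failures
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,            # hook for failure alerting (needs SMTP config)
}

with DAG(
    dag_id="elt_security_events",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",                  # `schedule` is the Airflow 2.4+ spelling
    catchup=False,
    default_args=default_args,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    load_t = PythonOperator(task_id="load", python_callable=load)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)

    extract_t >> load_t >> transform_t   # linear extract -> load -> transform
```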
What We’re Looking For:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering with a proven track record of designing large-scale, distributed data systems.
- Strong expertise in Snowflake and other distributed analytical data stores.
- Hands-on experience with Apache Spark, Flink, Airflow, and modern data lakehouse formats (Iceberg, Parquet).
- Deep understanding of data modeling, schema design, query optimization, and partitioning strategies at scale.
- Proficiency in Python, SQL, Scala, and Go/Node.js, with strong debugging and performance-tuning skills.
- Experience in streaming architectures, CDC pipelines, and data observability frameworks (a minimal data-quality sketch follows this list).
- Proficient in deploying containerized applications (Docker, Kubernetes, ECS).
- Familiarity with AI coding assistants such as Cursor, Claude Code, or GitHub Copilot.
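As one illustration of what the data-quality and observability expectations can mean in practice, a post-load assertion step might look like the following PySpark sketch; the table name, column, and 0.1% null-rate threshold are assumptions for illustration, not SAFE's standards.

```python
# A minimal post-load data-quality check, assuming a Spark session and an
# already-written table; names and thresholds are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()
df = spark.table("lakehouse.raw.security_events")  # placeholder table

total = df.count()
null_ids = df.filter(col("event_id").isNull()).count()

# Fail the pipeline loudly rather than propagating bad data downstream.
assert total > 0, "empty load: no rows ingested"
assert null_ids / total < 0.001, f"null-rate too high: {null_ids}/{total}"
```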
Preferred Qualifications:
- Exposure to CI/CD pipelines, automated testing, and infrastructure-as-code for data workflows.
- Familiarity with streaming platforms (Kafka, Kinesis, Pulsar) and real-time analytics engines (Druid, Pinot, Rockset).
- Understanding of data governance, lineage tracking, and compliance requirements in a multi-tenant SaaS platform.
If you’re passionate about cyber risk, thrive in a fast-paced environment, and want to be part of a team that’s redefining security—we want to hear from you! 🚀
What you need to know about the Bengaluru Tech Scene
Dubbed the "Silicon Valley of India," Bengaluru has emerged as the nation's leading hub for information technology and a go-to destination for startups. Home to tech giants like ISRO, Infosys, Wipro, and HAL, the city attracts and cultivates a rich pool of tech talent, supported by numerous educational and research institutions including the Indian Institute of Science, Bangalore Institute of Technology, and the International Institute of Information Technology.


