
Fluxon

Data Engineer

Remote
Hiring Remotely in India
Mid level
Who we are

We are Fluxon, a product development team founded by ex-Googlers and startup founders. We offer full-cycle software development: from ideation and design to build and go-to-market. We partner with visionary companies, ranging from fast-growing startups to tech leaders like Google and Stripe, to turn bold ideas into products with the power to transform the world.

This is a remote position, with a preference for candidates located in Hyderabad, Bangalore, or Gurgaon, India.

About the role

As the first Data Engineer at Fluxon, you’ll take the lead in designing, building, and maintaining the data infrastructure that powers our products and enables data-driven decision-making for our clients.

You'll be responsible for:

  • Designing and implementing data models and warehouse schemas to support analytics and reporting needs
  • Building and maintaining reliable data pipelines to ingest, transform, and load data from various sources (a brief illustrative sketch follows the technology list below)
  • Collaborating with product and engineering teams to understand data requirements and deliver scalable solutions
  • Ensuring data quality, integrity, and accessibility across the organization
  • Optimizing query performance and improving the efficiency of existing data infrastructure
  • Maintaining comprehensive documentation of data models, pipelines, and processes for team reference

You'll work with technologies including:

Data & Analytics

  • Data Warehouse: Google BigQuery, Snowflake, AWS Redshift, Databricks
  • ETL/Pipeline Tools: Apache Spark, Apache Airflow, dbt
  • Streaming & Queuing: Apache Kafka, Pub/Sub, RabbitMQ

Languages

  • SQL
  • Python (good to have)

Cloud & Infrastructure

  • Platforms: Google Cloud Platform (GCP) or Amazon Web Services (AWS)
  • Storage: Google Cloud Storage (GCS) or AWS S3
  • Orchestration & Processing: Cloud Composer (Airflow), Dataflow, Dataproc

Data Stores

  • Relational: PostgreSQL

Monitoring & Observability

  • GCP Cloud Monitoring Suite
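
To make the stack above more concrete, here is a minimal, purely illustrative sketch of the kind of pipeline this role involves: an Airflow DAG that loads raw files from GCS into BigQuery and then runs a SQL transform. It assumes Airflow 2.4+ with the Google provider installed; the bucket, dataset, table, and column names are invented for this example and are not Fluxon's actual infrastructure.

```python
# Illustrative sketch only (not Fluxon's actual pipelines).
# Assumes Airflow 2.4+ and the apache-airflow-providers-google package.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="example_orders_pipeline",   # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Ingest: load newline-delimited JSON exports from GCS into a raw BigQuery table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-raw-bucket",                            # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.raw_orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: aggregate the raw table into a daily reporting table with SQL.
    build_report = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_orders AS
                    SELECT order_date, customer_id, SUM(amount) AS total_amount
                    FROM analytics.raw_orders
                    GROUP BY order_date, customer_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_report
```

In practice the transform step might live in dbt rather than inline SQL; the sketch is only meant to show the ingest-then-transform shape of the pipelines this role builds.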

Qualifications
  • 3–5 years of industry experience in data engineering roles
  • Strong proficiency in SQL and experience with data warehousing concepts (dimensional modeling, star/snowflake schemas)
  • Experience building and maintaining ETL/ELT pipelines
  • Familiarity with cloud data platforms, preferably GCP and BigQuery
  • Understanding of data modeling best practices and data quality principles
  • Solid understanding of software development practices including version control (Git) and CI/CD

Nice to have:
  • Experience with Python for data processing and automation (a brief sketch follows this list)
  • Experience with Apache Spark or similar distributed processing frameworks
  • Familiarity with workflow orchestration tools (Airflow, Prefect)
  • Exposure to dbt or similar transformation tools
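
Because Python for data processing and automation is called out above, here is one more small, hypothetical sketch: the kind of automated data-quality check that the "ensure data quality" responsibility might involve, written with the google-cloud-bigquery client. It assumes default Google Cloud credentials; the table and column names are invented.

```python
# Hypothetical data-quality check (illustrative only).
# Assumes the google-cloud-bigquery package and default GCP credentials.
from google.cloud import bigquery


def assert_no_null_keys(table: str, key_column: str) -> None:
    """Raise an error if any row in `table` has a NULL value in `key_column`."""
    client = bigquery.Client()
    sql = f"SELECT COUNT(*) AS null_rows FROM `{table}` WHERE {key_column} IS NULL"
    row = next(iter(client.query(sql).result()))
    if row.null_rows > 0:
        raise ValueError(f"{table}: {row.null_rows} rows have NULL {key_column}")


if __name__ == "__main__":
    # Hypothetical table from the pipeline sketch above.
    assert_no_null_keys("analytics.raw_orders", "customer_id")
```

A check like this would typically run as a task in an orchestrated pipeline (or as a dbt test) rather than as a standalone script.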

What we offer
  • Exposure to high-profile Silicon Valley startups and enterprise companies
  • Competitive salary
  • Fully remote work with flexible hours
  • Flexible paid time off
  • Profit-sharing program
  • Healthcare
  • Parental leave that supports all paths to parenthood, including fostering and adopting
  • Gym membership and tuition reimbursement
  • Hands-on career development
