
Cargill

Data Engineer, Data Analytics & Reporting - Ag & Trading

Posted 13 Days Ago
In-Office
Bengaluru, Bengaluru Urban, Karnataka
Mid level
The Data Engineer will design and maintain data systems, develop data products, support data pipelines, and collaborate with teams to meet data requirements.
Job Purpose and Impact
  • The Data Engineering professional designs, builds, and maintains moderately complex data systems that enable data analysis and reporting. Working with limited supervision, the role collaborates with others to ensure that large data sets are processed efficiently and made accessible for decision-making.

Key Accountabilities
  • DATA & ANALYTICAL SOLUTIONS: Develops moderately complex data products and solutions using advanced data engineering and cloud based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
  • DATA PIPELINES: Maintains and supports the development of streaming and batch data pipelines that facilitate the seamless ingestion of data from various data sources, transform the data into information and move to data stores like data lake, data warehouse and others.
  • DATA SYSTEMS: Reviews existing data systems and architectures and implements identified improvements and optimizations.
  • DATA INFRASTRUCTURE: Helps prepare data infrastructure to support the efficient storage and retrieval of data.
  • DATA FORMATS: Implements appropriate data formats to improve data usability and accessibility across the organization.
  • STAKEHOLDER MANAGEMENT: Partners with multi-functional data and advanced analytic teams to collect requirements and ensure that data solutions meet the functional and non-functional needs of various partners.
  • DATA FRAMEWORKS: Builds moderately complex prototypes to test new concepts and implements data engineering frameworks and architectures to support the improvement of data processing capabilities and advanced analytics initiatives.
  • AUTOMATED DEPLOYMENT PIPELINES: Implements automated deployment pipelines to improve the efficiency of code deployments with fit-for-purpose governance.
  • DATA MODELING: Performs moderately complex data modeling aligned with the datastore technology to ensure sustainable performance and accessibility.
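The batch-pipeline accountabilities above (ingest from a source, transform into information, move to a store) can be sketched in miniature with plain Python. This is an illustrative skeleton only, not Cargill's actual stack; the field names (`sku`, `qty`, `price`) and the in-memory "store" are hypothetical stand-ins for a real source and data warehouse.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Ingest raw CSV text from a (hypothetical) source into row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and derive a total value per row."""
    out = []
    for r in rows:
        qty = int(r["qty"])
        price = float(r["price"])
        out.append({"sku": r["sku"], "qty": qty, "total": round(qty * price, 2)})
    return out

def load(rows: list[dict]) -> dict[str, float]:
    """'Load' into an in-memory store keyed by SKU (stand-in for a warehouse)."""
    return {r["sku"]: r["total"] for r in rows}

raw = "sku,qty,price\nA1,3,2.50\nB2,1,10.00\n"
store = load(transform(extract(raw)))
print(store)  # {'A1': 7.5, 'B2': 10.0}
```

In a production setting each stage would instead be a Spark job or a task in an orchestrator such as Airflow, with error handling and observability around every step.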

Qualifications
  • Minimum of 3 years of relevant work experience.
  • Big Data Technologies: Hands-on experience with the Hadoop ecosystem (HDFS, Hive, MapReduce) and distributed processing frameworks like Apache Spark (including PySpark and Spark SQL) for large-scale batch and streaming workloads.
  • Programming Expertise: Strong proficiency in Python (data manipulation, orchestration, and automation), Scala (Spark-based development), and advanced SQL (window functions, CTEs, query optimization) for high-volume analytical queries.
  • Data Pipeline Development: Proven ability to design, build, and optimize ETL/ELT pipelines for batch and real-time ingestion using tools/frameworks such as Spark Structured Streaming, Kafka Connect, Airflow/Azure Data Factory, or Glue, with robust error handling, observability, and SLAs.
  • Cloud & Data Warehousing: Hands-on experience with modern cloud data warehouses such as Snowflake, and with lakehouse architectures.
  • Transactional Data Systems: Experience with transaction management (isolation levels, locking, concurrency), backup/restore, replication (logical/physical), and high availability (Patroni, PgBouncer, read replicas).
  • Data Governance & Security: Understanding and implementation of data quality frameworks (DQ checks, Great Expectations/Deequ), metadata management (Glue/Azure Purview), role-based access control and row/column-level security, encryption, and compliance-aligned data handling (PII masking, auditability).
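The "advanced SQL (window functions, CTEs)" requirement can be illustrated with a small, self-contained example. This sketch uses Python's built-in sqlite3 rather than Spark SQL so it runs anywhere; the `trades` table and its columns are invented for illustration, but the CTE-plus-window-function pattern is the same one used for high-volume analytical queries.

```python
import sqlite3

# In-memory database with a toy trades table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (trader TEXT, ts INTEGER, qty INTEGER);
INSERT INTO trades VALUES
  ('amy', 1, 10), ('amy', 2, 5), ('bob', 1, 7), ('bob', 2, 3);
""")

# A CTE feeding a window function: running quantity per trader over time.
rows = conn.execute("""
WITH ordered AS (
  SELECT trader, ts, qty FROM trades
)
SELECT trader, ts,
       SUM(qty) OVER (PARTITION BY trader ORDER BY ts) AS running_qty
FROM ordered
ORDER BY trader, ts
""").fetchall()
print(rows)  # [('amy', 1, 10), ('amy', 2, 15), ('bob', 1, 7), ('bob', 2, 10)]
```

The same `PARTITION BY ... ORDER BY` window clause carries over almost verbatim to Spark SQL or Snowflake.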

Preferred Skills
  • Experience with Apache Kafka or similar platforms for real-time data streaming.
  • Exposure to CI/CD pipelines, containerization (Docker), and orchestration tools (Kubernetes) for data workflows.
  • Understanding of supply chain analytics, commodity trading data flows, and risk management metrics (ideal for agri commodities industry).
  • Ability to collaborate with data scientists on predictive modeling and machine learning pipelines.
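The data-quality checks mentioned under the Data Governance & Security qualification (the kind of expectations a framework like Great Expectations or Deequ encodes) can be sketched as plain assertion functions. This is a minimal illustration, not either library's actual API; the check names and sample rows are hypothetical.

```python
def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows where the column is missing or empty."""
    return [i for i, r in enumerate(rows) if not r.get(column)]

def check_in_range(rows: list[dict], column: str, lo: float, hi: float) -> list[int]:
    """Return indices of rows whose numeric value falls outside [lo, hi]."""
    return [i for i, r in enumerate(rows)
            if not (lo <= float(r[column]) <= hi)]

rows = [
    {"sku": "A1", "qty": "3"},
    {"sku": "",   "qty": "5"},   # fails the not-null check
    {"sku": "B2", "qty": "-1"},  # fails the range check
]
print(check_not_null(rows, "sku"))          # [1]
print(check_in_range(rows, "qty", 0, 100))  # [2]
```

A real DQ framework adds the same idea at scale: declarative expectations, run histories, and alerting when a pipeline's output violates them.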

Top Skills

Hadoop, HDFS, Hive, MapReduce, Apache Spark, PySpark, Spark SQL, Python, Scala, SQL, Spark Structured Streaming, Kafka Connect, Airflow, Azure Data Factory, Glue, Snowflake, Patroni, PgBouncer

Cargill Bengaluru, Karnataka, IND Office

Cargill’s office is part of a major IT hub in Bengaluru, offering top-tier amenities, including a vibrant cafeteria, sports facilities, and digitalized services. Nearby housing and dining options and robust transportation make the location a convenient place to live and work.

Bengaluru Office

Cargill’s Bengaluru India Capability Center (ICC) plays a vital role in advancing the company’s global strategy. As a hub for digital and data innovation, the ICC is a strategic engine powering growth with customers, strengthening core operations, and exploring new and emerging markets. The Bengaluru team exemplifies Cargill’s commitment to people, purpose, and collaboration. The team’s work is central to bringing Cargill’s digital strategy to life and shaping the future of food and agriculture.


