
MillerKnoll

Senior Application Developer – Data Engineering

Posted 12 Days Ago
Bengaluru, Bengaluru Urban, Karnataka
Senior level
The Senior Application Developer - Data Engineering will design, develop and validate data pipelines using cloud technologies. Responsibilities include collaborating with stakeholders, ensuring data quality, supporting data warehousing needs, and troubleshooting technical issues. The role requires knowledge of ETL processes, data modeling, and managing source code for data pipelines.

Why join us?


Our purpose is to design for the good of humankind. It’s the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.

We are looking to hire a Senior Application Developer - Data Engineering with excellent technical and communication skills to design, develop, and validate data pipelines using cloud technologies while collaborating with IT and business stakeholders to understand their needs and deliver functionality and enhancements. In day-to-day operations, the Senior Application Developer - Data Engineering will monitor production data pipelines to ensure timely and successful execution.

This role transcends organizational and geographical boundaries, as it supports and enables the various divisions of the Herman Miller business across the globe. The ideal candidate should understand the software development lifecycle and use agile methodology to design, develop, test, and implement solutions that deliver on end-user needs.

Responsibilities

  • Translate stakeholder requirements into project design deliverables such as functional specifications, use cases, workflow and process diagrams, and data flow and data model diagrams.
  • Design, develop, and support ETL data pipelines used for analytics.
  • Design tables and interfaces, and develop logical and physical data models, databases, and ETL processes that meet business and user requirements.
  • Design, develop, deploy, and validate extremely reliable, scalable, and high-performing data pipelines (a minimal sketch follows this list).
  • Ensure data and code quality through quality tooling and frequent code reviews.
  • Assist in the development of logical and physical data models for designing/developing Business Intelligence & Data Warehouse requirements.
  • Ensure proper source code management of data pipelines, stored procedures, and database definitions.
  • Provide accurate documentation of data pipelines.
  • Distribute tasks among team members and ensure timely and accurate completion as defined by the business.
  • Coordinate prioritized work and help to mitigate any blockers for the team.
  • Communicate known issues and problems efficiently with team members, and escalate to leadership as appropriate.
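
A minimal sketch of the kind of extract-transform-load step this role covers, written in Python. The file path, table name, and use of SQLite as the target are illustrative assumptions only; a production pipeline would use the cloud tooling named later in this posting.

    import csv
    import sqlite3  # stand-in target database for illustration; not the actual warehouse

    def load_orders(csv_path: str, db_path: str) -> int:
        """Extract rows from a CSV file, apply a simple transform, and load them into a staging table."""
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS stg_orders (order_id TEXT PRIMARY KEY, amount REAL, order_date TEXT)"
        )
        loaded = 0
        with open(csv_path, newline="") as f:
            for record in csv.DictReader(f):
                # Transform: cast the amount to a number and skip malformed rows.
                try:
                    amount = float(record["amount"])
                except (KeyError, ValueError):
                    continue
                conn.execute(
                    "INSERT OR REPLACE INTO stg_orders (order_id, amount, order_date) VALUES (?, ?, ?)",
                    (record.get("order_id"), amount, record.get("order_date")),
                )
                loaded += 1
        conn.commit()
        conn.close()
        return loaded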

Essential experience

  • Demonstrated experience using a graphical ELT/ETL pipeline development tool.
  • Experience with data analysis, data modeling, and data warehousing is a must.
  • Ability to troubleshoot technical and functional problems using intuitive problem-solving techniques, with little supervision or direction.
  • Experience optimizing databases, including an understanding of referential integrity and the use of indexes.
  • An extremely sound understanding of columnar databases.
  • Strong SQL knowledge and experience writing queries and DML (see the sketch after this list).
  • Moderate knowledge of cloud-based tools and how to interact with them.
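
Purely as an illustration of the query, DML, and indexing skills listed above, here is a small Python sketch using the standard library's sqlite3 module; the table and column names are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
        CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
        -- An index on the join column supports the referential-integrity and indexing points above.
        CREATE INDEX idx_orders_customer ON orders (customer_id);
    """)

    # A query joining the two tables and aggregating per region.
    totals = conn.execute("""
        SELECT c.region, SUM(o.amount) AS total
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
        GROUP BY c.region
        ORDER BY total DESC
    """).fetchall()

    # A parameterized DML statement removing invalid rows.
    conn.execute("DELETE FROM orders WHERE amount <= ?", (0,))
    conn.commit()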

Ideal candidate

  • A graduate or post-graduate degree in computer science, technology, engineering, or mathematics.
  • Excellent interpersonal and communication skills in English, both written and verbal.
  • 7+ years of strong hands-on data engineering experience using ELT/ETL tools to meet business requirements for analytics and reporting.
  • Strong experience with parameterization and process optimization.
  • Experience using Matillion, AWS, Python, and Snowflake to develop robust data pipelines (a hedged sketch follows this list).
  • Ability to work individually or within a team environment.
  • Ability to manage multiple projects, tasks, and priorities in a healthy work environment.
  • Has attention to detail and a penchant for quality.
  • Proficient in documentation as well as process and workflow design.
  • Command of version control methodologies, preferably Git/GitHub.
  • Ability to take direction, constructive criticism, and work to specified deadlines.
  • Adherence to the processes and procedures defined for the role, the team, and the organization.
  • This role requires working an extended UK shift (2 PM – 11 PM India time); during critical issues, releases, updates, migrations, or other urgent business needs, there may be a requirement to work US business hours (5:30 PM to 2:30 AM).
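
To make the tooling bullet above concrete, here is a hedged Python sketch of loading staged files into Snowflake with the snowflake-connector-python package; the account, credentials, warehouse, stage, and table names are placeholders, and in practice a Matillion job would typically orchestrate a step like this against data staged on AWS.

    import snowflake.connector  # requires the snowflake-connector-python package

    # Placeholder connection details; a real pipeline would read these from a secrets manager.
    conn = snowflake.connector.connect(
        account="example_account",
        user="pipeline_user",
        password="********",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    try:
        cur = conn.cursor()
        # COPY INTO loads files from a named stage (for example, one backed by S3 on AWS).
        cur.execute("""
            COPY INTO STG_ORDERS
            FROM @ORDERS_STAGE
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        # A quick row-count check, in the spirit of validating the pipeline run.
        cur.execute("SELECT COUNT(*) FROM STG_ORDERS")
        print("Rows in STG_ORDERS:", cur.fetchone()[0])
    finally:
        conn.close()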

Who We Hire?


Simply put, we hire everyone. MillerKnoll comprises people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We’re committed to equal opportunity employment, including veterans and people with disabilities.

MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at [email protected].

Top Skills

SQL


