
Western Digital

Principal Engineer, Data Analytics Engineering

Bengaluru, Karnataka
Expert/Leader
The Principal Engineer, Data Analytics Engineering at Western Digital will focus on designing and implementing robust data pipelines for large-scale data processing. Responsibilities include managing databases, developing applications using Python and Spark, and integrating real-time monitoring tools. The role also emphasizes server administration and optimizing data workflows for various operational requirements across the organization.

Company Description

At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible. At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we've been doing just that. Our technology helped people put a man on the moon.

We are a key partner to some of the largest and highest-growth organizations in the world. From energizing the most competitive gaming platforms, to enabling systems that make cities safer and cars smarter and more connected, to powering the data centers behind many of the world's biggest companies and the public cloud, Western Digital is fueling a brighter, smarter future.

Binge-watch any shows, use social media, or shop online lately? You'll find Western Digital supporting the storage infrastructure behind many of these platforms. And that flash memory card that captures and preserves your most precious moments? That's us, too.

We offer an expansive portfolio of technologies, storage devices, and platforms for businesses and consumers alike. Our data-centric solutions comprise the Western Digital, G-Technology, SanDisk, and WD brands.

Today's exceptional challenges require your unique skills. It's You & Western Digital. Together, we're the next BIG thing in data.

About Advanced Analytics Office (AAO) and Big Data Platform (BDP)

The AAO's mission is to accelerate analytics solutions at scale across the enterprise to rapidly capture business value. These solutions target key business metrics such as reducing manufacturing cost, improving capital efficiency, reducing time-to-market for new products, improving operational efficiency, and improving customer experience. They are built using cutting-edge Industry 4.0 technologies and are delivered through a platform approach to enable rapid scaling. The solutions span AI/ML for improving manufacturing yield, quality, equipment uptime, and adaptive testing; Operations Research for capacity and scheduling optimization; Digital Twin for inventory and logistics optimization; and Product Telematics for customer fleet management.

The Big Data Platform (BDP) team provides self-service data and application platforms that enable rapid scaling of services for ever-increasing business impact. You will have the opportunity to partner in making remarkable things happen across Western Digital's more than a dozen factories around the globe, global product development teams, customer solutions, and supporting operations such as Finance, Supply Chain, Procurement, and Sales.

Job Description

We are seeking a passionate candidate dedicated to building robust data pipelines and handling large-scale data processing. The ideal candidate will thrive in a dynamic environment, demonstrate a commitment to optimizing and maintaining efficient data workflows, and have hands-on experience with Python, MariaDB, SQL, Linux, Docker, Airflow administration, and CI/CD pipeline creation and maintenance. The application is built using Python Dash, and the role will involve application deployment, server administration, and keeping the application running smoothly and up to date.
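
As a rough illustration of the application side of the role, here is a minimal Python Dash entry point of the kind such a deployment might serve; the layout, component ids, and the metric shown are assumptions for illustration, not details from this posting (Dash 2.x assumed).

```python
# Minimal Dash sketch, assuming a single-page app deployed behind a WSGI server.
# Component ids, dropdown options, and the displayed metric are hypothetical.
from dash import Dash, dcc, html, Input, Output

app = Dash(__name__)
server = app.server  # WSGI entry point typically used during deployment (e.g. gunicorn)

app.layout = html.Div([
    html.H1("Pipeline Health"),
    dcc.Dropdown(id="site", options=["Factory A", "Factory B"], value="Factory A"),
    html.Div(id="summary"),
])

@app.callback(Output("summary", "children"), Input("site", "value"))
def update_summary(site):
    # A real application would query MariaDB/Postgres here for pipeline metrics.
    return f"Showing pipeline status for {site}"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8050, debug=False)
```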

Key Responsibilities:

  • A minimum of 9 years of experience developing data pipelines using Spark.
  • Ability to design, develop, and optimize Apache Spark applications for large-scale data processing.
  • Ability to implement efficient data transformation and manipulation logic using Spark RDDs and DataFrames (see the PySpark sketch after this list).
  • Manage server administration tasks, including monitoring, troubleshooting, and optimizing performance. Administer and manage databases (MariaDB) to ensure data integrity and availability.
  • Ability to design, implement, and maintain Apache Kafka pipelines for real-time data streaming and event-driven architectures.
  • Deep development and technical skill in Python, PySpark, Scala, and SQL/stored procedures.
  • Working knowledge of Unix/Linux operating systems and tools such as awk, ssh, and crontab.
  • Ability to write Transact-SQL and to develop and debug stored procedures and user-defined functions in Python.
  • Working experience with Postgres and/or Redshift/Snowflake databases is required.
  • Exposure to CI/CD tools such as Bitbucket, Jenkins, Ansible, Docker, and Kubernetes is preferred.
  • Solid understanding of relational database systems and their concepts.
  • Ability to handle large tables/datasets of 2+ TB in a columnar database environment.
  • Ability to integrate data pipelines with Splunk/Grafana for real-time monitoring and analysis, and with Power BI for visualization.
  • Ability to create and schedule Airflow jobs (see the Airflow sketch after this list).
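
The sketches below illustrate two of the responsibilities above; dataset paths, column names, DAG ids, and schedules are assumptions, not details from this posting.

A minimal PySpark DataFrame transformation of the kind the Spark bullets describe:

```python
# PySpark sketch: read a large dataset, aggregate it, and write it back out.
# Paths, schema, and the aggregation itself are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-yield-aggregation").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/factory/test_results/")

daily_yield = (
    raw.filter(F.col("tested_at").isNotNull())
       .withColumn("day", F.to_date("tested_at"))
       .groupBy("site", "day")
       .agg(F.avg(F.col("passed").cast("double")).alias("yield_rate"))
)

# Partitioning by day keeps rewrites incremental for multi-TB tables.
daily_yield.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/analytics/daily_yield/"
)
```

And a minimal Airflow DAG (Airflow 2.x assumed) that schedules such a job daily:

```python
# Airflow sketch: run the Spark job once a day via spark-submit.
# The DAG id, start date, and bash command are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_yield_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
    catchup=False,
) as dag:
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit --master yarn daily_yield_aggregation.py",
    )
```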

Qualifications

  • Minimum of a bachelor’s degree in computer science or engineering. Master’s degree preferred.
  • An AWS Developer certification is preferred.
  • Any certification in SDLC (Software Development Life Cycle) methodology, integrated source control systems, or continuous development and continuous integration is preferred.

Additional Information

Because Western Digital thrives on the power of diversity and is committed to an inclusive environment where every individual can thrive through a sense of belonging, respect, and contribution, we are committed to giving every qualified applicant and employee an equal opportunity. Western Digital does not discriminate against any applicant or employee based on their protected class status and complies with all federal and state laws against discrimination, harassment, and retaliation, as well as the laws and regulations set forth in the "Equal Employment Opportunity is the Law" poster.

Top Skills

PySpark
Python
Scala
SQL

Western Digital Bengaluru, Karnataka, IND Office

WMVV+J5H, Kadubeesanahalli, Bengaluru, Karnataka, India, 560103
