
Roku

Senior Software Engineer, Data, Ad Analytics

Posted Yesterday
Bengaluru, Karnataka
Senior level
The Senior Software Engineer will develop and maintain APIs using Java Spring Boot and handle big data technologies like Apache Spark and Airflow. Responsibilities include designing data pipelines, optimizing data systems, and collaborating with cross-functional teams for integrated solutions and mentorship.
The summary above was generated by AI
Teamwork makes the stream work. Roku is changing how the world watches TV.

Roku is the #1 TV streaming platform in the US and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers.

From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.


About the Team

You will join a dynamic team that specializes in processing and enriching large datasets from diverse sources, with a focus on Advertising & Analytics. The team is responsible for developing systems that process data and generate comprehensive reports, actionable insights, and planning tools to support business needs such as enhancing advertising campaign and reach metrics. The team is also responsible for building a generic Ad Analytics API framework used by measurement tools built in-house as well as by external third-party vendors.


About the Role 

We are looking for a highly skilled Senior Software Engineer who is proficient in API development with Java Spring Boot and has deep expertise in big data technologies, specifically Apache Spark, Apache Airflow, Apache Druid and Trino. This hybrid role requires a strong foundation in both software engineering and data engineering, enabling you to design, build, and maintain scalable and efficient systems for both application development and data processing. You will work closely with cross-functional teams to develop robust APIs and manage large-scale data systems.
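
To make the hybrid shape of the role concrete, here is a minimal, hypothetical sketch of the kind of reporting endpoint such a team might expose with Java Spring Boot. The URL shape, class names (AdAnalyticsController, ReachService, CampaignReachReport), and fields are illustrative assumptions, not Roku's actual API.

// Hypothetical sketch only: a read-only reporting endpoint in Java Spring Boot.
// Names and URL shape are illustrative assumptions, not Roku's actual design.
package com.example.adanalytics;

import java.time.LocalDate;
import java.util.List;

import org.springframework.format.annotation.DateTimeFormat;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/campaigns")
public class AdAnalyticsController {

    private final ReachService reachService;

    public AdAnalyticsController(ReachService reachService) {
        this.reachService = reachService;
    }

    // GET /api/v1/campaigns/{id}/reach?from=2024-01-01&to=2024-01-31
    @GetMapping("/{id}/reach")
    public List<CampaignReachReport> reach(
            @PathVariable("id") long campaignId,
            @RequestParam @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate from,
            @RequestParam @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate to) {
        // The controller stays thin; heavy aggregation happens in the data
        // platform (e.g. a query against Druid or Trino) behind this boundary.
        return reachService.dailyReach(campaignId, from, to);
    }
}

// Value object returned to in-house measurement tools and third-party vendors.
record CampaignReachReport(LocalDate date, long campaignId, long uniqueViewers) {}

// Service boundary; the implementation is intentionally out of scope here.
interface ReachService {
    List<CampaignReachReport> dailyReach(long campaignId, LocalDate from, LocalDate to);
}

Keeping the API layer thin and pushing aggregation into the data platform is one common way to meet the scalability and performance expectations described above.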


What you'll be doing 

  • Big Data Engineering:
    • Design, build, and maintain data pipelines and ETL processes utilizing Apache Spark, Apache Airflow, Hive, Apache Druid, Trino, and StarRocks (a simplified sketch of such a batch job follows this list).
    • Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance.
    • Develop and optimize complex queries and data processing jobs for large datasets.
    • Monitor, troubleshoot, and enhance data systems to ensure minimal downtime and high efficiency.
  • API Development:
    • Design, develop, and maintain high-quality APIs using Java Spring Boot.
    • Ensure the APIs are scalable, secure, and performant, following best practices in API design and microservices architecture.
    • Write clean, maintainable, and efficient code, and participate in code reviews to ensure the health of the codebase.
  • Collaboration & Mentorship:
    • Work closely with data scientists, software engineers, and other teams to deliver integrated solutions.
    • Collaborate with product managers, front-end developers, and other stakeholders to gather requirements and translate them into technical solutions.
    • Provide technical guidance and mentorship to junior engineers on both API development and data engineering best practices.
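
As referenced in the Big Data Engineering bullet above, below is a simplified, hypothetical sketch of a daily Spark batch job (Java API) that a scheduler such as Airflow could trigger. The bucket paths, column names, and output layout are assumptions for illustration only.

// Hypothetical sketch: a daily Spark batch job that aggregates raw ad
// impressions into per-campaign reach. Paths and schema are assumptions.
package com.example.adanalytics.jobs;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.countDistinct;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DailyReachJob {
    public static void main(String[] args) {
        String runDate = args[0];  // e.g. "2024-01-31", supplied by the scheduler

        SparkSession spark = SparkSession.builder()
                .appName("daily-campaign-reach")
                .getOrCreate();

        // Raw impression events, partitioned by date (assumed layout).
        Dataset<Row> impressions = spark.read()
                .parquet("s3://example-bucket/impressions/date=" + runDate);

        // Reach = distinct devices that saw each campaign on that day.
        Dataset<Row> dailyReach = impressions
                .groupBy(col("campaign_id"))
                .agg(countDistinct(col("device_id")).alias("unique_viewers"));

        // Write a date-partitioned output that downstream ingestion
        // (e.g. into Druid or StarRocks) can pick up.
        dailyReach.write()
                .mode("overwrite")
                .parquet("s3://example-bucket/reports/daily_reach/date=" + runDate);

        spark.stop();
    }
}

A job along these lines would typically be wrapped in an Airflow task, with its output loaded into a low-latency store such as Druid, Trino, or StarRocks for querying by the API layer.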


We're excited if you have

  • Extensive software development and/or data engineering experience, with a focus on API development using Java, Scala, Spring Boot, and big data technologies. 
  • Strong understanding of RESTful API design principles and microservices architecture. 
  • Expertise with big data technologies such as Hadoop, Spark, Apache Airflow, Kafka, Hive, Apache Druid, and Presto/Trino, with hands-on experience in deploying, managing, and optimizing these systems.
  • Experience with cloud platforms (AWS, GCP) and containerization (Docker, Kubernetes).
  • Experience with CI/CD pipelines, DevOps practices, and infrastructure-as-code tools like Terraform.
  • Knowledge of data modeling, schema design, and data visualization tools.
  • Experience with distributed data processing, data warehousing, and real-time data processing.
  • Proficiency in SQL and experience with query optimization of large datasets.
  • Strong problem-solving skills, with the ability to work independently and as part of a team.
  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience.
  • Nice to have: 
    • Experience with real-time processing frameworks like Apache Flink.
    • Experience with datastores like StarRocks and DuckDB.
    • Experience with data visualization tools like Looker.

#LI-PS2

Benefits

Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.


The Roku Culture

Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a small number of very talented people can do more, at lower cost, than a larger number of less talented ones. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. 

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. 

To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.

Top Skills

Java
Scala

Roku Challaghatta, Karnataka, IND Office

Embassy Golf Links Business Park, Challaghatta, Karnataka, India


