
Amagi

Data Engineer III

Posted 4 Days Ago
Bengaluru, Karnataka
Senior level

Description

About Amagi

We are a next-generation media technology company that provides cloud broadcast and targeted advertising solutions to broadcast TV and streaming TV platforms. Amagi enables content owners to launch, distribute, and monetize live linear channels on Free Ad-supported Streaming TV and video services platforms. Amagi also offers 24x7 cloud-managed services, bringing simplicity, advanced automation, and transparency to the entire broadcast operation. Overall, Amagi supports 700+ content brands, 800+ playout chains, and over 2,500 channel deliveries on its platform in over 40 countries. Amagi has a presence in New York, Los Angeles, Toronto, London, Paris, Melbourne, Seoul, and Singapore, with broadcast operations in New Delhi and an innovation centre in Bangalore.

For more information, visit us at

Amagi Monetise

The Amagi Monetise group focuses on building products that help our customers monetise across different streaming segments: FAST (Free Ad-supported Streaming TV), VoD (Video on Demand), and Live Events. The group consists of several products, including the Amagi Data Platform described below.

The Amagi Data Platform is Amagi's central data platform. It enables use cases such as analytics and ML, and offers customers critical insights across content, advertising, billing, and more. It is a highly scalable platform that ingests multiple terabytes of data per day and makes the data available to end users in near real time.

Team

The team is responsible for building the new Data Platform from scratch to enrich the Amagi product portfolio, giving customers highly informative analytics on the streaming behaviour of their channels, platforms, and deliveries across regions and devices. This includes an insightful dashboard that surfaces trending metrics across channel viewership, content analytics, and ads for both linear and VOD channels, made possible by crunching millions of viewership hours from terabytes of viewer heartbeat logs. The team creates efficient, cost-effective, scalable, and manageable data pipelines that build strongly typed data models to quickly serve millions of data points to the viewport.
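To make this concrete, here is a minimal, illustrative PySpark sketch of the kind of roll-up such a pipeline performs: aggregating raw viewer heartbeat logs into per-channel viewership hours. The paths, column names, and schema are assumptions for the example, not Amagi's actual implementation.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("viewership-rollup").getOrCreate()

# Raw heartbeat events; the path and columns (viewer_id, channel_id, platform,
# region, device_type, heartbeat_interval_sec) are assumed for this example.
heartbeats = spark.read.parquet("s3://example-bucket/raw/heartbeats/dt=2024-01-01/")

viewership = (
    heartbeats
    # Each heartbeat covers a fixed interval; convert it to watch hours.
    .withColumn("watch_hours", F.col("heartbeat_interval_sec") / 3600.0)
    .groupBy("channel_id", "platform", "region", "device_type")
    .agg(
        F.sum("watch_hours").alias("total_watch_hours"),
        F.approx_count_distinct("viewer_id").alias("unique_viewers"),
    )
)

# Persist as a partitioned table that downstream dashboards can query quickly.
viewership.write.mode("overwrite").partitionBy("region").parquet(
    "s3://example-bucket/curated/viewership_daily/dt=2024-01-01/"
)
```

In production, a job like this would typically run behind an orchestrator and feed strongly typed, partitioned tables that the dashboard queries.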

Role reporting into: Director, Data

Location: Bangalore, India

Key Responsibilities:

  • Take complete ownership and accountability of feature requirements from conception to delivery, and continue to manage, sustain, and optimize the system.
  • Build, deploy, and maintain a highly scalable data pipeline framework that enables our developers to build multiple data pipelines from different kinds of data sources (see the illustrative sketch after this list).
  • Collaborate with the product, business, design, and engineering functions to stay on top of your team's deliverables and milestones.
  • Ensure timely delivery of highly reliable and scalable engineering architecture, and of high-quality, maintainable, and operationally excellent code for your team.
  • Lead design discussions and code reviews.
  • Set up best practices, gatekeeping, guidelines, and standards in the team.
  • Identify and resolve performance and scalability issues.
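As a hedged illustration of the pipeline-framework responsibility above, the sketch below shows one way developers might plug different data sources into a shared pipeline abstraction. All class and method names are hypothetical; they are not an existing Amagi API.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Iterable, List

Record = Dict[str, Any]

class Source(ABC):
    """Reads raw records from an upstream system (object storage, Kafka, an API, ...)."""

    @abstractmethod
    def read(self) -> Iterable[Record]:
        ...

class Transform(ABC):
    """Cleans, enriches, or aggregates records."""

    @abstractmethod
    def apply(self, records: Iterable[Record]) -> Iterable[Record]:
        ...

class Sink(ABC):
    """Persists records to a lake or warehouse table."""

    @abstractmethod
    def write(self, records: Iterable[Record]) -> None:
        ...

class Pipeline:
    """Wires a source, an ordered list of transforms, and a sink into one pipeline."""

    def __init__(self, source: Source, transforms: List[Transform], sink: Sink) -> None:
        self.source = source
        self.transforms = transforms
        self.sink = sink

    def run(self) -> None:
        records: Iterable[Record] = self.source.read()
        for transform in self.transforms:
            records = transform.apply(records)
        self.sink.write(records)
```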
Requirements

Must haves

  • Bachelor's/Master's degree in Computer Science with 6+ years of overall experience.
  • Excellent technical and communication skills to mentor the engineers under you.
  • Data platform knowledge spanning ingestion, processing, and warehousing, with deep expertise in at least one of these areas.
  • Deep understanding of ETL frameworks, e.g. Spark or equivalent systems.
  • Deep understanding of OLAP systems and of data modeling approaches such as star and snowflake schemas.
  • Deep understanding of at least one ETL technology such as Airflow, Databricks, Trino, Presto, or Hive (a minimal Airflow example follows this list).
  • Experience building observability with technologies such as logging, Datadog, Prometheus, Sentry, Grafana, Splunk, EKS, etc.
  • Strong experience in Scala or Python.
  • Strong knowledge of public clouds (AWS, GCP, etc.) is preferred.
  • Must-have experience: 2+ years in technical leadership roles.
  • Databricks knowledge is good to have.
  • Experience architecting, designing, and building scalable big data analytics pipelines that ingest terabytes of data per day.
  • Experience optimizing and troubleshooting Spark, SQL, and data pipelines.
  • Strong debugging skills to find root causes, workarounds, resolutions, and long- and short-term mitigations.
  • Strong experience with Agile development methodologies to plan, break down, and estimate feature requirements into epics, stories, and subtasks.
  • Ability to delegate work to the team and unblock them, and to raise and mitigate risk.
  • Review code and test strategy; set up strong practices in the team for coding standards and testing.
  • Ability to work independently and with cross-team dependencies; ability to build relationships and resolve conflicts.
  • Good to have (preferred): 2+ years of experience in ad tech, media, or streaming.
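As a minimal, hedged example of the ETL orchestration mentioned above, the sketch below shows a small Airflow DAG that schedules a daily Spark job and a downstream refresh step. The DAG id, task names, bucket, and script paths are all illustrative assumptions, not part of Amagi's stack.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-platform",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="viewership_daily_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Submit the Spark aggregation job; the spark-submit command and script
    # path are placeholders, not a real deployment.
    aggregate_heartbeats = BashOperator(
        task_id="aggregate_heartbeats",
        bash_command=(
            "spark-submit --deploy-mode cluster "
            "s3://example-bucket/jobs/aggregate_heartbeats.py --date {{ ds }}"
        ),
    )

    # Refresh downstream OLAP/star-schema tables once aggregation succeeds.
    refresh_marts = BashOperator(
        task_id="refresh_marts",
        bash_command="echo 'refresh marts for {{ ds }}'",
    )

    aggregate_heartbeats >> refresh_marts
```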

Top Skills

Airflow
AWS
Databricks
Datadog
EKS
GCP
Grafana
Hive
Presto
Prometheus
Python
Scala
Sentry
Spark
Splunk
Trino

Amagi Bengaluru, Karnataka, IND Office

4th Floor, Raj Alkaa Park, Kalena Agrahara, Bannerghatta Road, Bengaluru, Karnataka, India, 560076
