
SkyPoint Cloud

Data Engineer (Databricks / Scala / Python)

Bangalore, Bengaluru, Karnataka

Who we are:

 

Skypoint’s mission is to unify data integration, analytics, and AI into a cohesive SaaS offering powered by a Data Lakehouse architecture. Our Generative AI Platform brings people and data together, with a focus on enhancing healthcare, financial services, and other regulated industries. We are proud partners of Microsoft, OpenAI, DataStax, and Databricks.

 

Website: Skypoint.ai

 


Location: Global Technology Park, Bellandur, Bangalore, India (work from office, 5 days a week)


Here is what you can expect to work on in this critical role:


You will lead efforts to extract maximum value from our data. Our platform processes billions of rows of data every month on behalf of millions of users.


How do our Data Engineers spend their time?


You can expect to spend about 50% of your time building and scaling the Skypoint Lakehouse and its data pipelines, and about 20% defining and implementing DataOps methodologies.


Additionally, 20% of your time will be spent writing and optimizing queries and algorithms. Lastly, you’ll spend about 10% of your time supporting and monitoring pipelines.


Our team values collaboration, a passion for learning, and a desire to master your craft. We thrive on asynchronous communication. You will have plenty of support from leadership when you communicate proactively, with detailed information about any roadblocks you encounter.


Qualities of Data Engineers Who Thrive in This Role


🔥 You are a driven self-starter who isn’t afraid to dig for answers, stays up to date on industry trends, and is always looking for ways to enhance your knowledge (yes, Databricks-related podcasts count! 🎧)


💡 Your skill set includes a blend of Databricks-related technologies on Azure, AWS, or GCP


🖥️ Experience with Python or Scala is a must! (you also wear a software engineering hat)


💡 You work with Python, Scala, and Spark (Databricks), interacting with the Delta Lakehouse, Delta Live Tables, and Unity Catalog (see the short sketch below)
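
For context, day-to-day work in this stack often looks like the minimal PySpark sketch below. This is illustrative only; the table names, schema, and aggregation are hypothetical, not Skypoint's actual pipeline. It reads a raw Delta table registered in Unity Catalog, applies a simple transformation, and writes a curated Delta table back.

    # Illustrative sketch only: table names and rollup logic are hypothetical.
    # On Databricks a `spark` session already exists; the builder call keeps this self-contained.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("patient-events-rollup").getOrCreate()

    # Read a raw Delta table registered in Unity Catalog (catalog.schema.table).
    events = spark.read.table("raw.healthcare.patient_events")

    # Basic cleanup plus a daily rollup per patient.
    daily = (
        events
        .filter(F.col("event_ts").isNotNull())
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("patient_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write the curated result back as a managed Delta table.
    daily.write.format("delta").mode("overwrite").saveAsTable("curated.healthcare.patient_daily_events")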


Skills & Experience Required:


💡 3-5 years of industry experience

🔥 Spark (Scala or Python), Databricks

💡 Strong backend programming skills for data processing, with practical knowledge of availability, scalability, clustering, microservices, multi-threaded development, and performance patterns.

-- Experience with a wide array of algorithms and data structures.

-- Experience with workflow orchestration platforms such as Databricks DLT, ADF, AWS Glue, or Airflow.

-- Strong distributed-systems fundamentals

-- A strong handle on REST APIs

-- Experience with NoSQL databases

-- Your most recent work experience MUST include Scala or Python and Spark (Databricks)


Educational Level:


🔥 BS / BE / MS in Computer Science from a top-tier school (IITs / RECs, etc.)

Join Skypoint and Let's Soar Beyond Boundaries Together!

 

Ready to Surf the AI Revolution and Dive into Innovation? Armed with top-tier education and cutting-edge expertise, come aboard Skypoint, where we're not merely redefining technology – we're crafting the future with our revolutionary Generative AI Platform. If you're all set to embrace the extraordinary, seize this opportunity, and together, we'll conquer new horizons! Don't wait, Apply Now and Supercharge Your Career with Skypoint!

 

Skypoint is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

 


