Looking for a

Databricks Engineer

POS-241
Location: Remote
Type: Full-time
Seniority: Senior

About Us:

As a Senior Databricks Engineer at Kenility, you’ll join a tight-knit team of creative developers, engineers, and designers who strive to deliver the highest-quality products to market.

 

Technical Requirements:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Over five years of experience designing and building scalable data solutions using Databricks, Apache Spark (PySpark/Scala), and Delta Lake.
  • Solid understanding of modern data platforms, including Lakehouse and Data Mesh architectures.
  • Strong command of SQL and Python, with proven experience in distributed data processing.
  • Practical knowledge of cloud infrastructure, including AWS, Azure, or GCP.
  • Experience with CI/CD pipelines, Git-based version control, and DataOps methodologies.
  • In-depth expertise in data governance, security, and metadata management practices.
  • Demonstrated leadership skills through team management or acting as a technical lead.
  • English proficiency at Upper Intermediate (B2) level or higher (C1).

 

Tasks and Responsibilities:

  • Lead the full lifecycle of data pipeline development, covering both batch and real-time processing using Databricks and Apache Spark.
  • Design and implement Lakehouse-based solutions that align with business goals and technical specifications.
  • Collaborate closely with data scientists, analysts, and engineers to enable robust data and machine learning functionalities.
  • Define and uphold coding standards and optimization strategies across Databricks workflows.
  • Develop real-time streaming pipelines using Spark Structured Streaming and Kafka.
  • Enhance Delta Lake data models and ensure governance through tools like Unity Catalog or Atlan.
  • Oversee integrations across cloud environments (AWS, Azure, GCP) leveraging native and third-party connectors.
  • Participate in code reviews and provide mentorship to junior team members.
  • Automate infrastructure provisioning using tools such as Terraform and the Databricks CLI.
  • Maintain platform scalability, compliance, and security.

 

Soft Skills:

  • Responsibility
  • Proactivity
  • Flexibility
  • Great communication skills

Join Us:

Ready to be part of our team?

Tell us what you’re working on, and we’ll help you design, scale, and deliver AI-powered software that drives real business outcomes.