
Databricks Engineer (POS-181)

About Us:

As a Senior Databricks Engineer at Kenility, you’ll join a tight-knit family of creative developers, engineers, and designers who strive to develop and deliver the highest-quality products to the market.


Technical Requirements:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Over 7 years of professional experience in Data Engineering or Big Data roles.
  • At least 5 years of practical experience working with Databricks, Apache Spark (using PySpark or Scala), and Delta Lake.
  • Solid understanding of contemporary data architectures, including Lakehouse and Data Mesh.
  • Skilled in SQL and Python, with a strong foundation in distributed data processing.
  • Hands-on experience with major cloud platforms such as AWS, Azure, or GCP.
  • Familiar with CI/CD pipelines, version control using Git, and DataOps methodologies.
  • In-depth knowledge of data governance, cataloging, and security frameworks.
  • Demonstrated ability to lead technical initiatives or act as a technical reference within a team.
  • English at Upper Intermediate (B2) level minimum, or Advanced (C1).


Tasks and Responsibilities:

  • Oversee the complete lifecycle of data solutions, including the development of ETL/ELT workflows and batch/streaming pipelines using Databricks and Apache Spark.
  • Architect Lakehouse-based data solutions aligned with business goals and technical specifications.
  • Collaborate closely with analysts, data scientists, and fellow engineers to deliver robust data and machine learning capabilities.
  • Define and uphold coding standards, best practices, and performance tuning for Databricks workflows.
  • Develop real-time streaming solutions with Spark Structured Streaming and Kafka.
  • Enhance Delta Lake data models and enforce data governance through tools such as Unity Catalog or Atlan.
  • Coordinate data integration across multi-cloud environments using both native and third-party connectors.
  • Participate in code reviews and provide mentorship to junior team members.
  • Automate infrastructure setup and deployment through tools like Terraform and the Databricks CLI.
  • Maintain a secure, compliant, and scalable data platform.


Soft Skills:

  • Responsibility
  • Proactivity
  • Flexibility
  • Great communication skills