Looking for a Data Engineer

POS-252
Location: Remote
Type: Full-time
Seniority: Senior

About Us:

As a Senior Data Engineer at Kenility, you’ll join a tight-knit family of creative developers, engineers, and designers who strive to build and deliver the highest-quality products to market.

Technical Requirements:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Advanced SQL skills, with experience applying data warehousing principles.
  • Solid hands-on experience working with Azure Databricks and Spark.
  • Proven ability to build and manage data integration workflows using Azure Data Factory.
  • Proficient in Python, with the ability to create scalable data pipelines.
  • Deep understanding of key data engineering concepts, including data modeling, transformations, change data capture, and performance tuning.
  • Familiarity with Azure Data Lake for handling large datasets, managing Parquet/Delta tables, and optimizing queries.
  • Experience using version control tools and exposure to CI/CD pipelines.
  • Strong communication and interpersonal skills; comfortable raising concerns and collaborating in team discussions.
  • Self-motivated and independent, with a proactive attitude toward learning business processes and resolving critical issues.
  • Skilled at interpreting business needs without requiring constant guidance from subject matter experts.
  • Able to contribute effectively to planning and refinement meetings.
  • Minimum Upper-Intermediate English (B2) or Proficient (C1).

Tasks and Responsibilities:

  • Design and implement data pipelines to extract, transform, and load data from multiple sources into the data warehouse using Python and notebooks.
  • Develop advanced SQL queries for extracting and manipulating data from the warehouse.
  • Build and maintain ETL processes using Azure Databricks and PySpark.
  • Create and manage data integration workflows through Azure Data Factory.
  • Work closely with developers, data analysts, and business stakeholders to gather requirements and deliver robust data solutions.
  • Optimize data pipeline performance and ensure system scalability and reliability.
  • Monitor data quality and resolve issues in collaboration with the operations team.
  • Maintain thorough documentation of data pipeline designs and implementations.
  • Promote coding best practices and communicate business-logic transformations clearly to the team.

Soft Skills:

  • Responsibility
  • Proactivity
  • Flexibility
  • Great communication skills

Join us

Ready to be part of our team?

Tell us what you're working on—we’ll help you design, scale, and deliver AI-powered software that drives real business outcomes.