Senior Data Engineer (POS-143)

About Us:

As a Senior Data Engineer at Kenility, you’ll join a tight-knit family of creative developers, engineers, and designers who strive to develop and deliver the highest-quality products to market.


Technical Requirements:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Extensive experience with Snowflake, including schema optimization, query performance tuning, and use of advanced features like micro-partitioning, zero-copy cloning, and time travel (see the sketch after this list).
  • Solid background with Amazon Redshift, including architecture understanding, performance tuning, and migration strategies to Snowflake.
  • Strong command of Microsoft SQL Server for legacy data management, complex SQL scripting, and stored procedure development.
  • Advanced SQL capabilities for cross-platform querying, data transformation, and issue resolution.
  • Skilled in designing both normalized and denormalized data models optimized for cloud-based analytics.
  • Proven experience in building and managing orchestration workflows using tools like Apache Airflow.
  • Deep understanding of modern data ingestion and transformation approaches, with hands-on experience optimizing ELT/ETL pipelines.
  • Demonstrated ability to integrate and orchestrate diverse data sources for reliable data workflows.
  • Committed to enforcing high standards for data quality, including validation, monitoring, and error handling.
  • Practical experience implementing CI/CD pipelines for data projects using tools like Jenkins or Bitbucket Pipelines.
  • Effective collaboration in Agile environments, working closely with multidisciplinary teams to meet iterative goals.
  • Strong problem-solving mindset with the ability to propose and implement innovative solutions.
  • Minimum Upper-Intermediate (B2) or Advanced (C1) English.
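
To ground the Snowflake features named above, here is a minimal sketch of zero-copy cloning and time travel, assuming the snowflake-connector-python package; the account, credentials, and table names are illustrative placeholders, not details from this posting.

    import snowflake.connector

    # Connection details below are placeholders, not values from this posting.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        # Zero-copy clone: duplicates the table instantly by copying
        # metadata only; no underlying micro-partitions are rewritten.
        cur.execute("CREATE TABLE orders_dev CLONE orders")

        # Time travel: query the table as it existed one hour ago.
        cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
        print(cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()

Zero-copy clones are a common way to stage a migration or test a schema change against production-sized data without duplicating storage.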


Tasks and Responsibilities:

  • Lead the transition of data marts from Amazon Redshift and Microsoft SQL Server to Snowflake with minimal disruption and enhanced performance.
  • Plan and execute the phased decommissioning of outdated data marts while ensuring data integrity.
  • Design scalable and efficient Snowflake-based data models to support robust analytical capabilities.
  • Refactor legacy pipelines to adopt modern ELT strategies for improved maintainability and speed.
  • Build and manage data workflows using orchestration tools to automate ingestion and transformation processes (see the DAG sketch after this list).
  • Define and uphold validation standards to guarantee consistent, high-quality data.
  • Integrate CI/CD methodologies to streamline development and deployment of data solutions.
  • Collaborate in Agile teams alongside engineers, analysts, and stakeholders to deliver incremental improvements.
  • Identify inefficiencies in data systems and implement enhancements to boost performance.
  • Maintain detailed documentation of architecture, pipelines, and processes to facilitate team knowledge sharing.
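
As an illustration of the orchestration and ELT responsibilities above, here is a minimal Apache Airflow DAG sketch that loads raw data into Snowflake and then transforms it in-warehouse; it assumes the apache-airflow-providers-snowflake package, and the connection ID, stage, and table names are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    # Connection ID, stage, and table names are hypothetical placeholders.
    with DAG(
        dag_id="orders_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule" requires Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        # Load: copy staged raw files into a landing table.
        load_raw = SnowflakeOperator(
            task_id="load_raw_orders",
            snowflake_conn_id="snowflake_default",
            sql="COPY INTO raw.orders FROM @raw_stage/orders/",
        )

        # Transform: build the analytics model inside Snowflake itself (ELT).
        transform = SnowflakeOperator(
            task_id="transform_orders",
            snowflake_conn_id="snowflake_default",
            sql="""
                CREATE OR REPLACE TABLE analytics.orders_daily AS
                SELECT order_date, COUNT(*) AS order_count
                FROM raw.orders
                GROUP BY order_date
            """,
        )

        load_raw >> transform

Keeping the transformation step inside Snowflake (ELT rather than ETL) lets the warehouse's compute do the heavy lifting and keeps the pipeline code declarative and easy to refactor.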


Soft Skills:

  • Responsibility
  • Proactivity
  • Flexibility
  • Great communication skills