About Us:
As a Senior Data Engineer at Kenility, you’ll join a tight-knit family of creative developers, engineers, and designers who strive to build and deliver the highest-quality products to market.
Technical Requirements:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- Over 5 years of experience in data engineering, data architecture, or a related discipline.
- Strong command of SQL and Python, with additional experience in PySpark considered a plus.
- Proven ability to design and implement ETL workflows using tools such as AWS Glue, Apache Airflow, or similar platforms (a brief orchestration sketch follows this list).
- Solid understanding of data modeling techniques, including star schemas and slowly changing dimensions.
- Comfortable making technical decisions and driving initiatives independently within a small, agile team.
- Experience managing and optimizing cloud-based data infrastructure with a focus on cost efficiency, particularly in AWS services like Redshift and Glue.
- Strong analytical thinking and problem-solving skills.
- Excellent organizational abilities and a keen eye for detail.
- Skilled in time management and capable of handling multiple priorities.
- English level of Upper Intermediate (B2) at minimum; Proficient (C1) preferred.
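To illustrate the kind of orchestration work this role involves, below is a minimal sketch of an Apache Airflow DAG that chains extract, transform, and load steps. It assumes Airflow 2.4+, and the DAG name, schedule, and callables are placeholders rather than a description of Kenility’s actual pipelines.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder callables standing in for real extract/transform/load logic.
    def extract():
        print("pull data from the source system")

    def transform():
        print("apply business rules and cleansing")

    def load():
        print("load curated data into the warehouse")

    with DAG(
        dag_id="example_daily_etl",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",            # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run the steps strictly in sequence.
        t_extract >> t_transform >> t_load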
Tasks and Responsibilities:
- Design and build a scalable data warehouse using Amazon Redshift, ensuring high performance and future growth.
- Create and maintain ETL pipelines with AWS Glue (Python/PySpark), integrating data from SQL Server (RDS) and other external sources (a minimal Glue job skeleton follows this list).
- Organize and structure the S3-based data lake for optimal performance, partitioning, and seamless Redshift integration.
- Define and implement dimensional data models to support analytical and reporting requirements.
- Establish data governance standards, including documentation, data quality validations, and pipeline monitoring.
- Collaborate closely with BI teams to ensure the data infrastructure supports reporting needs such as CRM analytics and dashboards.
- Monitor and manage cloud costs through cluster tuning, Glue job optimization, and workflow automation.
- Investigate and resolve performance issues across the data infrastructure.
- Contribute to the strategic evolution of the data platform by evaluating new tools, processes, and automation strategies.
- Document all aspects of data systems, including architecture, flows, and integration methodologies.
- Participate in project planning and estimation in coordination with technical and business stakeholders.
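As an illustration of the Glue and data-lake work described above, here is a minimal sketch of an AWS Glue job written in PySpark. It reads a source table registered in the Glue Data Catalog and writes partitioned Parquet to an S3 data lake; the database, table, column, and bucket names are placeholders, and the awsglue modules are only available inside the Glue job runtime.

    import sys

    from awsglue.context import GlueContext
    from awsglue.dynamicframe import DynamicFrame
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql.functions import col

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a source table registered in the Glue Data Catalog
    # ("sales_db" and "orders" are hypothetical names).
    orders = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db",
        table_name="orders",
    )

    # Light transformation: derive a date column ("order_ts" is a placeholder)
    # to serve as the data-lake partition key.
    orders_df = orders.toDF().withColumn("order_date", col("order_ts").cast("date"))

    # Write partitioned Parquet to the S3 data lake so Redshift can consume it
    # via Spectrum external tables or COPY (the bucket path is a placeholder).
    glue_context.write_dynamic_frame.from_options(
        frame=DynamicFrame.fromDF(orders_df, glue_context, "orders_out"),
        connection_type="s3",
        connection_options={
            "path": "s3://example-data-lake/curated/orders/",
            "partitionKeys": ["order_date"],
        },
        format="parquet",
    )

    job.commit()

In practice the curated output would typically feed the Redshift warehouse, with partition keys chosen to match the most common query filters.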
Soft Skills:
- Responsibility
- Proactivity
- Flexibility
- Great communication skills