Job summary

We are seeking a skilled and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable data pipelines to support advanced analytics and data-driven decision-making across the organization.

The ideal candidate is proficient in working with large-scale data architectures, has a deep understanding of ETL processes, and excels at building efficient data solutions. You will collaborate with cross-functional teams to ensure data quality, consistency, and availability, leveraging a mix of on-premises and cloud-based data technologies.

Key responsibilities

  • Develop, test, and optimize ETL processes to ensure data integrity and efficient data flow across various sources and platforms.
  • Design and implement scalable data pipelines that meet both real-time and batch processing needs.
  • Collaborate with Data Scientists and Analysts to support their data needs by ensuring reliable and timely access to clean, structured data.
  • Troubleshoot, diagnose, and resolve data-related issues to ensure ongoing data availability.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering, with a proven track record of working on large-scale data projects.
  • Proficiency in SQL and experience with various ETL tools (e.g., Apache Airflow, Talend, Informatica).
  • Familiarity with data warehousing solutions such as Snowflake, Redshift, or BigQuery.