What you'll do with us

  • Design a data pipeline from the ground up;
  • Work closely with our data scientists to design data processes and optimize data availability for cloud-based and non-cloud data services;
  • Work closely with our Software team to develop our data infrastructure;
  • Implement and monitor data quality indicators across the entire data pipeline;
  • Participate in the evolution of the Safeteams and klik dashboard solutions and help define features to improve the platform.

What you'll bring

  • 3+ years of experience designing and developing data pipelines and data-cleansing routines using standard data quality functions (standardization, transformation, anomaly detection, etc.);
  • University degree in software engineering, computer science, or related field;
  • Proficiency in Python;
  • Proficiency in SQL, MongoDB and database design;
  • Experience with Apache Spark, Hadoop, Kafka or similar technologies;
  • Experience building data pipelines on Google Cloud Platform or AWS;
  • Proven ability to streamline and optimize data transformation processes;
  • Experience building a platform that delivers a cloud-based data visualization service, an asset;
  • Team spirit and sense of responsibility towards the team;
  • Solid problem-solving skills and the ability to work through ambiguity;
  • Excellent written and verbal communication skills in both French and English;
  • Ability to work efficiently in a cross-functional team.

What we offer 

  • Flexible working hours
  • 4 weeks of vacation
  • An attractive group insurance plan
  • A variety of social activities (within the new COVID reality)
  • A stimulating and inclusive company culture


We thank all applicants for their interest. However, only candidates selected for an interview will be contacted.