Senior Data Engineer - France (M/F) - Shippeo, Paris 10e - 75
- Master's degree (Bac+5)
- Transport • Logistics
Our vision is to become the leading data platform for the freight industry. By harnessing our growing network, real-time data, and AI, we aim to help supply chains deliver exceptional customer service and achieve operational excellence.
The Data Intelligence Tribe is responsible for leveraging Shippeo's data from our large shipper and carrier base to build data products that help our users (shippers and carriers alike), and ML models that provide predictive insights. The tribe typically enables users to:
- Get alerted accurately and in advance of potential delays or anomalies on their multimodal flows, so they can proactively anticipate any resulting disruptions
- Extract the data they need, get direct access to it, or analyze it directly on the platform to gain actionable insights that help them improve their operational performance and the quality and compliance of their tracking
- Provide best-in-class data quality by implementing advanced cleansing & enhancement rules
As a Data Engineer at Shippeo, your objective is to ensure that data is available and exploitable by our Data Scientists and Analysts across our different data platforms. You will contribute to building and maintaining Shippeo's modern data stack, which is composed of several technology blocks (a sketch of the batch-transformation block follows this list):
- Data Acquisition (Kafka, Kafka Connect, RabbitMQ),
- Batch data transformation (Airflow, dbt),
- Cloud Data Warehousing (Snowflake, BigQuery),
- Stream/event data processing (Python, Docker, Kubernetes), and all the underlying infrastructure that supports these use cases.
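As an illustration of the batch-transformation block, here is a minimal sketch of an Airflow DAG that runs dbt models and then dbt tests. It assumes Airflow 2.4+ and dbt installed on the worker; the DAG id, schedule, and project path (/opt/dbt/analytics) are hypothetical placeholders, not Shippeo's actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A daily DAG that builds dbt models, then runs dbt tests against the warehouse.
with DAG(
    dag_id="daily_dbt_run",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword (formerly schedule_interval)
    catchup=False,                   # don't backfill missed runs
) as dag:
    # Build all dbt models; the Snowflake/BigQuery target is configured
    # in dbt's profiles.yml, not in the DAG itself.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt's schema and data tests after the models are built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test  # tests run only after a successful build
```

In a setup like this, the DAG only sequences the work; credentials and connection details for the warehouse stay in dbt's own configuration.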
Required:
- You have a degree (MSc or equivalent) in Computer Science.
- 3+ years of experience as a Data Engineer.
- Experience building, maintaining, testing, and optimizing data pipelines and architectures.
- Programming skills in Python.
- Advanced working knowledge of SQL, experience with relational databases, and familiarity with a variety of database technologies.
- Working knowledge of message queuing and stream processing.
- Advanced knowledge of Docker and Kubernetes.
- Advanced knowledge of a cloud platform (preferably GCP).
- Advanced knowledge of a cloud based data warehouse solution (preferably Snowflake).
- Experience with Infrastructure as Code (Terraform/Terragrunt).
- Experience building and evolving CI/CD pipelines (GitHub Actions).
Desired:
- Experience with Kafka and Kafka Connect (Debezium); a sketch of a typical connector registration follows this list.
- Experience with monitoring and alerting using Grafana/Prometheus.
- Experience working with Apache NiFi.
- Experience working with workflow management systems such as Airflow.
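To illustrate the Kafka Connect / Debezium item above, here is a minimal sketch of registering a Debezium PostgreSQL source connector through the Kafka Connect REST API. The host, credentials, and table names are hypothetical placeholders, and the config keys follow Debezium 2.x conventions; this is not Shippeo's actual configuration.

```python
import requests

# Kafka Connect's REST API (default port 8083); host name is hypothetical.
CONNECT_URL = "http://kafka-connect:8083/connectors"

connector = {
    "name": "orders-cdc",  # hypothetical connector name
    "config": {
        # Debezium's PostgreSQL change-data-capture connector.
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "tasks.max": "1",
        "database.hostname": "postgres",         # placeholder host
        "database.port": "5432",
        "database.user": "replicator",           # placeholder credentials
        "database.password": "change-me",
        "database.dbname": "orders_db",
        "topic.prefix": "orders",                # prefix for emitted Kafka topics
        "table.include.list": "public.orders",   # tables to capture
    },
}

# Register the connector; Kafka Connect responds with the stored configuration.
resp = requests.post(CONNECT_URL, json=connector, timeout=10)
resp.raise_for_status()
print(resp.json())
```

Once registered, Kafka Connect streams row-level changes from the listed tables into Kafka topics, where downstream consumers (or the batch layer) can pick them up.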