For one of the largest global insurance companies, headquartered in Madrid, I am urgently looking for a Senior Data Engineer to join their new European Digital Hub and lead their global reporting from Switzerland.
Tasks:
As an Azure Databricks Developer within Asset Management, you will play a key role in designing, developing, and optimizing data pipelines and workflows. Your primary responsibility will be implementing scalable and high-performance data solutions using Azure Databricks, Airflow, Unity Catalog, Data Factory, and Git. You will collaborate closely with data engineers, architects, and business stakeholders to deliver high-quality data solutions that support analytics, reporting, and operational processes.
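To give candidates a feel for the day-to-day work, below is a minimal, hypothetical PySpark sketch of the kind of pipeline step this role involves; all table and column names (raw.positions, curated.daily_positions, and so on) are illustrative assumptions, not the client's actual schema.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical example: aggregate raw trade positions into a daily
    # asset-management reporting table. Table and column names are
    # illustrative assumptions, not the client's schema.
    spark = SparkSession.builder.appName("daily-positions").getOrCreate()

    positions = spark.read.table("raw.positions")  # assumed source table

    daily = (
        positions
        .withColumn("trade_date", F.to_date("trade_ts"))
        .groupBy("portfolio_id", "trade_date")
        .agg(
            F.sum("market_value").alias("total_market_value"),
            F.count("*").alias("position_count"),
        )
    )

    # On Databricks this writes a managed Delta table by default,
    # which is the format Unity Catalog governs.
    daily.write.mode("overwrite").saveAsTable("curated.daily_positions")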
Offer:
1. Permanent contract with the end client, full-time, hybrid working model
2. Attractive salary (commensurate with experience) plus bonus and benefits package
3. A dynamic, international, and challenging work environment
Requirements:
1. Solid experience with Apache Spark (PySpark or Scala), including hands-on large-scale data processing.
2. Strong knowledge of Azure Data Services.
3. Experience with Apache Airflow.
4. Solid understanding of data modeling and ETL processes.
If you are interested, please do not hesitate to send me your updated CV and spread the word!