Principal Data Engineer – iGaming Industry
Location: Spain
Tech Stack: Apache Kafka, Python/Scala/Java, PostgreSQL, MySQL, Redis, Flink, ClickHouse, AWS/Azure/GCP
Project Overview:
Join a cutting-edge iGaming company that develops real-time data pipelines powering mission-critical applications. Our client operates across both cloud and on-premises infrastructure, delivering high-performance, scalable data solutions that support millions of transactions daily.
As a Principal Data Engineer, you will lead a team of data engineers in designing, developing, and optimizing stream processing architectures, ensuring reliability, scalability, and cost-efficiency. You will collaborate with architecture, product, and engineering teams to deliver innovative data-driven solutions that align with business objectives.
Requirements:
* 5+ years of hands-on experience in data engineering, specializing in Python, Scala, or Java.
* Deep expertise in Apache Kafka for real-time data streaming (mandatory).
* Proficiency in managing and optimizing relational databases (PostgreSQL, MySQL, MSSQL, Oracle).
* Experience working with analytical databases (such as ClickHouse) and distributed storage solutions.
* Strong knowledge of cloud platforms (AWS, Azure, Google Cloud) and on-premises data environments.
* Experience with Flink, Redis, RabbitMQ, Superset, Cube.js, MinIO, and Grafana (optional but beneficial).
* Proven leadership and mentoring skills, with experience guiding a team of engineers.
* Strong focus on system reliability, scalability, and data integrity best practices.
* iGaming industry experience is a significant advantage.
Responsibilities:
* Lead and mentor a team of data engineers, providing technical guidance and architectural oversight.
* Own end-to-end development of scalable data pipelines and architectures for real-time analytics.
* Optimize data storage and processing using technologies like Apache Kafka, Redis, and ClickHouse.
* Migrate selected cloud-based solutions to on-premises environments, optimizing for performance and cost.
* Collaborate with stakeholders to define data strategies and ensure business alignment.
* Implement best practices for system reliability, fault tolerance, and high availability.
* Stay up to date with emerging technologies to drive innovation in data processing and analytics.