At Capitole, we keep growing, and we want you to be part of the family! In this project, we're working with an international company, and we are looking for a Data/DevOps Engineer to help them manage their Azure infrastructure. The role involves working on the infrastructure layer, optimizing data workflows, automating deployments, and ensuring a robust and secure cloud environment. If you live in Alicante (Spain) or are willing to relocate here, don't hesitate to apply.

HYBRID - DevOps Data Engineer - Alicante (Spain).
Language: Full English

As part of the Data AI team, you will:

Build and maintain the infrastructure that supports data pipelines and cloud environments, ensuring scalability, performance, and reliability.
Design and build ETL/ELT processes for ingesting and processing large datasets using Azure Data Factory, Databricks, Synapse, and other Azure services such as Event Hubs.
Manage, monitor, and optimize data storage solutions such as Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage.
Participate in the creation of scalable data models that support analytics and machine learning needs.
Integrate diverse data sources and formats, ensuring data quality and consistency for reliable business insights.
Optimize data pipelines and storage solutions for high performance and cost efficiency.
Work closely with cross-functional teams, including data scientists, architects, and developers, to understand and meet data requirements.
Maintain comprehensive documentation of data architectures, data flows, and pipeline configurations for transparency and knowledge sharing.
Design, implement, and manage continuous integration and delivery pipelines.
Automate testing, build, and deployment workflows to ensure efficient delivery cycles.
Use tools like Terraform, Ansible, or CloudFormation to define and deploy infrastructure.
Automate server provisioning and management.
Set up monitoring tools to track system health and performance (e.g., Prometheus, Grafana).
Develop scripts and tools to automate repetitive tasks and improve efficiency, using languages such as Python, Bash, or PowerShell.
Participate in the creation and maintenance of Data and AI model pipelines to ensure continuous and reliable delivery of models to production.

Technical skills:

Proficiency with CI/CD tools (e.g., Jenkins, GitLab CI/CD, Azure DevOps).
SQL, Python, and Spark.
Data streaming technologies (e.g., Event Hubs, Kafka).
Data Lake Storage, Databricks, and Blob Storage for data storage and transformation.
ETL/ELT development and data pipeline orchestration.
Scripting skills (Python, Bash, etc.) and monitoring tools (Grafana, Prometheus, etc.).
Azure DevOps/GitHub and CI/CD pipelines.
Terraform, Docker, Ansible, and Kubernetes.

Why Capitole?

We are great, but with you, we will be even greater. That is why you will have:
- Individual training budget of €1,200 (languages, books, certifications)
- Flexible working hours
- Flexible remuneration package
- Great discounts at sports centers (AndJoy)
- ⚕️ Free private health insurance
- Monthly follow-ups with your team for continuous feedback
- Team buildings every two months. You can't miss the Pool Party and our Christmas dinner!
- Discounts on international brands for employees (Club Capitole)

You will have the opportunity to meet the whole family through our Technology communities and to share your knowledge and ideas. Knowledge exchange is key to us!

Don't know us yet? Discover more here: https://capitole-consulting.com/

We are excited to meet you!

The employee will adhere to the information security policies:
- You will have access to confidential information relating to Capitole and the project you are working on.
- You will have to comply with the security policies and internal policies of the company and the client.
- You will have to sign an NDA.