At Altia, we have spent 30 years creating future-ready digital solutions capable of generating real value and driving meaningful change.
We are driven by a clear purpose: to grow by enabling growth, and to do so in a sustainable, lasting way. We are convinced that we will only matter if, together, we make a positive impact and everyone evolves along the way.
We are an international team of great professionals who, since 1994, have combined their energy and innovative vision of technology on truly relevant projects for organizations that drive change.
With an end-to-end approach, we develop customized solutions and integrate products from leading manufacturers in the sector. Drawing on a broad range of services and products, we foster innovation and technological renewal.
We are expanding our Data team and looking for a Data Engineer with experience in building data ingestion pipelines and developing ETL processes!
We are looking for people with more than 5 years of experience in developing and implementing ETL jobs for data warehouses using SQL and Python, who are eager to keep learning and to develop their professional career with us in an environment built for growth.
You will join a challenging project where your main mission will be to create ingestion pipelines from various data sources and to leverage the collected data to build aggregated Metrics Dashboards for domains and sub-domains.
Your responsibilities will include:
Creating ingestion pipelines from various data sources.
Developing and maintaining ETL processes using SQL and Python.
Building aggregated Metrics Dashboards.
Working with AWS or Azure Big Data tools (Glue, Athena, Redshift, Kinesis, Databricks, Azure Analytics, Data Explorer).
Using cloud-based storage and functions (S3, Azure Blob Storage, Lambda, Azure Functions).
Applying data engineering methodologies (data warehouse, data lake, star schema).
Requirements:
University degree in computer science, engineering, mathematics or a related field.
More than 5 years of experience in data engineering.
Experience in developing and implementing ETL jobs.
Experience with SQL and Python.
Experience with AWS or Azure Big Data tools.
Experience with cloud storage and functions.
Knowledge of data warehouse, data lake and star schema concepts.
Advanced level of English.
The following will be positively valued:
AWS certifications (Solutions Architect, Data Analytics Specialty).
Experience with AWS CloudFormation or Terraform.
Knowledge of CI/CD (AWS CodePipeline/CodeBuild/CodeDeploy).
Knowledge of data APIs.
Knowledge of information security.
Experience with Git, Jira and Bitbucket.
Familiarity with the Gartner ODM framework.
What we offer you
Remote work model
Flexible schedule and summer hours
Continuous training plan
You will be part of a well-established and experienced team
You will work on projects with major clients in the public and private sectors
Flexible and competitive compensation
Open culture