Job Description
Would you like to be part of our new adventure?
Vodafone Group launched its new technological HUB in 2021, an international center of excellence dedicated to the research and development of technical solutions, such as Secure Networks, 5G and 6G development, Open RAN, IoT, MPN & MEC and UCC for Vodafone Business, as well as platforms and enterprise solutions. This vacancy is part of VOIS Spain, a legal entity within Vodafone Group.
Come and join us to create the future together!
We are looking for a Big Data Engineer with experience in designing, building and managing applications that process large amounts of data on Google Cloud Platform. You’ll provide expert guidance to source and integrate structured and unstructured data from dozens of local data sources into a data lake.
As a Big Data Engineer, you will work closely with the team and our stakeholders to build and deliver solutions for a next-generation Big Data analytics platform. You will focus on data streaming, data management, data quality and data security, and deliver the systems that process huge volumes of data. You will also work with our Data Scientists to help drive company strategy and bring marketing campaigns to our customers.
At Vodafone, we don’t just produce creative products, we develop amazing people too. We are driven to empower people and committed to helping them perform at their best and achieve their full potential.
What you bring:
* Expertise in public cloud PaaS, CaaS and IaaS tools and environments, particularly with Google Cloud Platform (GCP).
* Strong experience with cloud services related to DWH and Big Data development, preferably on Google Cloud Platform using services such as Data Fusion, Dataflow, BigQuery, etc.
* Expert-level experience with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN).
* Spanish and an advanced level of English (international team).
* Strong software development experience in some of the following programming languages: Java, Scala and/or Python; other functional languages desirable.
* Experience with Unix-based systems, including Bash scripting.
* Experience with other distributed technologies.
* Good exposure to Docker/Kubernetes.
* Experience with Infrastructure-as-Code (IaC).
* Experience orchestrating pipelines using Airflow (an illustrative sketch follows this list).
* Experience in SQL, preferably with knowledge of BigQuery.
* Experience working on EDW (Enterprise Data Warehouse) solutions.
* Demonstrable knowledge and expertise in Jenkins, GitLab/GitHub, Nexus or equivalent CI/CD tools.
* Experience working in an Agile environment.
* Willingness to learn.
Years of experience: at least 2 years working with the technologies listed above.
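To give a concrete flavor of the orchestration work mentioned above, here is a minimal, hypothetical sketch of an Airflow DAG on GCP that loads raw files from a data-lake bucket into BigQuery and then runs a SQL aggregation. All identifiers (project, bucket, dataset and table names) are illustrative placeholders, not actual Vodafone resources.

```python
# Illustrative sketch only: a minimal Airflow DAG (Airflow 2.4+) that loads
# raw JSON files from GCS into BigQuery and then runs a SQL transformation.
# All identifiers below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="example_daily_ingest",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land raw newline-delimited JSON files from a data-lake bucket
    # into a BigQuery staging table, partitioned by the run date.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-datalake-bucket",  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example_project.staging.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staged events into a reporting view with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": """
                    SELECT user_id, COUNT(*) AS events
                    FROM `example_project.staging.events`
                    GROUP BY user_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```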
Nice to have:
* Experience working with Terraform, Ansible, and a common scripting language (Bash, Python, Ruby, etc.) is a plus.
* Experience building systems to perform real-time data processing using Kafka, or similar technologies like Pub/Sub, is a plus (see the sketch after this list).
* Experience with a common SDLC, including SCM, build tools, unit testing, TDD/BDD, continuous delivery, and agile practices is a plus.
* Qualification/Education: Master’s degree in Computer Science, Engineering and/or other STEM degrees.
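As a rough illustration of the real-time processing mentioned above, the following is a minimal sketch of consuming a stream from Google Cloud Pub/Sub (a managed, Kafka-like messaging service) with the official Python client. The project and subscription IDs are hypothetical placeholders.

```python
# Illustrative sketch only: streaming-pull consumption from Pub/Sub.
# Project and subscription IDs are hypothetical placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(
    "example-project", "example-events-sub"  # placeholder IDs
)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where each event would be validated,
    # enriched and routed; here we just print and acknowledge it.
    print(f"Received: {message.data!r}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        # Block for a bounded time so the sketch terminates on its own.
        streaming_pull.result(timeout=30)
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()
```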