Posted 18 September
This listing is no longer accepting applications
Working hours
Not specified
Contract type
Not specified
Salary
Salary not specified
Minimum education
Not specified
Level
Not determined
Number of vacancies
1
Number of applicants
1
Job description
  • Multinational company | Career progression

One of the world's leading professional services companies, transforming clients' business, operating and technology models for the digital era.




  • Develop and optimize ETL processes by working closely with multiple data partners and stakeholders across the company.

  • Define technical requirements and data architecture for the underlying data warehouse.

  • Collaborate with subject-matter experts across different business units to design, implement and deliver insightful analytic solutions.

  • Maintain and improve the big data architecture.

  • Experience with data integration toolsets and with writing and maintaining ETL jobs.

  • An understanding of how the big data platform will benefit the business, and experience communicating this to stakeholders.


  • 100% remote

  • Good salary package

  • Career progression

  • Benefits



Minimum requirements
  • Degree in a quantitative field (Statistics, Mathematics, Physics, Engineering, Computer Science...). A Master's degree or PhD in any of these disciplines is a plus.

  • 3+ years of relevant work experience in a similar position.

  • Proficiency with relational databases is a must.

  • Strong hands-on experience implementing a data lake with technologies like Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB and Purview.

  • Experience using big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elasticsearch, Impala, Sqoop, etc.

  • Strong programming and debugging skills in either Python or Scala/Java. Experience building REST services is good to have.

  • Experience supporting BI and Data Science teams in consuming data in a secure and governed manner.

  • Good understanding of and experience using CI/CD with Git and Jenkins / Azure DevOps.

  • Experience setting up cloud-computing infrastructure solutions.

  • Hands-on experience with or exposure to NoSQL databases and data modelling in Hive. Adept at DevOps and CI/CD methodologies.


