Challenge

Expertise in large distributed Big Data application deployment and ETL process implementation (11774)


  • We are searching for a professional with expert knowledge of:

    o Practical experience with the Agile Scrum methodology.
    o Expertise in deploying large distributed Big Data applications and implementing ETL processes.
    o Good programming knowledge of Scala for Big Data processing (Python is a plus).
    o Experience with data modeling processes to serve business users.
    o Experience with Maven / Gradle (artifact generation).
    o Demonstrable experience in big data solution design.
    o Proven experience using Git.
    o Experience with Azure DevOps, CI/CD, and Artifacts.
    o At least 3 years' experience coding in Spark-Scala.
    o SQL databases (SQL Server, PostgreSQL, or Hive).
    o Proven experience with Databricks (mainly Azure Databricks) implementing ETLs and other data processing.
    o Experience monitoring processes in Azure Data Factory and Databricks.
    o Proven experience with Azure components: Azure Data Lake Gen2, Azure Data Factory.
    o Basic knowledge of Delta Lake.
    o Basic knowledge of Linux systems.

     

    Responsibilities
    • Big Data support (8:00 am - 5:30 pm)
    • Monitoring of incidents arising from use cases
    • Incident resolution at the code and infrastructure level

    Profile
    • At least 1 year's experience in Big Data projects
    • Expert in and passionate about Big Data technologies
    • Proactive and a team player

About the Client:

  • Location: Work from anywhere in the EU.
  • Client: A leader in digital transformation in sectors such as banking, insurance, industrial, and automotive.
  • Technologies: Big Data, Blockchain, AI, Cloud, and more.
  • Global Presence: In over 15 markets.
  • Team: More than 8,000 professionals.
  • Experience: Over 35 years in the market.

Contractual Process and Relationship:

  • Your application will be sent to the technical and human resources department. If selected, you will meet with the technical team to learn about the project details. If it's a mutual fit, you will receive an offer to join us.
  • Based on the interview, a category is established and, depending on it, we make an economic proposal with two options for this opportunity:
    • If you are a freelancer, between €25 and €31 per hour.
    • If you prefer a permanent contract, between €30,000 and €40,000 per year.
  • The process can take between two weeks and two months.

