Big Data Developer
Deutsche Bank AG
Bucharest, Romania
8 days ago

Position Overview

What do you get out of this role?

If you want to apply and enhance your expertise in Big Data, develop Scala pipelines, deliver Spark performance solutions, and work with Elasticsearch, Flume, and Kibana in an environment as vast as Deutsche Bank, this is the right time and job for you.

We use the latest architectures and technologies in Big Data and Business Intelligence to deliver relevant data analysis to financial departments and to meet regulatory commitments.

In line with Deutsche Bank’s strategic approach of working with the latest innovative solutions, initial prototypes have become the norm for many projects.

Your key responsibilities

  • Build distributed, reliable, and scalable data pipelines to ingest and process data in real time; as a Hadoop developer you will work with event streams, transaction behaviour, clickstream data, and other unstructured data (see the streaming sketch after this list).
  • Load data from different datasets and decide which file format is most efficient for a task; source large volumes of data from diverse data platforms into the Hadoop platform.
  • Fine-tune Hadoop applications for high performance and throughput.
  • Configure and maintain enterprise Hadoop environment.
  • Understand the requirements of input to output transformations.
  • Define Hadoop job flows and implement data workflows using the Hadoop ecosystem.
  • Design and implement column family schemas of Hive / Impala within HDFS.
  • Assign schemas and create Hive / Impala tables.
  • Develop efficient Hive / Impala scripts with joins on datasets using various techniques (see the schema-and-join sketch after this list).
  • Assess the quality of datasets for a Hadoop data lake.
  • Troubleshoot and debug any Hadoop ecosystem runtime issue.
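
The first bullet above describes a real-time ingestion pipeline. As a purely illustrative sketch (not Deutsche Bank code), here is what such a pipeline might look like in Scala with Spark Structured Streaming; the Kafka broker address, topic name, event schema, and HDFS paths are all assumptions:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{StringType, StructType, TimestampType}

object ClickstreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("clickstream-ingest")
      .getOrCreate()

    // Illustrative event schema; real clickstream fields will differ.
    val schema = new StructType()
      .add("userId", StringType)
      .add("url", StringType)
      .add("eventTime", TimestampType)

    // Read the raw stream from Kafka (broker address and topic are assumptions).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "clickstream")
      .load()

    // Parse each Kafka message's JSON payload into typed columns.
    val events = raw
      .selectExpr("CAST(value AS STRING) AS json")
      .select(from_json(col("json"), schema).as("event"))
      .select("event.*")

    // Land the parsed events on HDFS as Parquet; the checkpoint makes the
    // pipeline restartable without losing or duplicating data.
    events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/clickstream")
      .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
      .start()
      .awaitTermination()
  }
}
```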
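
The Hive / Impala bullets describe schema design and join tuning. A minimal sketch of that kind of work with Spark SQL and Hive support follows; the table names, columns, partition layout, and broadcast hint are illustrative assumptions, not details from this posting:

```scala
import org.apache.spark.sql.SparkSession

object HiveSchemaAndJoin {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport assumes a Hive metastore is configured on the cluster.
    val spark = SparkSession.builder()
      .appName("hive-schema-join")
      .enableHiveSupport()
      .getOrCreate()

    // A partitioned, Parquet-backed fact table: a typical data-lake layout
    // that lets queries prune whole partitions instead of scanning everything.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS transactions (
        txn_id     STRING,
        account_id STRING,
        amount     DECIMAL(18, 2)
      )
      PARTITIONED BY (txn_date DATE)
      STORED AS PARQUET
    """)

    // A small dimension table.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS accounts (
        account_id STRING,
        segment    STRING
      )
      STORED AS PARQUET
    """)

    // Join pruned to one partition; broadcasting the small dimension table
    // avoids shuffling the large fact table across the cluster.
    spark.sql("""
      SELECT /*+ BROADCAST(a) */ a.segment, SUM(t.amount) AS total
      FROM transactions t
      JOIN accounts a ON t.account_id = a.account_id
      WHERE t.txn_date = DATE '2024-01-15'
      GROUP BY a.segment
    """).show()
  }
}
```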

What you bring to the team

  • At least 5 years of professional experience in Business Intelligence solutions: Data Warehousing, ETL, Reporting, and Analytics
  • Data modelling experience with OLTP and OLAP
  • At least 2 years of experience with Hadoop engines.
  • Experience with Spark SQL and Scala
  • Expertise in the Java essentials for Hadoop.
  • Good knowledge of concurrency and multi-threading concepts.
  • Good knowledge of SQL, database structures, principles, and theories.
  • Experience in working with complex data structures
  • Able to meet deadlines and to work within virtual teams in matrix organizations
  • Good English
  • University degree in Computer Science, Software Engineering, Information Technology, or a related technical field.

Nice to have:

  • Knowledge of Elasticsearch, Flume, Kibana, HBase, Pig, and Sqoop
  • Understanding of data visualization tools such as Tableau.

We offer

  • Hi-tech working environment
  • Career development
  • International exposure
  • Attractive and competitive compensation and benefits