Big data engineer - Datalake Core team
ING Romania
Bucureşti
5 days ago

The Mission:

Data is a game-changer. It’s all around us. Implementing a uniform approach to how we manage data throughout the bank allows us to enhance the customer experience, lower IT operational costs, and report swiftly and correctly to external regulators.

Your Day-to-Day:

Contributing to the design, implementation, and lifecycle management of the Data Lake by:

  • Administer, operate, and deploy cloud environments
  • Automate deployment, testing, and configuration for the Hortonworks ecosystem (HDP 2.6 and 3.1, HDF 3.4)
  • Research and develop new features and components; integrate new projects such as Airflow, Druid, and Ceph
  • Migrate to the new CDP (Cloudera Data Platform) and CDF (Cloudera DataFlow)
  • Achieve high availability and disaster recovery across two data centers
  • Optimize and automate workflows and activities
  • Tune performance and ensure high availability of services
  • Integrate and/or engineer new technologies; configure or add new services as necessary
  • Perform preventive maintenance (monitor system performance and utilization, data transfers, backups, etc.)
  • Deploy changes through the dev, test, UAT, and prod environments (see the health-check sketch after this list)
  • Handle fault, problem, incident, and configuration management, including standby duties outside business hours
  • Carry out fault handling and corrective actions (troubleshooting, reporting, and improvement plans)
  • Provide work instructions and process improvements
  • Travel to Amsterdam will be required.
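
As an illustration of the automation this role involves, here is a minimal sketch of a post-deployment health check against the Ambari REST API (Ambari appears in the tools list below). The host, cluster name, and credentials are placeholders, not details from this posting:

    # Minimal sketch: verify all cluster services report STARTED after a
    # deployment. Host, cluster name, and credentials are assumptions.
    import sys
    import requests

    AMBARI_URL = "http://ambari.example.com:8080"   # placeholder host
    CLUSTER = "datalake"                            # placeholder cluster name
    AUTH = ("admin", "admin")                       # placeholder credentials

    def services_not_started():
        """Return the names of services whose state is not STARTED."""
        resp = requests.get(
            f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/services",
            params={"fields": "ServiceInfo/state"},
            auth=AUTH,
            timeout=10,
        )
        resp.raise_for_status()
        return [
            item["ServiceInfo"]["service_name"]
            for item in resp.json()["items"]
            if item["ServiceInfo"]["state"] != "STARTED"
        ]

    if __name__ == "__main__":
        broken = services_not_started()
        if broken:
            print("Services not running:", ", ".join(broken))
            sys.exit(1)  # non-zero exit lets a pipeline fail the promotion
        print("All services STARTED")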

Tools used in your day-to-day activities:

  • Big data: Apache Hadoop ecosystem (Ambari, Ranger, Hive, HDFS, Spark/Flink, NiFi)
  • OS: Red Hat Enterprise Linux Server
  • Databases: MySQL, PostgreSQL
  • Continuous integration: Jenkins/TFS
  • Continuous deployment and delivery: Ansible
  • Versioning: Git
  • Real-time streaming: Apache Kafka (consumer sketch below)
  • Monitoring: Kibana/Logstash (data visualization for Elasticsearch), Graphite for server and infrastructure metrics, Axway Sentinel for data-flow monitoring
  • Content management: Confluence
  • Agile planning and management: ServiceNow
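
To make the streaming entry concrete, here is a minimal consumer sketch using the kafka-python client; the broker address, topic, and consumer group are illustrative assumptions, not values from this posting:

    # Minimal sketch: consume a Kafka topic and print each message.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "transactions",                      # placeholder topic name
        bootstrap_servers=["broker1:9092"],  # placeholder broker address
        group_id="datalake-monitoring",      # placeholder consumer group
        auto_offset_reset="earliest",        # start from the oldest offset
        value_deserializer=lambda v: v.decode("utf-8"),
    )

    # Iterating over the consumer blocks and yields messages as they arrive.
    for message in consumer:
        print(f"{message.topic}:{message.partition}@{message.offset} {message.value}")
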
What you bring to the team:

  • In-depth experience in administering, tuning, and debugging Hadoop ecosystems
  • In-depth experience in administering Linux systems
  • In-depth experience with shell scripting and Python (see the sketch after this list)
  • Experience with CI/CD tools
  • Experience in processing very large data sets
  • General experience in database administration (e.g., Oracle) and SQL knowledge
  • General experience as a programmer is a plus
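
As a sketch of the shell/Python automation this asks for, a small check that alerts when HDFS capacity usage crosses a threshold; the threshold value and the assumption that the hdfs CLI is on PATH are illustrative:

    # Minimal sketch: fail with a non-zero exit code when HDFS usage is high.
    import subprocess
    import sys

    THRESHOLD_PCT = 80.0  # placeholder alerting threshold

    def hdfs_usage_percent():
        """Parse `hdfs dfs -df /` (columns: Filesystem, Size, Used, Available, Use%)."""
        out = subprocess.run(
            ["hdfs", "dfs", "-df", "/"],
            capture_output=True, text=True, check=True,
        ).stdout
        # The second line holds the numbers; Use% is the last column, e.g. "42%".
        fields = out.splitlines()[1].split()
        return float(fields[-1].rstrip("%"))

    if __name__ == "__main__":
        used = hdfs_usage_percent()
        print(f"HDFS usage: {used:.1f}%")
        if used > THRESHOLD_PCT:
            sys.exit(1)  # non-zero exit so cron or monitoring can raise an alert
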
Nice to have:

  • Experience working with real-time streaming
  • Experience with automated testing
  • Involvement in open source