Consider a new opportunity
Project description:
Worldpay is one of the world's largest payments (fintech) businesses and has put data at the centre of the business by creating a new Data Transformation team that spans business and technology.
At the heart of this is our large Hadoop-based Enterprise Data Platform, which will be used to provide data insights to both our internal teams and our customers.
The Enterprise Data Platform is an innovative use of Big Data technologies to provide powerful analytic capabilities for Worldpay.
A key task is to load data from multiple operational systems into our Data Lake, and to combine this data in ways that were not possible before.
Much of the data is highly structured, while some has the flexible characteristics more typical of big data. Because of the large amount of structured data, we use HiveQL (essentially SQL) as well as Hadoop's built-in data loading tools for much of the processing.
If you meet the skills and experience described below, this is a great opportunity to gain skills in the Hortonworks Data Platform and its related tools in an innovative and passionate environment.
Key Accountabilities:
Work within a dynamic development team (Bucharest and London)
Build data loading and transformation jobs using the Hortonworks toolset
Build data migration scripts to take all data stores in the cluster from one release to the next, without loss of data
Build tests to verify that loads and transformations work as intended
Follow agreed patterns to ensure consistency throughout the implementation
Skills & Experience:
Bachelor's degree in Computer Science or a related field, or equivalent experience.
Very good knowledge of standard SQL (SQL-92), gained using large-scale database systems such as Oracle, Netezza, DB2 or Sybase.
Experience of data loading, either scripting or using ETL technologies
Development experience in an agile environment
Development experience in an environment using strict source-code control and release procedures.
A habit of thorough developer testing
The ability and willingness to create and maintain concise, accurate, readable, relevant documentation on our wiki (we use Confluence; knowledge of other wiki systems will be useful).
The discipline of working with a ticketing system
A good understanding of UNIX systems, especially Linux.
The ability to develop reliable, maintainable, efficient code in most of: SQL, Linux shell, Java and Python.
The ability and willingness to learn the many tools in the Hortonworks Data Platform.
An understanding of file formats including CSV, XML and JSON, as well as the related standards and technologies.
The ability to prioritize effectively in order to be productive in a highly dynamic environment.
Good problem solving and communication skills.
Experience of both batch and real-time processing.
Ability to communicate to a wide audience of professionals internally and externally.
Knowledge of the global payments industry or a good understanding of the payments transaction life cycle is an advantage.
Highly motivated self-starter and mentor to team members.
Excellent organizational, communication and time management skills.
Technical Knowledge:
Very strong experience working in a UNIX / Linux environment
Strong Java knowledge
Strong shell scripting skills
Advanced SQL (ideally also PL/SQL or T-SQL)
Strong ETL / data warehouse skills
Good data modelling skills (relational and dimensional)
Good analytical skills; working experience with data warehouses and complex reporting solutions
Experience with Jira and Confluence
Experience with versioning systems (Git)
Understanding of Hive (HQL), HBase, Sqoop
Desirable Technical Knowledge:
Knowledge of Scala and Functional Programming
Strong understanding of Hadoop ecosystem and at least one commercial Hadoop distribution
Hands-on Oracle Database development experience
Development experience with Perl
Previous experience with Spark
Previous experience with Kafka
Previous experience with CDC systems (GoldenGate, Attunity)