We’re expanding the Data team and are looking for people passionate about data, with attention to accuracy and latency, and with interests in Machine Learning, predictive analytics, Kafka-based stream consumption, R, and cloud architecture.
The Role
We are looking for a Database Engineer with proven data integration skills to develop and support Integration, ETL (ELT), Data Warehouse and Reporting solutions, as well as data-driven and ML-based services, such as personalisation or customer profiles, offered as APIs and streams to other business units.
Working within an agile environment, you will team up with Data Warehouse, ML, data services and QA engineers and with system integrators on projects prioritised by various areas of the business.
The successful candidate will have strong technical and problem-solving skills and a positive, results-driven attitude, and will be a strong communicator capable of interacting with both technical and non-technical people.
Key Responsibilities / Duties :
Work as an effective member of a scrum team - understanding and contributing to the agile delivery process, and taking ownership of your team’s software quality from concept to production
Promote the production of reusable code and modules where appropriate, to make the most of development effort and reduce costs, applying the principles of agile delivery
Ensure risks and issues are identified in a timely manner and effectively communicated to the Data Delivery Manager, together with proposed resolution and mitigation strategies
Write and maintain functional and technical specifications
Monitor, optimise and troubleshoot database, microservice, stream and ML service performance
Analyse code for problem resolution
Thorough, demonstrable unit testing
Experience & Qualifications :
Essential
Proven development skills in at least one of the following :
o Microservices
o Kafka or another message-based streaming platform
o Machine learning and analytics in either Python or R
o Unix scripting
o Open-source NoSQL technologies (e.g. MongoDB, CouchDB, ElasticSearch)
o Talend
o AWS (preferable), or experience of data engineering on another leading cloud vendor such as GCP or Azure
Good SQL knowledge and experience of relational databases
Dimensional data modelling
Good knowledge of object-oriented or functional programming in either Python or Java
Experience of database performance analysis and design
Unit testing knowledge
Demonstrable experience with high-volume data loads (terabytes and above)
Knowledge of ETL from highly transactional OLTP systems (thousands of records per second)
Exposure to Continuous Delivery / Continuous Integration tools
A proven ability to influence technical decisions in a fast-moving commercial environment
Demonstrates exceptional communication and interpersonal skills, and consistently high energy levels
Desirable :
Experience of large data warehouse (10 TB+) with multiple sources and outputs
Knowledge of the online gaming / gambling industry
Educated to degree level in a science or technology related field
Key Skills and Attributes :
Proactive work ethic with the ability to deliver results and meet challenging deadlines
Passion & flexibility to work the hours required to see projects through to completion in a timely, accurate & efficient manner.
Attention to detail with a high degree of pride in work produced.
Proven ability & desire to innovate.
Strong analytical skills.
Enthusiasm for the software development process.
Good English language skills.