As Acronis is dedicated not just to Cyber Protection but to the general protection of its current and potential employees, the recruitment and onboarding processes are being held online during the current global COVID-19 situation.
Acronis leads the world in cyber protection, solving safety, accessibility, privacy, authenticity, and security (SAPAS) challenges with innovative backup, security, disaster recovery, and enterprise file sync and share solutions that run in hybrid cloud environments: on-premises, in the cloud, or at the edge.
Enhanced by AI technologies and blockchain-based data authentication, Acronis protects all data, applications and systems in any environment, including physical, virtual, cloud, and mobile.
With dual headquarters in Switzerland and Singapore, Acronis protects the data of more than 5 million consumers and 500,000 businesses in over 150 countries and 20 languages.
ACEP DataWarehouse is a large project that serves as a source of insights for product managers about product usage and other user metrics.
It is also a data source for the CyberCube data warehouse. The Data Engineer in the Performance Engineering team will be responsible for the overall data warehouse schema design, source management, and data flow control.
Internal Data Warehouse business model ownership
Data schema design and ownership, including comprehensive schema documentation
Investigation of current products' data flows and operational databases, covering data semantics, lineage, etc.
Mediation between the Services team, DWH developers, and the BI team to quickly resolve issues or introduce changes in the data model
Data governance & Data quality
Introducing automated data quality control, including anomaly and data loss detection
Business support for data loss incident troubleshooting and data migration
Defining requirements for system and service analytics
Data analysis
Manual and automated DWH data analysis and insight discovery
Searching for hidden dependencies (correlations) in client behaviour, based on their cohorts, when interacting with the product
Participation in the creation of business metrics and their visualisation (dashboards) for decision making
SKILLS & EXPERIENCE:
3+ years of experience with Data Warehouse architecture and design, data flow design, and ETL implementation
Deep knowledge of SQL; experience implementing, profiling, and optimising complex queries
Python: experience with NumPy, Pandas, scikit-learn, or similar
Would be a plus:
Experience with the Hadoop stack, Hive, and PySpark
Experience with TensorFlow (Keras) or PyTorch
Experience with tools: Git, Confluence, Jira, DevOps
WE OFFER:
Supplemental health insurance
Tickets for conferences and seminars
Challenging atmosphere and interesting projects
Future career development in a multinational company