Technical Requirements:
A minimum of 3 years of experience, ideally in a start-up and/or with innovative technology companies;
Agile project management, especially Scrum and/or Kanban;
Data analysis using Python or Scala;
Statistical methodologies;
Knowledge of one or more BI tools (preferably Spotfire, but knowledge of other tools such as SAP BI, Tableau, or QlikView/QlikSense is acceptable);
General knowledge of the Hadoop environment: Hue, Hive, Spark.

Considered a plus:
Knowledge of DevOps principles and tools (GitLab CI, Jenkins);
Knowledge of a development environment (Eclipse, IntelliJ).

Job Responsibilities:
Analyze the data available in the operational systems or in the Datalake according to business needs, in collaboration with the architects and Business Analysts;
Manage data integrations into the Datalake (on-premise or cloud) where necessary;
Build the target analytical model to meet business needs;
Implement dashboards directly, or support the business teams in implementing them;
For specific needs or exploratory work, perform ad-hoc analysis (SQL, R, Python).