How would you like to work in IT & Communication, toying with cutting-edge technologies and enjoying your life? Come closer to #LifeAtOrange.
We will recruit you from the safety of your home and prepare you for the challenges of this period: for the time being, our activity is carried out remotely.
What we're looking for
Big Data and Artificial Intelligence are today powerful drivers for the Orange Group, enabling us to
reinvent the customer relationship, optimize and automate the management of our networks, improve
the customer experience and provide our customers with a competitive advantage.
In this context, the main mission of the new Data & AI Division is to transform Orange into a "data-driven" company, to define the Group's standards in terms of Data & AI, and to accelerate the
development of use cases, data products and services. This Division will be responsible for the entire Data & AI value chain.
Within Data & AI, in the AIT Tools & Technology Department, we cover the full span from Research to Delivery, providing technologies for extracting and understanding knowledge from voice, text, social data and dialogue. We
support and accelerate the Group's transformation thanks to our own cloud infrastructure, security
expertise and data.
The CARS (Cloud Architecture & Security) team is in charge of supporting this transformation,
especially by leveraging the Public Cloud as a catalyst. We assist countries both with expertise (consulting, governance, blueprint writing, etc.) and with operational projects (move-to-cloud
projects, CloudOps implementation, data warehouse and data lake migration, Cloud security and
administration, etc.) as needed.
What you'll be doing
You will be a Data / BI Engineer in charge of Data / BI projects on Public (mainly GCP) or Hybrid Cloud environments
for the different Orange countries.
Your main missions will be to:
Support the teams of the Business Units and subsidiaries in their Data move-to-cloud projects:
o understand the data needs (batch, real-time / event streaming) and constraints
o design Cloud BI data models
o design and implement / develop data extraction, transformation and loading
processes by writing custom data pipelines
Take charge of end-to-end project design and build of analytics activities
Participate in the various technical activities of projects to develop and maintain your operational
skills (data work, integration, platform administration)
You will be part of an international team (half in France and half in Romania) that leverages collective
intelligence and expertise to create value in the countries.
It is composed of about 25 people: a dozen Cloud and Data architects, engineers (DevOps, ML,
Data / BI, Admin) and security experts.
The creation of this new position will allow us to start new projects and reinforce the already existing
ones. You will be involved in the different stages of the projects (study, development, production)
and you will work on :
Participating in move-to-cloud projects from on-premise databases to Google Cloud Platform.
Contributing to the implementation of new data platforms on GCP, including leveraging new tools.
Contributing to data warehouse and analytics projects involving BigQuery, Apache Beam (Dataflow), Cloud Composer (Apache Airflow), etc.
Designing, developing, testing, optimizing, scaling and deploying BI solutions on Google Cloud Platform.
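As an illustration only (not part of the role description), the custom data pipelines mentioned above follow a simple extract-transform-load shape. This stdlib-only Python sketch stands in for what would typically be an Apache Beam / Dataflow job in practice; all names and the sample data are hypothetical:

```python
import csv
import io
import json

def extract(raw_csv: str):
    """Extract: parse raw CSV rows (a real job would read from Cloud Storage)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast types and filter rows, akin to a Beam ParDo step."""
    for row in rows:
        amount = float(row["amount"])
        if amount > 0:  # drop refunds / invalid rows
            yield {"customer": row["customer"], "amount": amount}

def load(records):
    """Load: serialize to newline-delimited JSON, the format BigQuery batch loads accept."""
    return "\n".join(json.dumps(r) for r in records)

raw = "customer,amount\nalice,10.5\nbob,-3.0\ncarol,7.25\n"
ndjson = load(transform(extract(raw)))
print(ndjson)
```

In a managed setup the same three stages would map onto a Dataflow pipeline orchestrated by Cloud Composer, with the load step replaced by a BigQuery load job.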
What you need to know / have
You have:
Strong skills in SQL (e.g. BigQuery / T-SQL / Oracle / Teradata SQL), Data Appliances, Cloud Data
Warehouses and Data Lakes.
Strong knowledge of designing and implementing data processing pipelines with ETL / ELT tools
such as Stitch, Informatica, Talend, Dataform, AWS Glue, Google Dataflow or Apache Spark.
Experience programming in one or more general-purpose programming languages, with a
preference for Python, Java or Scala (specifically for Apache Spark or Dataflow development).
Good knowledge of Data Visualisation technologies such as Looker, Tableau, Business Objects,
Power BI, QlikView.
Experience with CI / CD toolsets such as Jenkins, Azure DevOps or GitLab is preferred.
A good knowledge of Infrastructure as Code tools (Terraform, Ansible, Helm, APIs,...)
A good knowledge of the full GCP stack (knowledge of AWS or Azure would be a
plus), via their APIs or CLIs.
A good knowledge of serverless solutions (Cloud Functions, AWS Lambda) and of PaaS
solutions related to Big Data (BigQuery, Amazon Redshift, Azure Synapse, ...).
A good understanding of data lakes, database design, data warehouses and data modeling
(star / snowflake schemas). Skills in data modeling in BigQuery would be a real plus.
Good communication (both verbal and written) and stakeholder management skills, gained
through collaboration with technical peers, non-technical project team members and users.
An advanced GCP certification (such as Data Engineer) would be a plus.
You are:
Responsive to requests and creative in your responses and proposals
Pedagogical, eager to learn and able to share your knowledge
Autonomous in your work and able to make proposals
Appreciative of teamwork and familiar with the Agile development method
Aware of the constraints of project planning and able to deliver on time
You have:
The ability to analyze and solve business problems, with a strong results-oriented culture.
An excellent ability to work in English in an international context (French would be a real plus).
A synthetic mind and the ability to explain things in layman's terms.
A taste for dialogue with your "client": understanding their needs, business context and
constraints, and helping them succeed.
What's in it for you
Who are we?
Orange Services is one of the largest technology hubs in the Orange Group, working internationally for both Orange corporate functions and country operations. As a technology services company, IT is in our DNA, but our teams also work in other domains including mobile networks and a number of commercial and business functions.
You will work for a #TopEmployer company. You could be part of an organization where great colleagues and team spirit support your professional development. Apply, and let's have a remote-talk.