Data Engineer
Cluj-Napoca, Romania
1 day ago


Here is a breakdown of the most important parts of your role:

  • Work alongside the Data Analyst to define the data processing and validation flows for dashboards;
  • Design and develop data flows to support large data sets and meet business requirements;
  • Build processes that support data transformation, data structures, metadata, and dependencies;
  • Propose and develop solutions for the Relational and Dimensional Model based on the requirements;
  • Implement or optimize complex solutions based on requirements, both on-premise and in the cloud;
  • Monitor and maintain the data flows in optimal condition;
  • Work with both structured and unstructured data;
  • Propose solutions that improve the data flows through automation of manual processes, data validation, etc.
Qualifications and Experience

    We are looking to expand our team with open, thoughtful, and adaptable colleagues who have 3+ years of working experience with the following:

    Mandatory skills:

  • Strong background in PL/SQL;
  • Working experience with the MS stack (MS SQL Server, SSIS, SSAS, SSRS) or the Oracle stack;
  • Working experience in ETL;
  • In-depth analysis of query execution plans;
  • Knowledge of the Kimball model;
  • Agile delivery experience;
  • The ability to take the initiative, drive the project and innovate;
  • A proactive attitude towards solving problems;
  • Designing and developing different kinds of reports, such as analysis, ad hoc, and standard reports;
  • Experience in tackling performance tuning at physical/database level (database settings and options);
  • Knowledge of the principles of database design, data acquisition, and data access analysis and design;
  • Team oriented attitude and the ability to work well with others in order to achieve a common goal;
  • Good spoken and written English language skills.
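
    One of the skills above is in-depth analysis of query execution plans. As a minimal, self-contained sketch of what that involves, the snippet below uses SQLite's EXPLAIN QUERY PLAN as a stand-in for SQL Server's SHOWPLAN or Oracle's EXPLAIN PLAN; the table, index, and column names are invented for this illustration:

```python
import sqlite3

# Self-contained illustration of inspecting a query execution plan.
# SQLite stands in here for SQL Server (SHOWPLAN) or Oracle (EXPLAIN PLAN);
# the table and column names are made up for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With an index on the filtered column, the planner can seek instead of scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = ?", (42,)
).fetchall()
for row in plan:
    print(row)  # each row describes one plan step
```

    On an indexed equality predicate the plan reports a SEARCH using the index rather than a full-table SCAN; spotting that difference, and knowing which indexes or settings change it, is the core of execution-plan analysis.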
    Desirable skills:

  • Power BI, the Tableau suite, and/or Pentaho is a plus;
  • Knowledge of data warehouse concepts is a plus;
  • Experience with Apache Spark and Python programming is a plus;
  • Experience with one of the major Cloud providers (AWS, Azure, Google Cloud) is a plus;
  • Experience in developing data processing tasks using pySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations, is a plus.
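
    The pySpark workflow named in the last bullet (read, merge, enrich, load) can be sketched end to end. Since a real Spark session needs a JVM and cluster setup, the sketch below is a plain-Python analogue of the same four steps; the pySpark equivalents (spark.read, DataFrame.join, withColumn, DataFrame.write) are noted in comments, and all table and column names are hypothetical:

```python
import sqlite3

# Plain-Python analogue of the four pipeline steps named in the posting:
# read from external sources, merge, enrich, load into a target.
# (In pySpark: spark.read..., DataFrame.join, withColumn, DataFrame.write.)
# All names below are hypothetical, for illustration only.

# 1. "Read" two external sources (inlined so the sketch is self-contained).
customers = [{"customer_id": 1, "name": "Ana"}, {"customer_id": 2, "name": "Dan"}]
orders = [
    {"order_id": 10, "customer_id": 1, "total": 99.5},
    {"order_id": 11, "customer_id": 2, "total": 15.0},
]

# 2. Merge: an inner join on customer_id.
by_id = {c["customer_id"]: c for c in customers}
merged = [{**o, **by_id[o["customer_id"]]} for o in orders if o["customer_id"] in by_id]

# 3. Enrich: derive a new column from existing ones.
for row in merged:
    row["high_value"] = row["total"] > 50

# 4. Load into a target destination (an in-memory SQLite table here).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_orders (order_id INTEGER, name TEXT, total REAL, high_value INTEGER)"
)
conn.executemany(
    "INSERT INTO fact_orders VALUES (:order_id, :name, :total, :high_value)", merged
)
loaded = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
print(loaded)  # → 2
```

    In pySpark the same shape holds, but each step is distributed: the join and the derived column run across partitions, and the write targets a warehouse table or object store rather than SQLite.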
Additional Information

  • Private health insurance, monthly sports allowance & meal tickets;
  • Loyalty annual leave days & replacement days for weekend bank holidays;
  • Various options for purchasing Endava shares at a preferential price;
  • Referral bonus;
  • Work from home and flexible working hours;
  • Multiple offers and employee discounts;
  • Ongoing learning opportunities - complex projects, trainings, coaching, conferences, workshops, certifications, online learning platform subscriptions, etc.;
  • Diverse company social life - events, sports tournaments, team buildings, passion groups (e.g. hiking, book club, green team);
  • Multicultural environment - working with colleagues and clients across different countries.