Data Engineer (with Snowflake)
HireOne IT Recruitment
Cluj-Napoca, Romania
1 day ago
Source: Just Join IT

MongoDB (nice to have)

NoSQL (nice to have)

Data Warehouse (regular)

Business Analysis (regular)

Relational Databases (regular)

Snowflake (advanced)

Is there any experienced Data Engineer here?

Our client is a forward-thinking, growth-oriented healthcare service and technology company that provides state-of-the-art pharmacy solutions.

Vertically integrated, they develop a PBM platform that delivers healthcare solutions, a direct-to-consumer product whose mission is to make prescription drugs more affordable, and an analytics platform that provides actionable insights in real-time.

We are looking for a Data Engineer to work on the analytics platform.

Your responsibilities will be:

  • Analyze and interpret complex data and provide resolutions to data issues.
  • Coordinate with data analysts to validate requirements and conduct interviews with users and developers.
  • Test and validate data flows and prepare ETL processes according to business requirements.
  • Perform ETL and SQL tuning.
  • Perform data modeling and schema design, including dimensional and big data modeling.
  • Design and implement a data conversion strategy from legacy to new platforms.
  • Perform design validation, reconciliation, and error handling in data load processes.
  • Design and prepare technical specifications and guidelines, including ER diagrams and related documents.

These qualifications will definitely help you:

  • Knowledge of AWS infrastructure, including S3, SNS, EC2, CloudWatch, and RDS.
  • 1+ years of experience with the Matillion ELT tool and the Snowflake database.
  • 5+ years of experience with ETL / data transformation and one or more related products, such as Informatica and SSIS.
  • 4+ years of work experience in business intelligence, data warehousing, and logical / physical model design.
  • Advanced SQL knowledge and experience working with relational databases, including query authoring and working familiarity with a variety of databases.
  • Experience building AWS data pipelines using Python / Scala, Apache Spark, Spark SQL, S3 data lakes, Snowflake, and Redshift.
  • Good to have: experience with a NoSQL data store such as Elasticsearch, MongoDB, DynamoDB, or Cassandra.

    Salary: 30,000 PLN net

    Type: 100% remote

    Additional: 20 days off, fully paid

    I'm waiting for you!
