Job Details

You would join a Research team that provides technical support to all of our existing clients. The team works with the existing solution daily, according to an agreed service plan covering: maintaining the status quo, small requests and minor changes, planned/agreed changes, and platform updates.

Job Responsibilities

  • Ensuring data processes run on schedule.
  • Performing root cause analysis of ETL failures and fixing them at the root cause level.
  • Communicating with clients regarding changes in source data or connectivity issues.
  • Optimizing the ETL schedule according to system load and other requirements.
  • Creating new ETL processes.

Job Requirements

  • Attention to detail.
  • Fundamental RDBMS knowledge.
  • SQL knowledge.
  • Basic understanding of how data pipelines work (what ETL means).
  • Knowledge of at least one programming language (Python, Scala, Java).
  • PL/SQL.

Skill & Experience

  • 3+ years of hands-on experience with the Hadoop ecosystem and its tools.
  • Spark knowledge.
  • 3+ years of hands-on experience with Azure, Google Cloud, or AWS services.
  • Familiarity with Jira and Confluence in your workflow.

Job Overview

  • Location:


  • Job Title:

    Sr. Big Data Engineer

  • Hours:

    48h / week

  • Rate:

    €12.5 - €18.75 / hour

  • Salary:

    €24k - €36k net

Apply Now