For our Client, a well-known Financial Institution, I am currently looking for a Data Engineer with an outstanding attitude, business courage, and strong knowledge of online payments.
Data Engineer
Responsibilities:
- Data Lakehouse management and development,
- operational work on maintaining the solution and implementing further improvements and expansions,
- liaising with DataOps and BI Developers,
- integration of systems and data,
- building data flows and orchestrating tasks,
- creating streaming data pipelines,
- designing DWH solutions (e.g. Star Schema, Data Vault 2.0),
- building ETL or ELT processes,
- developing code for applications that process data at scale,
- technical testing of the solution.
Requirements:
- DWH / Lakehouse (Snowflake, Redshift, Databricks SQL) (nice to have),
- experience with AWS (a must),
- SQL, Scala/Java and/or Python (a must),
- Spark (nice to have),
- dbt (ELT) (nice to have),
- Apache Airflow (nice to have),
- Apache Kafka (a must),
- Data Integration (Airbyte and Kafka Connect) (nice to have),
- Kafka Streams,
- Kubernetes (on-premises, EKS),
- Docker (nice to have),
- Docker Compose (nice to have),
- Helm 3,
- Flux CD,
- Git (a must),
- GitLab CI/CD (a must),
- Terraform (nice to have).
The offer:
- 100% remote work,
- contract of employment or B2B,
- attractive projects.