For our Client, a well-known financial institution, I am currently looking for a Data Architect / Technology Lead with strong theoretical and practical knowledge of software engineering, data engineering, and building data platforms.
Data Architect / Technology Lead
Responsibilities:
- managing and supervising the Data Lakehouse development team,
- defining licensing needs and selecting technology vendors' products and services,
- general technical oversight of solution maintenance and the implementation of further enhancements and extensions,
- design and supervision of the solution architecture,
- general supervision of the data architecture and its development in the Data Lakehouse.
Requirements:
- theoretical and practical knowledge of software engineering, data engineering, and building data platforms,
- programming experience, in particular with distributed data processing (Spark), distributed data warehouses (Snowflake, Redshift), data pipeline orchestration tools (Airflow), data transformation with dbt, and streaming with Kafka,
- knowledge of CI/CD principles,
- experience working in both private cloud environments based on Kubernetes and the public cloud on AWS,
- DWH / Lakehouse (Snowflake, Redshift, Databricks SQL) (nice to have),
- experience with AWS (a must),
- SQL, Scala/Java and/or Python (a must),
- Spark,
- dbt (ELT) (nice to have),
- Apache Airflow (nice to have),
- Apache Kafka (a must),
- Data Integration (Airbyte and Kafka Connect),
- Kafka Streams,
- Kubernetes (on-premise, EKS),
- Docker (nice to have),
- Docker Compose (nice to have),
- Helm 3,
- Flux CD,
- Git (a must),
- GitLab CI/CD (a must).
The offer:
- hybrid work,
- either a contract of employment or B2B,
- attractive projects.