What's on Offer
- Working in the Poznań office or 100% remotely, in an international and multicultural environment
- International IT career
- Availability of modern tools
- Pleasant working atmosphere
- Values- and people-oriented organizational culture
Job Description
- Designing automated infrastructures that create new auto-healing capabilities
- Creation and integration of storage technology and DFS independence into the solution landscape
- Data Pipeline development leveraging DevOps standards
- Use of Continuous Integration (CI) and Continuous Deployment (CD) to build Data Engines
- Creation of secure and private data-anonymization systems using declarative programming languages that will interface between Data Silos, Data Engines, and Graph Databases. These systems are fundamental for executing AI/ML workflows that accelerate drug discovery and optimize manufacturing processes
- Creation of holistic (i.e., integrated) data views through the ingestion, cleaning, linking, harmonization, and contextualization of data from multiple systems. These views will enable our AI/ML work on complex, high-value, multi-root-cause problems
- Active involvement in all stages of the project lifecycle - from ideation to industrialization - in an Agile development environment. You will discover and develop promising new technologies collaboratively, creating Proof-of-Concepts (POCs), Proof-of-Values (POVs), and Minimum Viable Products (MVPs).
The Successful Applicant
- Bachelor's Degree in Engineering, Mathematics, Statistics, or Computer Science
- Minimum 5 years' experience as a full-time software engineer
- Expertise in Data Engineering or Site Reliability Engineering
- Expertise with non-imperative paradigms: Scala, Haskell, F#, TypeScript, or OPA Rego
- Minimum 2 years working on Big Data platforms, preferably Spark
- Minimum 3 years deploying solutions on Cloud Platforms, preferably Azure or GCP
- Infrastructure-as-Code experience: Terraform, Ansible, or cloud-native templates (Azure, GCP)
- Expertise with container technologies: Kubernetes, Helm, or Docker
- Professional DevOps experience: Jenkins, Azure DevOps, CI/CD, or JUnit
- Ability to design and implement logging, tracing, and application monitoring systems
- Experience building and maintaining APIs