What will you be doing with us?

  • You will be creating architectures for systems that process large, unstructured datasets (Data Lake Architecture, Streaming Architecture);
  • designing modern DWH / Big Data solutions in the cloud and in a Continuous Integration / Continuous Delivery environment;
  • designing data models and documenting data flows in the system;
  • working with the team as a Technical Leader, ensuring the system complies with current standards and best practices in software development.

We are looking for you if:

  • you have experience as an Architect and/or Technical Leader in Cloud or Big Data projects in the field of data processing and visualization (across different phases of system development);
  • you have practical knowledge of one of the following clouds: AWS, Azure, or GCP in the areas of Storage, Compute (including Serverless), Networking, and DevOps, backed by work on commercial projects;
  • you are familiar with at least a few of the following technologies: Cloud Dataflow, BigQuery, Cloud Dataproc, Cloud KMS, Pub/Sub, Google Kubernetes Engine, Apache Beam, Kafka, Spark, Snowflake, Data Lake Gen2, Event Hub, Data Factory, Databricks, Azure DWH, Azure Functions, Power BI, Terraform, Redshift, Glue, Athena, Kinesis, Lambda;
  • you have at least a basic command of one of the following programming languages: Python, Scala, Java, or Bash;
  • you have a good command of English (German would be an asset).