Department/project description:

Our Insights & Data practice delivers cutting-edge, data-centric solutions.

Most of our projects involve Cloud & Big Data engineering. We develop solutions that process large, often unstructured, datasets using dedicated Cloud Data services on AWS, Azure or GCP.

We are responsible for the full SDLC of each solution: in addition to using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and use DevOps tools and best practices. The data is then either exposed to downstream systems via APIs and outbound interfaces, or visualized in reports and dashboards.

Within our AI CoE, we deliver Data Science and Machine Learning projects with a focus on NLP, Anomaly Detection and Computer Vision.

Additionally, we are exploring the field of Quantum Computing, searching for practical growth opportunities for both ourselves and our clients.

Currently, over 250 of our Data Architects, Engineers and Scientists are working on exciting projects for more than 30 clients across different sectors (Financial Services, Logistics, Automotive, Telco and others).

Come on board! :)

Your daily tasks:

  • design and development of data processing solutions,
  • implementation of Data Lake and Data Mesh architectures on public cloud platforms,
  • implementation, optimization and testing of modern cloud solutions in a Continuous Integration / Continuous Delivery environment,
  • working in cloud environments using Infrastructure as Code.

Frequently used technologies:

  • Cloud Data Services: Azure / AWS / GCP
  • Python / PySpark
  • SQL
  • IaC: Terraform, CloudFormation
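
To give you a taste of the day-to-day work, here is a minimal PySpark sketch of a typical task: reading raw JSON events, cleaning them and writing an aggregated Parquet output for downstream consumers. The bucket, paths and column names are hypothetical examples, not a real client setup.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("example-etl").getOrCreate()

  # Read raw, semi-structured events (hypothetical bucket and path).
  events = spark.read.json("s3://example-bucket/raw/events/")

  # Clean and aggregate: daily event counts per customer.
  daily_counts = (
      events
      .filter(F.col("customer_id").isNotNull())
      .withColumn("event_date", F.to_date("event_ts"))
      .groupBy("customer_id", "event_date")
      .count()
  )

  # Write partitioned Parquet for downstream systems and dashboards.
  (daily_counts.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/daily_counts/"))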

Our expectations:

  • at least 3 years of commercial experience with AWS, Azure or GCP,
  • knowledge of one of the following languages: Scala/Java/Python/C#,
  • knowledge of at least one relational database system and of SQL,
  • familiarity with one or more of the following (or similar) technologies: Jenkins, Terraform, CloudFormation, Docker, Kubernetes,
  • very good command of English (willingness to learn German would be an advantage).