What will you do with us?
- you will design and implement solutions for processing large and unstructured datasets (Data Lake Architecture, Streaming Architecture),
- you will implement, optimize and test modern DWH/Big Data solutions based on the Azure cloud platform in a Continuous Integration / Continuous Delivery environment,
- you will be responsible for improving data-processing efficiency and for migrations from on-premises to public cloud platforms.
We are looking for you, if:
- you have at least 2 years of experience in Big Data or Cloud projects involving the processing and visualization of large and unstructured datasets (across different phases of the Software Development Life Cycle),
- you have practical knowledge of the Azure cloud in the Storage, Compute (including Serverless), Networking and DevOps areas, supported by commercial project experience,
- you have theoretical knowledge of the Azure cloud (for example from MS Learn courses) supported by certificates (for example DP-900, DP-200/201, AZ-204, AZ-400),
- you are familiar with several of the following technologies: Data Lake Storage Gen2, Event Hubs, Data Factory, Databricks, Azure DWH, Azure API Management, Azure Functions, Power BI,
- you have at least basic knowledge of one of these programming languages: Python, Scala, Java or Bash,
- you have a very good command of English (knowledge of German would be an advantage).