Department/project description:

Our Insights & Data practice delivers cutting-edge, data-centric solutions.

Most of our projects involve Cloud & Big Data engineering. We develop solutions that process large, often unstructured, datasets using dedicated cloud data services on AWS, Azure or GCP.

We are responsible for the full SDLC of each solution: apart from using data processing tools (e.g., ETL), we code extensively in Python, Scala or Java and apply DevOps tools and best practices. The data is exposed to downstream systems via APIs or outbound interfaces, or visualized in reports and dashboards.

Within our AI CoE, we deliver Data Science and Machine Learning projects with a focus on NLP, Anomaly Detection and Computer Vision.

Additionally, we are exploring Quantum Computing, searching for practical growth opportunities for both ourselves and our clients.

Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for more than 30 clients across sectors including Financial Services, Logistics, Automotive and Telco.

Come on board! :)

Your daily tasks:

  • designing, implementing and testing cloud computing solutions built on Snowflake;
  • creating, monitoring and optimizing ETL/ELT processes (see the sketch after this list);
  • migrating solutions from on-premises environments to public cloud platforms.
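
To give a flavour of this work, here is a minimal sketch of an ELT step driven from Python with the snowflake-connector-python package. The account, warehouse, stage and table names are hypothetical placeholders, so treat it as an illustration under those assumptions rather than a description of any specific project setup.

    # Minimal ELT sketch against Snowflake. All object names below
    # (warehouse, database, stage, tables) are hypothetical placeholders.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="COMPUTE_WH",   # hypothetical warehouse
        database="ANALYTICS",     # hypothetical database
    )

    try:
        cur = conn.cursor()
        # "EL": copy staged raw JSON files into a landing table,
        # assuming RAW.EVENTS has a single VARIANT column V.
        cur.execute("""
            COPY INTO RAW.EVENTS
            FROM @RAW.EVENTS_STAGE
            FILE_FORMAT = (TYPE = 'JSON')
        """)
        # "T": transform inside Snowflake, keeping the heavy lifting in SQL.
        cur.execute("""
            INSERT INTO CURATED.EVENTS_CLEAN
            SELECT v:id::NUMBER, v:ts::TIMESTAMP_NTZ, v:payload
            FROM RAW.EVENTS
        """)
    finally:
        conn.close()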

Frequently used technologies:

  • Snowflake
  • SQL
  • Python
  • Cloud Data Services: Azure / AWS / GCP

Our expectations:

  • at least 3 years of commercial experience working with Snowflake;
  • good knowledge of SQL and data warehousing concepts;
  • at least theoretical knowledge of one of the major cloud platforms: AWS, Azure or GCP;
  • good command of English and/or German;
  • knowledge of Apache Airflow and Python would be an advantage (a minimal Airflow sketch follows below).
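
Since Airflow is listed as an advantage, here is a minimal, hypothetical sketch of the kind of daily DAG such work involves; the DAG id, schedule and task body are placeholders, not an actual project pipeline.

    # Minimal Airflow sketch: a daily DAG with a single Python task.
    # DAG id, schedule and the task body are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_elt_step():
        # Placeholder: in a real project this would call Snowflake,
        # e.g. via snowflake-connector-python as sketched earlier.
        print("running ELT step")

    with DAG(
        dag_id="daily_snowflake_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # "schedule" is the Airflow 2.4+ parameter name
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="run_elt_step",
            python_callable=run_elt_step,
        )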