Our Insights & Data practice delivers cutting-edge, data-centric solutions.
Most of our projects involve Cloud & Big Data engineering: we develop solutions that process large, often unstructured, datasets using dedicated cloud data services on AWS, Azure or GCP.
We are responsible for the full SDLC of each solution: in addition to using data processing tools (e.g., ETL), we code extensively in Python, Scala or Java and apply DevOps tools and best practices. The data is either exposed to downstream systems via APIs and outbound interfaces, or visualized in reports and dashboards.
Within our AI CoE we deliver Data Science and Machine Learning projects with a focus on NLP, Anomaly Detection and Computer Vision.
Additionally, we are exploring the areas of Quantum Computing and Generative AI, searching for practical growth opportunities for both us and our clients.
Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for more than 30 clients from different sectors (Financial Services, Logistics, Automotive, Telco and others).
Come on board! :)
Your daily tasks:
- Design and implementation of solutions processing large and unstructured datasets (Data Mesh, Data Lake or Streaming Architecture)
- Implementation, optimization and testing of modern DWH/Big Data solutions based on the AWS cloud platform and a Continuous Integration / Continuous Delivery environment
- Improving data processing efficiency and migrating solutions from on-prem to public cloud platforms.
Frequently used technologies:
AWS
Python/PySpark
Glue
Redshift
Snowflake
SQL
Terraform
Our expectations:
- At least 3 years of experience in Big Data or Cloud projects involving the processing and visualization of large and unstructured datasets (across different phases of the Software Development Life Cycle)
- Practical knowledge of the AWS cloud in the Storage, Compute (including Serverless), Networking and DevOps areas, supported by commercial project experience
- Theoretical AWS cloud knowledge supported by certificates (for example DVA-C01, SAA-C02, SAP-C01, VDS-C01, DAS-C01)
- Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark
- At least basic knowledge of one of these programming languages: Python, Scala, Java or Bash
- Very good command of English (German would be an advantage).
Our offer:
- permanent employment contract from the first day,
- hybrid, flexible working model,
- the option to apply increased tax-deductible costs for creative work,
- co-financing for home office equipment,
- development opportunities:
  - subject-matter support from project leaders,
  - a wide range of internal and external training (technical, language, leadership),
  - certification support in various areas,
  - mentoring and a real impact on shaping your career path,
  - access to a database of over 2,000 training courses on the Pluralsight, Coursera and Harvard platforms,
  - internal communities (including Agile, IoT, Digital, Security, Women@Capgemini),
  - the opportunity to participate in conferences both as a listener and as an expert,
- benefits as part of the social package (including: a subsidy for the Multisport card, medical care for the whole family, group insurance on preferential terms, and a cafeteria benefits system).