Our Insights & Data practice delivers cutting-edge, data-centric solutions.
Most of our projects involve Cloud & Big Data engineering. We develop solutions that process large, often unstructured, datasets using dedicated cloud data services on AWS, Azure or GCP.
We are responsible for the full SDLC of each solution: apart from using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and apply DevOps tools and best practices. The data is either exposed to downstream systems via APIs and outbound interfaces, or visualized in reports and dashboards.
Within our AI CoE we deliver Data Science and Machine Learning projects with a focus on NLP, Anomaly Detection and Computer Vision.
Additionally, we are exploring the area of Quantum Computing, searching for practical growth opportunities for both us and our clients.
Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for more than 30 clients from various sectors (Financial Services, Logistics, Automotive, Telco and others).
Come on board! :)
Your daily tasks:
- creating architecture for systems processing large and unstructured datasets (Data Lake Architecture, Streaming Architecture),
- designing modern DWH / Big Data solutions in the cloud and in the Continuous Delivery / Continuous Integration environment,
- designing data models and documenting data flows,
- supporting the Team as a Technical Leader, ensuring system compliance with standards and best practices in software development.
Our expectations:
- experience as an Architect and/or Technical Leader on Cloud or Big Data projects in the field of data processing and visualization (across different phases of the SDLC),
- practical knowledge of one of the following clouds: AWS, Azure or GCP in the areas of Storage, Compute (including Serverless), Networking and DevOps, backed by work on commercial projects,
- familiarity with at least a few of the following technologies: Cloud Dataflow, BigQuery, Cloud Dataproc, Cloud KMS, Pub/Sub, Google Kubernetes Engine, Apache Beam, Kafka, Spark, Snowflake, Data Lake Gen2, Event Hub, Data Factory, Databricks, Azure DWH, Azure Functions, Power BI, Terraform, Redshift, Glue, Athena, Kinesis, Lambda,
- at least a basic command of one of the following programming languages: Python, Scala, Java or Bash,
- good command of English (German would be an asset).
What we offer:
- an employment contract for an indefinite period from day one;
- hybrid, flexible work model;
- the possibility of increased tax-deductible costs for creative work;
- co-financing for equipping your home workspace;
- development opportunities, including:
- substantive support from leaders on projects,
- a wide range of internal and external training courses (technical, language, leadership),
- support in certification in various areas,
- mentoring and real influence in shaping your career path,
- access to a database of over 2,000 training courses on platforms such as Pluralsight, Coursera and Harvard,
- internal communities (e.g. Agile, IoT, Digital, Security, Women@Capgemini),
- the opportunity to participate in conferences both as a listener and as an expert,
- relocation package;
- benefits as part of the social package (e.g. Multisport card, medical care for the whole family, group insurance on preferential terms, cafeteria).