Capgemini FS - Business Unit (Fintech) is hiring! 

Department/project description:

Our Insights & Data practice delivers cutting-edge, data-centric solutions. Most of our projects involve Cloud & Big Data engineering. We develop solutions that process large, often unstructured, datasets using dedicated Cloud Data services on AWS, Azure or GCP. We are responsible for the full SDLC of the solution: apart from using data processing tools (e.g., ETL), we code a lot in Python, Scala or Java and apply DevOps tools and best practices. The data is either exposed to downstream systems via APIs and outbound interfaces or visualized in reports and dashboards. Within our AI CoE we deliver Data Science and Machine Learning projects with a focus on NLP, Anomaly Detection and Computer Vision.

Additionally, we are exploring the area of Quantum Computing, searching for practical growth opportunities for both ourselves and our clients.

Currently, over 250 of our Data Architects, Engineers and Scientists work on exciting projects for over 30 clients across different sectors (Financial Services, Logistics, Automotive, Telco and others).

Come on Board! :)

Your daily tasks:

  • creating architectures for systems that process large and unstructured datasets (Data Lake Architecture, Streaming Architecture),
  • designing modern DWH / Big Data solutions in the cloud and in a Continuous Integration / Continuous Delivery environment,
  • designing data models and documenting data flows,
  • supporting the team as a Technical Leader and ensuring system compliance with standards and best practices in software development.

Frequently used technologies:

  • Cloud Data Services: AWS / Azure / GCP
  • Databases: SQL / NoSQL
  • Programming: Python / Scala / Java
  • Cloud DWH: Snowflake / Redshift / Synapse

Our expectations:

  • experience as an Architect and/or Technical Leader on Cloud or Big Data projects in the field of data processing and visualization (across different phases of the SDLC);
  • practical knowledge of one of the following clouds: AWS, Azure or GCP, in the areas of Storage, Compute (including Serverless), Networking and DevOps, supported by work on commercial projects;
  • familiarity with at least a few of the following technologies: Cloud Dataflow, BigQuery, Cloud Dataproc, Cloud KMS, Pub/Sub, Google Kubernetes Engine, Apache Beam, Kafka, Spark, Snowflake, Data Lake Gen2, Event Hub, Data Factory, Databricks, Azure DWH, Azure Functions, Power BI, Terraform, Redshift, Glue, Athena, Kinesis, Lambda;
  • at least a basic command of one of the following programming languages: Python, Scala, Java or Bash;
  • good command of English.