What will you be doing with us?
- you will design and implement solutions for processing large, unstructured datasets (Data Lake and Streaming architectures);
- you will implement, optimize, and test modern DWH/Big Data solutions based on Google Cloud Platform in a Continuous Integration/Continuous Delivery environment;
- you will be responsible for improving data processing efficiency and for migrations from on-premises to public cloud platforms;
- you will build and supervise an internal GCP development and training program, as well as lead workshops and mentoring sessions.
We are looking for you if:
- you have at least 5 years of experience in Big Data or Cloud projects involving the processing and visualization of large, unstructured datasets (across different phases of the Software Development Life Cycle);
- you have practical knowledge of GCP in the Storage, Compute (including Serverless), Networking, and DevOps areas, backed by commercial project experience;
- you are familiar with several of the following services: BigQuery, Dataflow, GKE, Cloud Bigtable, Pub/Sub, Dataproc, Google Data Studio;
- you have good knowledge of at least one of the following programming languages: Scala (preferred), Python, or Java;
- you are organized, independent, and willing to share your knowledge;
- you have a very good command of English.