Job description
Company culture:
Capgemini has a predominantly collaborative culture that places people, trust, and teamwork at the core of its practices. A close management approach fosters guidance, empowerment, and skills development within a supportive environment. This culture is reinforced by strong organizational discipline, ensuring process rigor, reliability, and operational efficiency. It is complemented by a competitive component focused on performance and customer satisfaction, while a more moderate innovation dimension supports the continuous evolution of services and expertise.
Job:
Within our project teams, your main mission will be to design, develop, and deploy advanced Artificial Intelligence solutions, focusing on Machine Learning and Deep Learning models to meet the company's strategic needs.
As a GCP Data Engineer, you will take on responsibilities such as:
- Define and implement robust and scalable AI architectures.
- Develop supervised and unsupervised Machine Learning models (classification, regression, clustering).
- Optimize data pipelines for training and deploying models.
- Mentor and support technical teams on AI best practices.
- Stay updated on innovative frameworks and algorithms.
- Collaborate with Data Engineering and DevOps teams to integrate models into production.
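As a purely illustrative sketch of the supervised and unsupervised modeling mentioned above (the dataset and library choices here are assumptions for demonstration, not part of the role), a minimal scikit-learn example might look like:

```python
# Minimal sketch: one supervised classifier and one unsupervised
# clustering model, using scikit-learn's bundled Iris dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Supervised: classification (could equally be regression on a numeric target)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)

# Unsupervised: clustering into 3 groups
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

print(f"test accuracy: {accuracy:.2f}")
print(f"number of clusters found: {len(set(kmeans.labels_))}")
```

In practice, models like these would be tracked and deployed through the MLOps tooling listed in the required profile (e.g. MLflow or Kubeflow) rather than run as standalone scripts.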
Required profile:
- Master's degree (Bac+5) in computer science or an equivalent field.
- Proven experience (at least 5 years) in a similar role.
- Languages: Python (required), R, Java (optional).
- AI/ML Frameworks: TensorFlow, PyTorch, Scikit-learn.
- Big Data: Spark, Hadoop (desired).
- MLOps: MLflow, Kubeflow, Docker, Kubernetes.
- Databases: SQL, NoSQL.
- Cloud: AWS, Azure, or GCP (at least one).
- Advanced knowledge: NLP, Computer Vision, generative models (GANs, LLMs).