Company culture:
Sofrecom Maroc stands out for its strongly collaborative culture, where people, trust and employee well-being are central priorities. Close, supportive management fosters accountability and long-lasting teamwork. This collaborative foundation is complemented by a strong innovation dimension that encourages initiative, agility and experimentation. Structured processes ensure reliability and operational efficiency, while performance-driven practices play a supportive, balanced role.
Job:
Description of the mission and activities
In 6 keywords: #Cloud, #Data, #Architecture, #Automation, #AI, #Agility
In one sentence
Within INNOV/DATA-AI, you will analyze the Data and Big Data needs of the group's entities and BUs (DATA AI "clients"), propose adapted architecture solutions – mainly oriented towards GCP – and support client teams in their implementation and skills development.
In a few words, your activities
In collaboration with the Lead Tech DATA AI and the Security Director DATA AI, you will analyze Data and Big Data needs with an end-to-end vision: understanding business needs, proposing adapted architectures, participating in developments, and supporting client teams.
Your missions are part of an international context, in Cloud, hybrid, and/or on-premise environments.
What is expected
Analyze client needs, whether for migrating legacy Data architectures or for new use cases.
Define architectures aligned with operational and organizational needs (development, production, FinOps, processes, etc.).
Contribute to DATA AI projects by writing guidelines and developing blueprints.
Maintain ongoing technological monitoring and share it with the team.
Participate in project implementation and interact with various roles (project managers, release managers, data engineers, etc.).
Be proactive in proposing new tools or architectures.
Regularly communicate with our teams and clients in France and internationally.
Required profile:
Skills and qualities expected
Soft skills
Dare: be proactive in suggesting improvements, corrections, and new technologies (continuous improvement).
Cooperate and play as a team: be a good communicator and demonstrate pedagogical skills.
Be familiar with Agile development methods.
Be comfortable in English, both written and spoken.
Technical know-how
1. Cloud Architecture (GCP primarily):
Design and development of scalable, elastic, automated, reliable, and secure architectures.
Mastery of encryption and data protection aspects.
Good knowledge of GCP services (APIs, CLIs, Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, App Engine…).
Resource management (IAM, projects, networks, VPC, subnets, IP addresses, routes, firewall…).
2. Cloud Experience:
Participation in public cloud migration projects (IS, Data Lake, Data Warehouse…).
Knowledge of issues related to organization, user management (IAM), and billing.
Good mastery of serverless solutions (Cloud Functions, AWS Lambda) and Big Data PaaS (BigQuery, Azure DWH…).
3. Data & Big Data:
Mastery of concepts related to data lakes, data warehouses, database design, and modeling (star/snowflake schema).
Skills in data modeling for BI and Big Data.
Knowledge of on-premise data processing and manipulation tools (Spark, Spark Streaming, Hive…) and Cloud tools (Databricks, Dataflow…).
Mastery of data transfer solutions (Informatica, Talend, NiFi, Kafka, Event Hub, Data Factory, Pub/Sub…).
4. Infrastructure & DevOps:
Knowledge of OpenStack, Kubernetes/OpenShift, and Starburst environments.
Mastery of Infrastructure as Code tools (Terraform, Ansible, Helm, APIs…).
Good knowledge of CI/CD processes and Agile practices in general.
Skills in MLOps and scripting languages.
5. Cloud Security:
Knowledge of security standards and best practices in Cloud environments.
6. Additional assets:
Successful experience leading public cloud migration or implementation projects.
Knowledge of AWS or Azure is a plus.