Job function: Information Systems (IS)
As part of major new projects for the Group, addressing new challenges around data and the O+O Experience, the D2C Architecture Domain is looking to strengthen its team. This is an innovative, fast-growing field. You will work in a dynamic and diverse context offering a wide range of motivating responsibilities, varied missions and strategic projects.
We are looking for a Data Architect to join our growing team of Digital Architect experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Architect will support our software developers, database engineers, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Create and maintain optimal data pipeline architecture
Develop database solutions to store and retrieve company information
Analyze structural requirements for new software and applications
Design conceptual and logical data models and flowcharts
Improve system performance by conducting tests, troubleshooting and integrating new elements
Work with data engineers and analytics experts to strive for greater functionality in our data systems
Optimize new and current database systems
Define security and backup procedures
May require some travel in the EMEA zone
We are looking for a senior candidate with experience in a Data Architect role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field, and who has experience supporting and working with cross-functional teams in a dynamic environment.
Technical and professional skills required:
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
Experience with Google Cloud Platform (GCP): BigQuery
Experience with AWS cloud services: EC2, EMR, RDS, Redshift
Experience with stream-processing systems: Storm, Spark-Streaming, etc.
Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
Equivalent to Bac+4 or Bac+5 (general engineering degree, Master 1 or 2)