Imagine projects that make a difference.
About
Developer across all forms of data processing: from visualisation, to the design of data processing pipelines, through to machine learning algorithms (AI).
I've been working as an independent data engineer since 2018 for consulting firms, large companies, startups and SMEs.
I support my clients on their digital transformation journey thanks to the diversity of skills acquired over the course of my assignments: data processing (ETL), setting up data storage architectures in the cloud, and all types of data enhancement (visualisation, machine learning algorithms, automation).
Heavily involved in each assignment, I keep a close eye on the level of performance my clients expect.
I love the variety of encounters and challenges, and taking part in new technological projects with a positive impact. Thanks to my entrepreneurial activity and some of my clients, I've been able to help promote digital accessibility for people with disabilities.
I'm based in Montpellier, in the South of France, but my work takes me to Paris regularly and most of the time I work remotely. I'm always on the lookout for new technologies and topics, which I share via my blog.
Skills
Data extraction and transformation
- Development of ETL (Extract, Transform, Load) data processing pipelines in Python (a minimal sketch follows this list).
- Implementation of tools such as DBT (SQL query orchestration) and Airflow (Python ETL orchestration).
- Data extraction from third-party services via their APIs (such as Google, Slack or Teams) and from storage services (databases, FTP servers, file storage).
- Implementation of scraping scripts to extract data from websites.
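For illustration, here is a minimal sketch of such a pipeline: pull records from an API, clean them with pandas, and load them into a warehouse table. The endpoint, credentials and table name are made up.

    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    # Illustrative endpoint, credentials and table; real pipelines target client-specific sources.
    API_URL = "https://api.example.com/v1/orders"
    DB_URI = "postgresql://etl_user:secret@localhost:5432/warehouse"

    def extract() -> list[dict]:
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()

    def transform(records: list[dict]) -> pd.DataFrame:
        df = pd.DataFrame(records)
        df["created_at"] = pd.to_datetime(df["created_at"])  # normalise types
        return df.drop_duplicates(subset="id")  # avoid reloading the same rows

    def load(df: pd.DataFrame) -> None:
        engine = create_engine(DB_URI)
        df.to_sql("orders", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract()))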
Database configuration
- Definition with the client of the data schema, with the attributes and indicators relevant to the end use.
- Production deployment on cloud services, and securing the setup.
- Development of connectors to insert data (a short example closes this section).
- Writing of SQL queries.
Types of databases I'm used to working with:
Data warehouses (BigQuery), data lakes (S3 buckets, Google Cloud Storage or FTP), PostgreSQL (relational, SQL), MongoDB (document-oriented), InfluxDB (time series) and Neo4j (graph-oriented)
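As a small example of the connector and SQL work listed above, a hypothetical insertion script for PostgreSQL with psycopg2 (table and credentials are made up):

    import psycopg2

    # Hypothetical table; the real schema is defined with the client beforehand.
    conn = psycopg2.connect(host="localhost", dbname="analytics", user="app", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS daily_kpi (
                day        date PRIMARY KEY,
                visits     integer NOT NULL,
                conversion numeric(5, 4)
            )
        """)
        # Parameterised upsert: values are passed separately, never formatted into the SQL string.
        cur.execute(
            "INSERT INTO daily_kpi (day, visits, conversion) VALUES (%s, %s, %s) "
            "ON CONFLICT (day) DO UPDATE SET visits = EXCLUDED.visits, conversion = EXCLUDED.conversion",
            ("2024-01-01", 1520, 0.0312),
        )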
Making the most of existing data
- Dashboard creation (Data Studio, Tableau, PowerBI).
- Development of predictive algorithms (machine learning) in Python: clustering, neural networks and regressions, for example (a small illustration closes this section).
- Creation of automations using no-code tools such as Zapier, Make or N8N for well-defined use cases.
Business Intelligence tools I'm used to working with:
Google Data Studio, Dash, Grafana, Tableau Software and PowerBI
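As a small illustration of the machine-learning side, a customer segmentation by clustering on made-up data with scikit-learn:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Made-up customer features: [orders per month, average basket in euros]
    X = np.array([[2, 30], [3, 35], [20, 150], [22, 140], [1, 25], [19, 160]])

    # Scale first so both features weigh equally in the distance computation
    X_scaled = StandardScaler().fit_transform(X)
    model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_scaled)
    print(model.labels_)  # two segments emerge: occasional vs heavy buyers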
Web backend / API creation
- Development of API servers to expose data from a database, for example (a minimal sketch closes this section).
- Implementation of API authentication: OAuth 2.0, JWT tokens, third-party services such as Firebase, or API keys.
- Migration to the cloud with reverse proxies such as NGINX, Apache or Traefik to manage multiple microservices and HTTPS, among other things.
Web technologies I'm used to working with:
Flask in Python and Node.js in JavaScript.
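A minimal sketch of such a server in Flask, protected here by an API key (the key, route and data are illustrative):

    from flask import Flask, abort, jsonify, request

    app = Flask(__name__)
    API_KEY = "change-me"  # illustrative; in production the key lives in a secret store

    @app.before_request
    def check_key():
        # Reject any request that does not carry the expected key
        if request.headers.get("X-API-Key") != API_KEY:
            abort(401)

    @app.get("/products")
    def products():
        # A real deployment would query the database here
        return jsonify([{"id": 1, "name": "example"}])

    if __name__ == "__main__":
        app.run(port=8000)

Behind NGINX or Traefik, the same app would listen on an internal port, with HTTPS terminated at the proxy.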
Cloud Architecture
- Deploying services in the cloud via Docker.
- Network configuration of services.
- Security: firewall configuration, IP whitelisting, backups, mirroring, HTTPS, etc.
- Versioning with Git.
- Setting up tests.
Cloud services I'm used to working with:
Microsoft Azure, Google Cloud Platform (GCP), OVH, Amazon Web Services (AWS) and Oracle
Non-technical skills
- Definition of business models around data and of a data-usage strategy.
- Training in data manipulation tools and programming languages.
- Making technical subjects accessible to non-specialists.
Client References

Client based in London, so the assignment was carried out in English. ETL development in Python on Airflow. Migration of hundreds of SQL queries (used for data transformation) to DBT. BigQuery data warehouse management and liaison with the business teams. Creation of dashboards on Looker Studio (formerly Google Data Studio).
- Senior Data Engineer, freelance, fully remote with regular travel to London and Paris.
- Maintenance and development of Airflow DAGs in Python to extract data from CRMs, backends, databases and several other systems (a skeleton DAG follows this list).
- Migration of hundreds of SQL queries to DBT and configuration of the environment.
- Creation of a synchronisation script between the data warehouse and the Oracle NetSuite ERP (accounting software).
- Development of SQL queries to expose data in Looker Studio (formerly Google Data Studio).
- Management of the data warehouse on BigQuery and of the data lake on Google Cloud Storage.
- Creation of dashboards and automated exports (with Apps Script on Google Sheets or Make) for the business teams.
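For illustration, the skeleton of an Airflow DAG of the kind maintained on this assignment (Airflow 2 syntax; DAG and task names are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_crm():
        print("extracting CRM data")  # placeholder for API calls, pagination, retries...

    def load_to_bigquery():
        print("loading into BigQuery")  # placeholder for the warehouse load

    with DAG(
        dag_id="crm_to_bigquery",  # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_crm", python_callable=extract_crm)
        load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
        extract >> load  # the load only runs once extraction has succeeded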

Creation of a dashboard to interpret the analysis data collected.
- Project for CNRS Innovation.
- Gathering of requirements and of the indicators to highlight, with the lab teams involved.
- Data formatting and connection to external data sources (via API) for enrichment.
- Creation of the platform in Python with the Dash library (a minimal page sketch follows this list).
- Physical deployment within the lab to guarantee access for those involved.
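A minimal Dash page of the kind the platform was built from (the data here is a placeholder, and a recent Dash version is assumed):

    import pandas as pd
    import plotly.express as px
    from dash import Dash, dcc, html

    # Placeholder data standing in for the lab's analysis results
    df = pd.DataFrame({"sample": ["A", "B", "C"], "value": [4.2, 7.1, 5.6]})

    app = Dash(__name__)
    app.layout = html.Div([
        html.H1("Lab results"),
        dcc.Graph(figure=px.bar(df, x="sample", y="value")),
    ])

    if __name__ == "__main__":
        app.run(debug=True)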

Supporting the departmental tourism committee in putting an Open Data platform online. Creation of data extractions and transformations from the agency's partners. Development of a bespoke CRM for project management.
- Collection of data requirements within the agency.
- Creation of extraction, transformation and load pipelines from tourism data sources in Python.
- Development of a bespoke project management tool (CRM) in React.js, with a backend API interfacing with a MongoDB database.
- Configuration of the Open Data platform using the OpenDataSoft tool and connection to the various data sources.

Development of software simulating pedestrian flows in waste collection centres in order to optimise skip positioning and construction.
Duration: 6 months, part-time
- Simulation software in Java
- Statistical data analysis in Python

Development of data transformation and load (ETL) pipelines from the Slack and Google APIs to PostgreSQL databases on Azure. Development of attention-prediction models based on metadata from messages, emails, calendars and video calls.
- Creation of the start-up's first technical architecture and tech strategy.
- Collection of requirements to draw up a data schema, then creation of the databases.
- Extraction of Slack and Google data via their APIs in order to recover all the metadata of messages, emails, calendars and video calls (an extraction sketch follows this list). Creation of data transformation pipelines in Python. Configuration of OAuth2.
- Putting the architecture into production on the Azure cloud, and comparative study of cloud offerings. Configuration of visualisation (Redash), automation (N8N) and data management (NocoDB) tools on the servers. Configuration with Docker.
- Securing the architecture: setting up automatic mirroring and backups on the databases, database encryption, IP whitelisting, firewall configuration.
- Development of algorithms to measure user attention based on metadata.
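As an illustration of the extraction step, message metadata can be pulled with the official slack_sdk client; the token and channel ID below are placeholders, and only metadata is kept, never the content:

    import os

    from slack_sdk import WebClient

    # Requires a bot token with the channels:history scope
    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    resp = client.conversations_history(channel="C0123456789", limit=200)  # placeholder channel ID
    for message in resp["messages"]:
        # Keep only who wrote when; the text itself is never stored
        print(message.get("user"), message["ts"])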

Implementation of a routine for collecting and transforming large volumes of XML files and inserting hundreds of millions of rows into a PostgreSQL database for subsequent use with Tableau.
- Development of collection and transformation scripts in Python and Bash (gigabytes of XML files to be formatted; a streaming sketch follows this list).
- Configuration of the PostgreSQL database in development and production environments (hundreds of millions of rows to be stored)
- Daily follow-up with the client and documentation writing.
- Deployment on OVH servers.
- Data visualisation with Tableau software
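A sketch of the streaming approach, assuming a hypothetical <record> layout: iterparse keeps memory flat on multi-gigabyte files, and PostgreSQL's COPY is far faster than row-by-row INSERTs at this volume.

    import csv
    import glob
    import io
    import xml.etree.ElementTree as ET

    import psycopg2

    conn = psycopg2.connect("dbname=warehouse user=etl")
    with conn, conn.cursor() as cur:
        for path in glob.glob("data/*.xml"):
            buf = io.StringIO()
            writer = csv.writer(buf)
            # Stream the file element by element instead of loading it whole
            for _, elem in ET.iterparse(path, events=("end",)):
                if elem.tag == "record":  # hypothetical element name
                    writer.writerow([elem.get("id"), elem.findtext("label"), elem.findtext("value")])
                    elem.clear()  # free memory as we go
            buf.seek(0)
            # Bulk-load the batch with COPY rather than individual INSERTs
            cur.copy_expert("COPY records (id, label, value) FROM STDIN WITH CSV", buf)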

Data science consulting for the Onepoint group. Development of machine learning prototypes and definition of Artificial Intelligence use cases for their clients.
- Creation of customer benchmark documents, technology watch and research into use cases.
- Development of prototypes in Python, often with a Web component, to arrive at a finished product.
- Natural language processing, computer vision and recommendation algorithms.
- Development of servers on the Microsoft Azure cloud and Raspberry Pi

Data science and AI consulting as a freelancer for the Big Four consulting firm Deloitte. Development of a collection and analysis architecture able to handle large volumes. The purpose was to analyse the automobile market for one of their clients.
- Implementation of a data collection architecture on Amazon Web Services.
- Creation of scraping scripts in Python with workarounds and proxies to change IP addresses automatically.
- Analysis of the large volumes of data collected
- Drafting of a report on our client's competitors based on the statistics obtained.

TV programme rating website for the deaf and hard of hearing. Creation of a backend server with an API interface for Unanimes, in partnership with Microsoft and Bakhtech.
- Backend written in Node.js with a REST API
- MongoDB database
- Deployment on Azure cloud via Git with a Docker architecture

Creation of Webhook/API interfaces to collect prospect data from Facebook/Instagram ads into a CRM (Customer Relationship Management) database.
Duration: 2 weeks
- REST API server with Flask (a short sketch follows this list)
- Webhook and Facebook API configuration
- Use of JSON forms
- Facebook for Business interface to manage the ads
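For illustration, the verification handshake and lead reception could look like this in Flask (the verify token is a placeholder; Meta sends a one-time GET to confirm the endpoint before POSTing leads):

    from flask import Flask, request

    app = Flask(__name__)
    VERIFY_TOKEN = "change-me"  # placeholder; must match the token set in the Meta app dashboard

    @app.route("/webhook", methods=["GET"])
    def verify():
        # Meta's one-time endpoint verification: echo back hub.challenge
        if request.args.get("hub.verify_token") == VERIFY_TOKEN:
            return request.args.get("hub.challenge", ""), 200
        return "forbidden", 403

    @app.route("/webhook", methods=["POST"])
    def receive():
        payload = request.get_json(force=True)
        # The real project wrote the leadgen fields into the CRM database here
        print(payload)
        return "ok", 200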
Experience and Education

Independent
IT services for all types of companies
Developing projects for processing and adding value to my clients' data
See skills
See references
Start-up specialised in transport for people with reduced mobility.
- Implementing the strategy with my partners, who develop the commercial side and collect user requirements.
- Creation of the company from a legal point of view, management of accounting and administration.
Development of the information system and services:
- Development of the architecture for collecting, transforming and storing cartographic (GIS) data, exposed via REST and GraphQL APIs. Deployment on two cloud providers. (Architecture and backend)
- Programming of three websites with React.js and jQuery, for clients and for in-house data management tools. (Visualisation tools)

DELOITTE Consulting
Analytics & Information Management (AIM) business unit, which develops and implements Artificial Intelligence (AI) projects.
- Development of machine learning algorithms on text (Natural Language Processing) for automatic mail classification.
- Programming of an automatic invoice-processing prototype combining image-to-text extraction algorithms (OCR) with named entity recognition (NER) algorithms applied to the extracted text.
- Client assignment involving the collection and analysis of large volumes of data in the automotive sector.
Start-up specialised in hyperspectral image capture from drones.
Development of a deep learning strategy for a pattern recognition application using a miniaturised hyperspectral sensor on a smartphone.
- Use of the Python package TensorFlow
- Containerising the data processing environment with Docker
- Drafting of a report on possible use cases for Neural Networks for the startup.
Grenoble, France
- Grande Ecole Diploma:
- Accounting and corporate governance
- Digital Marketing
- TOEIC : 915/990
- Final thesis : "The business impact of the use of data and the ethical issues involved".
Brest, France
- Engineering Master of Science in Data Science:
- Machine Learning
- Business Intelligence
- Big Data
- TOEFL : 620/677
Shanghai Jiao Tong University (SJTU), Shanghai, China
- 6-month university exchange
- Data Science / Big Data
Preparatory classes
Math-Physics (PSI*), Lycée Joffre, Montpellier, France
Awards
Andyamo - BFM TV | January 2020
Mines Télécom Institute certification
INRIA (French AI institute) Certification