
Imagine projects that make a difference.

About

Developer across all forms of data processing: from visualisation to the design of processing pipelines, through to machine learning algorithms (AI).

I've been working as an independent data engineer since 2018 for consulting firms, large companies, startups and SMEs.

I support my clients on their digital transformation trajectory thanks to the diversity of skills acquired over the course of my assignments: data processing (ETL), setting up data storage architectures on the Cloud, and all types of data enhancement (visualisation, machine learning algorithms, automation). As I'm heavily involved, I keep a close eye on the level of performance expected by my clients.

I love the variety of encounters and challenges, and I enjoy taking part in new technological projects that have a positive impact. Thanks to my entrepreneurial activity and some of my clients, I've been able to help promote accessibility and digital access for people with disabilities.

I'm based in Montpellier, in the South of France; my work takes me to Paris on a regular basis, and most of the time I work remotely. I'm always on the lookout for new technologies and topics, which I share via my Blog.

Don't hesitate to contact me if you would like to discuss your data, a project, or simply to get in touch.

Skills

Data extraction and transformation

  • Development of ETL (Extract, Transform, Load) data processing pipelines in Python (see the sketch below).
  • Implementation of tools such as DBT (SQL query orchestration) and Airflow (Python ETL orchestration).
  • Data extraction from third-party services via their APIs (such as Google, Slack or Teams) and from storage services (databases, FTP servers, file storage).
  • Implementation of scraping scripts to extract data from websites.
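
To give a concrete idea of such a pipeline, here is a minimal sketch in Python: extract from a REST API, transform with pandas, load into PostgreSQL. The endpoint, the column names and the connection string are purely illustrative assumptions, not taken from a real client project.

```python
# Minimal ETL sketch: extract JSON from a hypothetical REST API,
# transform it with pandas, then load it into a PostgreSQL table.
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint
DB_URI = "postgresql://user:password@localhost:5432/analytics"  # placeholder credentials


def extract() -> list[dict]:
    """Pull raw records from the source API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> pd.DataFrame:
    """Flatten, type and deduplicate the raw payload."""
    df = pd.json_normalize(records)
    df["created_at"] = pd.to_datetime(df["created_at"])  # assumed column
    df["amount"] = df["amount"].astype(float)             # assumed column
    return df.drop_duplicates(subset="id")


def load(df: pd.DataFrame) -> None:
    """Append the cleaned rows to the destination table."""
    engine = create_engine(DB_URI)
    df.to_sql("orders", engine, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract()))
```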

Database configuration

  • Definition with the client of the data schema, with the attributes and indicators relevant to the end use.
  • Production deployment on Cloud services and security configuration.
  • Development of connectors to insert data (see the sketch below).
  • Writing of SQL queries.

Types of databases I'm used to working with:


Data warehouses (BigQuery), data lakes (S3 buckets, Google Cloud Storage or FTP), PostgreSQL (relational, SQL), MongoDB (document-oriented), InfluxDB (time series) and Neo4j (graph-oriented).
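
As an illustration of the schema definition and connector work above, here is a minimal sketch with SQLAlchemy against PostgreSQL; the table, columns and connection string are hypothetical.

```python
# Sketch of a simple connector: define a schema and insert rows into PostgreSQL
# with SQLAlchemy. The table, columns and connection string are illustrative only.
from sqlalchemy import (Column, DateTime, Integer, MetaData, Numeric, String,
                        Table, create_engine)
from sqlalchemy.sql import func

engine = create_engine("postgresql://user:password@localhost:5432/analytics")  # placeholder
metadata = MetaData()

# Schema agreed with the client: one fact table of daily indicators (hypothetical).
daily_kpi = Table(
    "daily_kpi", metadata,
    Column("id", Integer, primary_key=True),
    Column("site", String(64), nullable=False),
    Column("visitors", Integer),
    Column("revenue", Numeric(12, 2)),
    Column("loaded_at", DateTime, server_default=func.now()),
)

metadata.create_all(engine)  # create the table if it does not exist yet


def insert_rows(rows: list[dict]) -> None:
    """Connector entry point: insert a batch of records in one transaction."""
    with engine.begin() as conn:
        conn.execute(daily_kpi.insert(), rows)


insert_rows([{"site": "store-fr", "visitors": 1240, "revenue": 8321.50}])
```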

Making the most of existing data

  • Dashboard creation (Data Studio, Tableau, PowerBI).
  • Development of predictive algorithms (Machine Learning) in Python: clustering, neural networks and regressions, for example (see the sketch below).
  • Creation of automations using NoCode tools such as Zapier, Make or N8N for well-defined use cases.

Business Intelligence tools I'm used to working with:


Google Data Studio, Dash, Grafana, Tableau Software and PowerBI.
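
On the predictive side, here is a minimal sketch of a regression model with scikit-learn, trained on synthetic data standing in for real client indicators; the features, model choice and metric are illustrative only.

```python
# Minimal sketch of a predictive model in Python with scikit-learn:
# a regression trained and evaluated on synthetic data.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic data standing in for real client indicators (e.g. traffic -> revenue).
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"MAE on held-out data: {mean_absolute_error(y_test, predictions):.2f}")
```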

Web Backend / API development

  • Development of API servers, for example to expose data from a database (see the sketch below).
  • Implementation of API authentication: OAuth 2.0, JWT tokens, third-party services such as Firebase, or API keys.
  • Migration to the cloud with reverse proxies such as NGINX, Apache or Traefik to manage multiple microservices, HTTPS and more.

Web frameworks and languages I'm used to working with:


Flask in Python and Node.JS in Javascript.
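
As a sketch of such an API server, here is a minimal Flask application exposing rows from a database behind a simple API key; the orders table, the key handling and the connection string are assumptions for illustration (a real deployment would typically use OAuth 2.0 or JWT).

```python
# Minimal sketch of an API server exposing database rows behind an API key.
# The orders table, the key and the connection string are illustrative only.
import os

from flask import Flask, abort, jsonify, request
from sqlalchemy import create_engine, text

app = Flask(__name__)
engine = create_engine(os.getenv("DB_URI", "postgresql://user:password@localhost:5432/analytics"))
API_KEY = os.getenv("API_KEY", "change-me")  # placeholder secret


@app.before_request
def check_api_key():
    """Very simple API-key check; a real project would use OAuth 2.0 or JWT."""
    if request.headers.get("X-API-Key") != API_KEY:
        abort(401)


@app.route("/orders")
def list_orders():
    """Expose the latest rows of a hypothetical orders table as JSON."""
    query = text("SELECT id, amount, created_at FROM orders ORDER BY created_at DESC LIMIT 100")
    with engine.connect() as conn:
        rows = conn.execute(query).mappings().all()
    return jsonify([dict(row) for row in rows])


if __name__ == "__main__":
    app.run(port=8000)
```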

Cloud Architecture

  • Deployment of services to Cloud providers via Docker.
  • Network configuration of services.
  • Security: e.g. configuration of firewalls, IP whitelists, backups, mirroring, HTTPS, etc.
  • Versioning with Git.
  • Setting up tests (see the sketch below).

Cloud services I'm used to working with:


Microsoft Azure, Google Cloud Platform (GCP), OVH, Amazon Web Services (AWS) and Oracle
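
To illustrate the testing point above, here is a minimal pytest sketch for a data transformation step; the transform() function and the sample payload are hypothetical.

```python
# Minimal pytest sketch for a transformation step; the transform() function
# and the sample payload are hypothetical, shown only to illustrate the approach.
import pandas as pd


def transform(records: list[dict]) -> pd.DataFrame:
    """Toy transformation: flatten, type and deduplicate incoming records."""
    df = pd.json_normalize(records)
    df["amount"] = df["amount"].astype(float)
    return df.drop_duplicates(subset="id")


def test_transform_deduplicates_and_types():
    records = [
        {"id": 1, "amount": "10.5"},
        {"id": 1, "amount": "10.5"},  # duplicate row that must be dropped
        {"id": 2, "amount": "3"},
    ]
    df = transform(records)
    assert len(df) == 2
    assert df["amount"].dtype == float
```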

Non-technical skills

  • Definition of business models around data and of a data usage strategy.
  • Training in data manipulation tools and programming languages.
  • Popularisation of technical subjects.

Client References

Screenshot of the deepreach.com website
DeepReach - Online Ads (AdTech)

Technical support for the design and improvement of advertising campaign management systems. Backend development assignment (serverless Python Flask on AWS and a PostgreSQL database) and data transformation (DBT, SQL and Airbyte).

Details | Duration: 4 months, full-time and fully remote. Team based in Paris.
    Senior Backend Developer / Data Engineer Freelance.
  • Technical support for the design and improvement of advertising campaign management systems.
  • Serverless environment on AWS in Python / SQL via Flask and PostgreSQL. Particular attention to code quality with unit and end-to-end tests and a robust CI/CD chain.
  • Creation of data pipelines in Python and via DBT. Set-up of CI/CD with Terraform and GitHub Actions to update certain extractions using Airbyte.
  • Maintenance and improvement of the backend for processing the company's financial data.
  • Implementation of tools to manage margins and financial indicators.

Go to the DeepReach website

Screenshot of the onefinestay.com website
Onefinestay (Accor Group) - Luxury hotels

Client based in London, so the assignment was carried out in English. ETL development in Python on Airflow. Migration of hundreds of SQL queries (used for data transformation) to DBT. BigQuery data warehouse management and liaison with the business teams. Creation of dashboards on Looker Studio (formerly Google Data Studio).

Details | Duration: 2 years, full-time and fully remote. Team based in London, UK.
    Freelance Senior Data Engineer, fully remote with regular travel to London and Paris.
  • Maintenance and development of Airflow DAGs in Python to extract data from the CRM, backends, databases and several other systems (see the sketch below).
  • Migration of hundreds of SQL queries to DBT and configuration of the environment.
  • Creation of a synchronisation script between the data warehouse and the Oracle NetSuite ERP (accounting software).
  • Development of SQL queries to expose data in Looker Studio (formerly Google Data Studio).
  • Management of the BigQuery data warehouse and the Google Cloud Storage data lake.
  • Creation of dashboards and automated exports (with Apps Script on Google Sheets or Make) for the business teams.
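
As a sketch of what such an Airflow DAG can look like (assuming Airflow 2.4+ and the TaskFlow API), here is a minimal daily extract-and-load job; the CRM source and the loading step are stubbed out for illustration.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+, TaskFlow API) for a daily
# extract-and-load job; the CRM source and the target warehouse are stubbed out.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2023, 1, 1), catchup=False, tags=["sketch"])
def daily_crm_sync():

    @task
    def extract() -> list:
        # A real DAG would call the CRM API here; this returns a stub payload.
        return [{"id": 1, "email": "jane@example.com"}]

    @task
    def load(records: list) -> None:
        # A real DAG would write to BigQuery or PostgreSQL via a provider hook.
        print(f"loading {len(records)} records")

    load(extract())


daily_crm_sync()
```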

Visit the Onefinestay website

Dashboard made for CNRS
CNRS - Laboratory ChimEco (MUSE)

Creation of a dashboard to interpret the analysis data collected.

Details
    Project for CNRS innovation
  • Gathering of requirements and of the indicators to highlight with the lab teams involved.
  • Data formatting and connection to external data sources (via API) for enrichment.
  • Creation of the platform in Python with the Dash library (see the sketch below).
  • On-premise deployment within the lab to guarantee access for those involved.
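
As a sketch of such a Dash application, here is a minimal app with one dropdown and one graph; the measurements.csv file and its columns are hypothetical stand-ins for the lab's data.

```python
# Minimal Dash sketch: one dropdown and one graph fed by a hypothetical
# CSV of lab measurements; file name and column names are illustrative only.
import pandas as pd
import plotly.express as px
from dash import Dash, Input, Output, dcc, html

df = pd.read_csv("measurements.csv")  # expected columns: sample, date, concentration

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Lab measurements"),
    dcc.Dropdown(options=sorted(df["sample"].unique()), value=df["sample"].iloc[0], id="sample"),
    dcc.Graph(id="timeseries"),
])


@app.callback(Output("timeseries", "figure"), Input("sample", "value"))
def update(sample):
    """Redraw the time series for the selected sample."""
    subset = df[df["sample"] == sample]
    return px.line(subset, x="date", y="concentration", title=sample)


if __name__ == "__main__":
    app.run(debug=True)
```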
CRM made for Département du Nord
Département du Nord (Northern France)

Supporting the departmental tourism committee in putting an Open Data platform online. Creation of data extractions and transformations from the agency's partners. Development of a bespoke CRM for project management.

Details | Duration: 1 year and a half, part-time.
  • Collection of data requirements within the agency.
  • Creation of extraction, transformation and load pipelines from tourism data sources in Python.
  • Development of a bespoke project management tool (CRM) in React.JS with an API in the backend to interface with a MongoDB database.
  • Configuration of the Open Data platform using the OpenDataSoft tool and connection to the various data sources.
Suez

Development of simulation software for pedestrian flows in waste collection centres, in order to optimise skip positioning and the construction of the centres.

Details

Duration: 6 months, part-time
  • Simulation software in Java
  • Statistical data analysis in Python
BEN startup website
BEN - HR Startup

Development of data transformation and load (ETL) pipelines from the Slack and Google APIs to PostgreSQL databases on Azure. Development of attention prediction models based on metadata from messages, emails, calendars and video conferences.

Details | Duration: 6 months, part-time.
    Creation of the start-up's first technical architecture and tech strategy.
  • Collection of requirements to draw up a data schema, then creation of the databases.
  • Extraction of Slack and Google data via their APIs in order to recover all the metadata of messages, emails, calendars and video calls (see the sketch below). Creation of data transformation pipelines in Python. Configuration of OAuth2.
  • Deployment of the architecture to production on the Azure Cloud and comparative study of cloud offerings. Configuration of visualisation (Redash), automation (N8N) and data management (NocoDB) tools on the servers. Configuration with Docker.
  • Securing the architecture: setting up automatic mirroring and backups on the databases, database encryption, IP whitelisting, firewall configuration.
  • Development of algorithms to measure user attention based on metadata.
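
As a sketch of the metadata extraction described above, here is a minimal example with the official slack_sdk client; the bot token and channel id are placeholders, and only metadata is collected, never message content.

```python
# Sketch of metadata extraction from Slack using the official slack_sdk client;
# only message metadata (author, timestamps, channel) is kept, never the content.
import os

from slack_sdk import WebClient

client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])  # OAuth2 bot token (placeholder)


def channel_message_metadata(channel_id: str) -> list[dict]:
    """Return lightweight metadata for the recent messages of one channel."""
    metadata = []
    cursor = None
    while True:
        response = client.conversations_history(channel=channel_id, cursor=cursor, limit=200)
        for message in response["messages"]:
            metadata.append({
                "channel": channel_id,
                "user": message.get("user"),
                "ts": message["ts"],
                "has_thread": "thread_ts" in message,
            })
        cursor = response.get("response_metadata", {}).get("next_cursor")
        if not cursor:
            return metadata


print(len(channel_message_metadata("C0123456789")))  # hypothetical channel id
```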
XML data transformation
Budaviz - Financial Startup

Implementation of a routine for collecting and transforming large volumes of XML files and loading hundreds of millions of rows into a PostgreSQL database for subsequent use with Tableau software.

Details
  • Development of collection and transformation scripts in Python and Bash (gigabytes of XML files to be formatted; see the sketch below).
  • Configuration of the PostgreSQL database in development and production environments (hundreds of millions of rows to be stored).
  • Daily follow-up with the client and documentation writing.
  • Deployment on OVH servers.
  • Data visualisation with Tableau software
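
As a sketch of how gigabytes of XML can be processed with a flat memory footprint, here is a minimal example with lxml's iterparse; the record tag and field names are illustrative only.

```python
# Sketch of streaming a very large XML file with lxml.etree.iterparse so that
# memory stays flat; the tag and field names are illustrative only.
from lxml import etree


def stream_records(path: str):
    """Yield one dict per <record> element, freeing memory as we go."""
    for _event, element in etree.iterparse(path, events=("end",), tag="record"):
        yield {
            "id": element.findtext("id"),
            "value": element.findtext("value"),
        }
        # Clear the processed element (and its preceding siblings) to keep memory low.
        element.clear()
        while element.getprevious() is not None:
            del element.getparent()[0]


if __name__ == "__main__":
    count = sum(1 for _ in stream_records("dump.xml"))  # hypothetical file
    print(f"{count} records parsed")
```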
OnePoint - Consulting

Data science consulting for Onepoint group. Development of Machine Learning prototypes and definition of Artificial Intelligence use cases for their clients.

Details | Duration: 8 months, part-time freelance, working 100% from home.
  • Creation of customer benchmark documents, technology watch and research into use cases.
  • Development of prototypes in Python, often with a Web component, to arrive at a finished product.
  • Natural Language Processing, Computer Vision and recommendation algorithms.
  • Development of servers on the Microsoft Azure cloud and Raspberry Pi
Deloitte - Consulting

Consulting in data science and AI as a freelancer for the Big Four consulting firm Deloitte. Development of a collection and analysis architecture able to handle large volumes. The purpose was to analyse the automobile market for one of their clients.

Details | Duration: 1 month, full-time freelance, working 100% from home.
  • Implementation of a data collection architecture on Amazon Web Services
  • Creation of scraping scripts in Python with workarounds and proxies to change IP addresses automatically (see the sketch below).
  • Analysis of the large volumes of data collected
  • Drafting of a report on our client's competition based on the statistics obtained
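
As a sketch of scraping with rotating proxies, here is a minimal example with requests and BeautifulSoup; the proxy endpoints, target URL and CSS selector are placeholders.

```python
# Sketch of a scraping loop that rotates through a pool of placeholder proxies
# so that requests do not all originate from the same IP; URLs are illustrative.
import random
import time

import requests
from bs4 import BeautifulSoup

PROXIES = [
    "http://proxy-1.example.com:8080",  # placeholder proxy endpoints
    "http://proxy-2.example.com:8080",
]
HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; data-collection-bot)"}


def fetch_listing(url: str) -> list[str]:
    """Download one listing page through a random proxy and return item titles."""
    proxy = random.choice(PROXIES)
    response = requests.get(url, headers=HEADERS, proxies={"http": proxy, "https": proxy}, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [node.get_text(strip=True) for node in soup.select("h2.listing-title")]  # hypothetical selector


if __name__ == "__main__":
    for page in range(1, 4):
        titles = fetch_listing(f"https://cars.example.com/listings?page={page}")
        print(page, len(titles))
        time.sleep(2)  # polite crawl delay
```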
Screenshot of the CommonTV website
Partnership Unanimes x Microsoft

TV programme rating website for the deaf and hard of hearing. Creation of a backend server with an API interface for Unanimes, in partnership with Microsoft and Bakhtech.

Details
  • Back-End written in Node.JS with REST API
  • MongoDB database
  • Deployment on Azure cloud via Git with a Docker architecture

Visit website

ETL pipeline
Anacrouse

Creation of Webhook/API interfaces to collect prospect data from Facebook/Instagram ads into a CRM (Customer Relationship Management) database.

Details

Duration: 2 weeks
  • API REST server with Flask
  • Webhook and Facebook API configuration
  • Use of JSON forms
  • Facebook for Business Interface to manage ads

Awards

Review from Guillaume Fontana

CTO at Onefinestay (Accor Group)

Florian has been working with us for 2 years. Initially, it was for a 6-month contract, consisting of migrating our data pipelines to work with the new systems we were implementing at the time. Florian did very well, understood the business concepts quickly, and our tech environment as well, meaning that the job was achieved on time, although it was not very easy with the amount of legacy transformation queries we had at the time.

Since everything was going well, and we had growing internal needs, I decided to continue working with Florian. He continued to clean up our legacy systems, and he migrated our whole ELT processes into DBT. He prepared training slides and trained the teams internally. The job was achieved in less than 2 months. This greatly improved our ways of working, and thus the internal teams' satisfaction. Florian also worked on pushing our financial data coming from our backoffices to NetSuite. This was a very complex project, with multiple NetSuite customizations needed. Florian never gave up, even though it was sometimes really difficult to get all the required information.

Florian worked on many other structural and strategic projects during these 2 years, and he never let me down. The only reason we had to stop working together is that we wanted to internalize the position, but Florian wanted to remain a contractor, which I can fully understand. Florian is a true gem: he can work on very complex projects and get to the bottom of them, without any doubt. He is also a very nice person, which is always a nice addition :) We will miss him, for sure.

Review from Anne Marie

Manager at Budaviz

On top of being very pleasant to work with, Florian is conscientious, rigorous and attentive, while also being full of good suggestions. Florian very quickly understood my needs, namely setting up a database bringing together certain public data that I wanted to use with Tableau software. He wrote several scripts to automate the daily update of the database and did everything needed to then install it on a server. He helped me choose a hosting plan for the server. The work delivered is very clean and well documented. A real talent whom I fully recommend.

Review from Jean-Marc

Founder of Finance BI

A smooth and efficient experience with Florian, I recommend him!

Interview Going Freelance

Freelance Media | April 2019


Andyamo - Les Echos

PR | November 2019

Andyamo - Le Parisien

PR | June 2019

Andyamo - BFM

TV | January 2020

Principles of data networks

Institut Mines-Télécom certification

Semantic Web and Web of data

Inria (French national research institute for digital science) certification

Experience and Education


Independent

Data Engineer Freelance

IT services for all types of companies

Developing projects for processing and adding value to my clients' data


See skills
See references
September 2018 - Today | Full-remote / Montpellier, France | ETL, Databases, Data Science, AI
Co-Founder and Chief Technical Director

Start-up specialised in transport for people with reduced mobility.

  • Implementing the strategy with my partners, who develop the commercial side and collect user requirements.
  • Creation of the company from a legal point of view, management of accounting and administration.

Development of the information system and services:
  • Development of the architecture for collecting, transforming and storing cartographic (GIS) data with opening via REST and GraphQL APIs. Deployment on 2 Cloud providers. (Architecture and Back-end)
  • Programming of 3 websites with React.JS and JQuery for clients and data management tools in-house. (visualisation tools)
April 2017 - December 2021 | Grenoble, France | Data Engineering, API, GIS, React.JS, Entrepreneurship

DELOITTE Consulting

Data & Analytics Consultant

Analytics & Information Management (AIM) business unit, which develops and implements Artificial Intelligence (AI) projects.

  • Development of machine learning algorithms on text (Natural Language Processing) for automatic mail classification.
  • Programming of an automatic invoice-processing prototype combining image-to-text extraction (OCR) with Named Entity Recognition (NER) algorithms applied to the extracted text.
  • Client assignment involving the collection and analysis of large volumes of data in the automotive sector.
February 2018 - October 2018 | Paris, France | Data Science, Machine Learning, Scraping, Consulting
Deep Learning Internship

Start-up specialised in hyper-spectral image capture from drones

Development of a deep learning strategy for a pattern recognition application using a miniaturised hyperspectral sensor on a smartphone.

  • Use of the Python package Tensorflow
  • Virtualising the data processing environment with Docker
  • Drafting of a report on possible use cases for Neural Networks for the startup.
February 2017 - March 2017 | Brest, France | Deep Learning, Python, TensorFlow, Neural Networks

Grenoble Business School

(GEM)

Grenoble, France

    Grande Ecole Diploma:
  • Accounting and corporate governance
  • Digital Marketing
  • TOEIC: 915/990
  • Final thesis : "The business impact of the use of data and the ethical issues involved".

IMT Atlantique

(ex Télécom Bretagne, Mines-Ponts schools)

Brest, France

    Engineering Master of Science in Data Science:
  • Machine Learning
  • Business Intelligence
  • Big Data
  • TOEFL: 620/677

Shanghai Jiao Tong University

(SJTU)

Shanghai, China

  • 6-month university exchange
  • Data Science / Big Data

Preparatory classes

Math-Physics (PSI*)

High school Joffre, Montpellier, France

Contact

Blog