PhD graduate
Team : MLIA
Departure date : 09/30/2017

Supervision : Matthieu CORD

Co-supervision : Nicolas THOME

Gaze-Based Weakly Supervised Localization for Image Classification: Application to Visual Recognition in a Food Dataset

In this dissertation, we discuss how human gaze data can be used to improve the performance of weakly supervised learning models for image classification. This topic arises in an era of rapidly growing information technology, in which the amount of data to analyze is also growing dramatically. Since the amount of data that humans can annotate cannot keep up with the amount of data itself, current well-developed supervised learning approaches may face bottlenecks in the future. In this context, high-performance learning methods that rely on weak annotations are worth studying. Specifically, we address the problem from two angles. The first is to propose a less time-consuming annotation, human eye-tracking gaze, as an alternative to traditional time-consuming annotations such as bounding boxes. The second is to integrate gaze annotations into a weakly supervised learning scheme for image classification, in which the gaze annotations help infer the regions containing the target object. A useful property of our model is that it exploits gaze only during training, while the test phase is gaze-free; this further reduces the demand for annotations. These two aspects are combined in our models, which achieve competitive experimental results.

Defence : 09/29/2017 - 10h30

Jury members :

M. Patrick Le Callet, Université de Nantes/Polytech Nantes [Rapporteur]
M. Philippe-Henri Gosselin, Université de Cergy-Pontoise/ENSEA [Rapporteur]
Mme Catherine Achard, Université Pierre et Marie Curie
M. Chaohui Wang, Université Paris-Est Marne-la-Vallée
M. Frédéric Precioso, Université Nice Sophia Antipolis
M. Nicolas Thome, Conservatoire National des Arts et Métiers
M. Matthieu Cord, Université Pierre et Marie Curie

2015-2019 Publications