
Projects

Team : ALSOC

  • TSAR - Tera-Scale ARchitecture

    Project leader : Alain Greiner
    09/27/2017
    More details here

Team : BD

  • EPIQUE - Reconstructing the long-term evolution of science through large-scale analysis of science productions

    The evolution of scientific knowledge is directly related to the history of humanity. Document archives and bibliographic sources like the “Web Of Science” or PubMed are a valuable resource for the analysis and reconstruction of this evolution. The proposed project takes as its starting point the contributions of D. Chavalarias and J.-P. Cointet on the analysis of the dynamics of evolving corpora and the automatic construction of “phylomemetic” topic lattices (by analogy with the genealogical trees of natural species). Currently existing tools are limited to medium-sized collections and non-interactive use. The expected project outcome is situated at the crossroads of computer science and the social sciences. Our goal is to develop new high-performance tools for building phylomemetic maps of science by exploiting recent technologies for parallelizing tasks and algorithms over complex and voluminous data. These tools are designed and validated, in collaboration with experts in the philosophy and history of science, on large scientific archives. Keywords: quantitative epistemology, evolution of science, topic detection, temporal alignment, big data processing, data science, big data

    Project leader : Bernd AMANN
    01/01/2017
    More details here
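    The phylomemetic lattices mentioned above link topics across time periods. As a toy sketch of such temporal alignment (an illustration only, not the project's actual pipeline; topic names, terms and the threshold are invented), topics of consecutive periods can be connected when their term sets overlap enough:

```python
# Toy phylomemetic linking: topic names and term lists below are invented.

def jaccard(a, b):
    """Jaccard similarity between two term sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def link_topics(period1, period2, threshold=0.3):
    """Connect topics of consecutive periods whose term sets overlap enough."""
    return [(parent, child)
            for parent, terms_p in period1.items()
            for child, terms_c in period2.items()
            if jaccard(terms_p, terms_c) >= threshold]

t1990 = {"T1": ["neural", "network", "learning"], "T2": ["logic", "proof"]}
t2000 = {"T3": ["deep", "neural", "learning"], "T4": ["proof", "verification"]}
lineage = link_topics(t1990, t2000)   # T1 -> T3, T2 -> T4
```

    At scale, the project's challenge is doing this alignment over millions of documents and time slices, which is where the parallel processing mentioned above comes in.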
  • ANR EPIQUE

    Towards a quantitative epistemology - Reconstructing the evolution of science at large scale. The evolution of scientific knowledge is directly linked to the history of humanity. Documentary and bibliographic archives such as the “Web Of Science” (WoS) or PubMed are rich sources for the analysis and reconstruction of this evolution. The proposed research project starts from the work of D. Chavalarias and J.-P. Cointet on the analysis of the dynamics of evolving corpora and the automatic construction of “phylomemetic” topic lattices (by analogy with the genealogical tree of species). Current tools are limited to medium-sized corpora and non-interactive use. The expected contributions lie both in computer science and in the humanities and social sciences. Our goal is to develop efficient solutions for generating and interacting with phylomemetic maps, exploiting recent technological advances in the parallelization of tasks and algorithms over complex, voluminous data. These solutions are designed and validated in collaboration with experts in the philosophy and history of science on various large corpora. Keywords: quantitative epistemology, evolution of science, topic detection, temporal alignment, large-scale data processing, data science, big data, scientometrics

    Project leader : Bernd Amann
    10/01/2016
    More details here
  • experimaestro - Computer science experiment scheduler and manager

    Experimaestro is an experiment manager built around a server that provides a job scheduler (job dependencies, locking mechanisms) and a framework for describing experiments in JavaScript or Java.

    Project leader : Benjamin PIWOWARSKI
    01/01/2016
    More details here
  • SPARQL on Spark - SPARQL query processing with Apache Spark

    A common way to achieve scalability for processing SPARQL queries over large RDF data sets is to use map-reduce frameworks like Hadoop or Spark. Processing complex SPARQL queries that generate large join plans over distributed data partitions is a major challenge in these shared-nothing architectures. We are particularly interested in two representative distributed join algorithms, partitioned join and broadcast join, which are deployed in map-reduce frameworks for the evaluation of complex distributed graph pattern join plans. We compare five SPARQL graph pattern evaluation implementations on top of Apache Spark to illustrate the importance of carefully choosing the physical data storage layer and of being able to use both join algorithms to take existing predefined data partitionings into account. Our experiments with different SPARQL benchmarks over real-world and synthetic workloads show that hybrid join plans introduce more flexibility and can often achieve better performance than join plans using a single kind of join implementation.

    Project leader : Hubert NAACKE
    01/01/2015
    More details here
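    The two join strategies compared above can be sketched in miniature. The following pure-Python simulation (an illustration of the general technique, not the project's Spark implementation; the sample data is invented) contrasts a partitioned join, which shuffles both sides by hash of the join key, with a broadcast join, which ships the small side to every partition:

```python
# Illustrative sketch of the two distributed join strategies, simulated
# over in-memory partitions (not the actual Spark code of the project).

def partitioned_join(left, right, n):
    """Repartition both sides by hash of the join key, join per partition."""
    parts = [([], []) for _ in range(n)]
    for k, v in left:
        parts[hash(k) % n][0].append((k, v))
    for k, v in right:
        parts[hash(k) % n][1].append((k, v))
    out = []
    for l, r in parts:                      # each partition joins independently
        idx = {}
        for k, v in l:
            idx.setdefault(k, []).append(v)
        for k, w in r:
            out.extend((k, v, w) for v in idx.get(k, []))
    return out

def broadcast_join(left, right_small):
    """Ship the small side everywhere; the large side is never shuffled."""
    idx = {}
    for k, w in right_small:
        idx.setdefault(k, []).append(w)
    return [(k, v, w) for k, v in left for w in idx.get(k, [])]

triples = [("s1", "knows"), ("s2", "knows"), ("s1", "likes")]
names = [("s1", "Alice"), ("s2", "Bob")]
```

    Both produce the same result; the trade-off is shuffle cost versus broadcast cost, which is exactly what makes hybrid plans attractive.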
  • BOM - Block-o-Matic!

    Block-o-Matic is a web page segmentation algorithm based on a hybrid approach combining scanned-document segmentation and visual-based content segmentation. A web page is associated with three structures: the DOM tree, the content structure and the logical structure. The DOM tree represents the HTML elements of a page, the geometric structure organizes the content based on a category and its geometry, and the logical structure is the result of mapping the content structure onto the human-perceptible meaning that forms the blocks. The segmentation process is divided into three phases: analysis, understanding and reconstruction of a web page. An evaluation method is proposed to assess web page segmentations against a ground truth of 400 pages classified into 16 categories. A set of metrics based on the geometric properties of blocks is presented. Satisfactory results are obtained in comparison with other algorithms following the same approach.

    Project leader : Andrès SANOJA
    01/01/2012
    More details here
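    As a drastically simplified illustration of visual-based segmentation (a toy, not the BOM algorithm; the element list and gap threshold are invented), leaf elements with bounding boxes can be grouped into blocks wherever the vertical gap between consecutive elements exceeds a threshold:

```python
# Toy visual segmentation: split a page into blocks at large vertical gaps.

def segment(elements, gap=20):
    """elements: list of (tag, top, height); returns lists of tags per block."""
    blocks, current = [], []
    prev_bottom = None
    for tag, top, height in sorted(elements, key=lambda e: e[1]):
        if prev_bottom is not None and top - prev_bottom > gap:
            blocks.append(current)              # gap too large: close the block
            current = []
        current.append(tag)
        prev_bottom = top + height
    if current:
        blocks.append(current)
    return blocks

page = [("h1", 0, 30), ("p", 35, 50), ("h2", 150, 30), ("p", 185, 40)]
```

    The real algorithm combines this geometric view with the DOM tree and a category-driven logical mapping, as described above.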

Team : CIAN

  • https://www.lip6.fr/coriolis
    CORIOLIS - Platform for physical synthesis of integrated circuits

    Coriolis is an experimental integrated platform for the research, development and evaluation of new back-end VLSI design flows. Interconnect scaling to nanometer processes presents many difficult challenges to CAD flows. Current academic research on the back-end tends to address specific algorithmic issues separately, although one key issue is the cooperation of multiple algorithmic tools. CORIOLIS, our platform, is based on an integrated C++ database around which all tools consistently interact and collaborate. The platform currently includes a timing-driven global place-and-route flow.

    Project leader : Jean-Paul CHAPUT
    01/01/2004
    More details here
  • CAIRO - Analog IP Design

    Our purpose is to provide a language for designing generators of analog functions that can easily be ported to new sets of specifications and new technology processes. We are currently developing such a language, called CAIRO+. The CAIRO+ language supports the four steps of a design flow based on net-list and layout templates: net-list and layout template capture; design space exploration (managing electrical constraints); shape function computation (managing geometrical constraints); and layout generation (place and route). The language aims to help designers capture their knowledge, thus creating a library of layout-aware analog functions. It is based on the C++ language. CAIRO+ allows creating complex hierarchical analog function generators from existing generators of simpler functions. It is an answer to the problem of analog and mixed-signal IPs. As a demonstration of CAIRO+'s capabilities, we are developing analog-to-digital converters, especially Sigma-Delta converters.

    Project leader : Marie-Minerve LOUËRAT
    01/01/2004

Team : DECISION

  • http://www-desir.lip6.fr/~gonzales/research/lemon
    Lemon - library for easily modeling and operating on networks

    Lemon is a GUI toolkit written in C++ for manipulating graphical models. It relies on aGrUM both for graph theoretic algorithms and for computations within the graphical models (e.g., inference, learning).

    Project leader : Christophe GONZALES

    More details here
  • http://agrum.lip6.fr
    aGrUM - a Graphical Unified Model

    aGrUM is a C++ library designed for easily manipulating graphical models. Its range of applications is quite large as it is designed, e.g., for performing learning tasks (for instance, learning Bayes nets from data), planning tasks (FMDPs) and inference (Bayes nets, GAI-nets, influence diagrams).

    Project leader : Christophe GONZALES & Pierre-Henri WUILLEMIN

    More details here
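    What a library like aGrUM automates can be shown in miniature. The following pure-Python sketch (illustrative only; it uses none of aGrUM's API, and the probabilities are invented) performs exact inference on a two-node Bayes net by enumerating the joint distribution:

```python
# Toy Bayes-net inference by enumeration on a two-node net: rain -> wet.
# Probability tables below are invented for illustration.

p_rain = {True: 0.2, False: 0.8}
p_wet = {True: {True: 0.9, False: 0.1},     # p_wet[rain][wet]
         False: {True: 0.3, False: 0.7}}

def posterior_rain_given_wet():
    """P(rain | wet) via the joint distribution, then normalization."""
    joint = {r: p_rain[r] * p_wet[r][True] for r in (True, False)}
    z = sum(joint.values())
    return {r: joint[r] / z for r in joint}

post = posterior_rain_given_wet()           # P(rain | wet) ≈ 0.429
```

    Enumeration is exponential in the number of variables; aGrUM's value lies in efficient inference algorithms and learning over much larger models.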

Team : LFI

  • AXA Research Funds JRI

    Interpretability for human-friendly machine learning models

    Project leader : Christophe Marsala
    03/01/2017

Team : MOCAH

  • Adaptiv’Math

    Funded under the Partenariat d'Innovation Intelligence Artificielle (P2IA) of the French Ministry of National Education and led by the startup EvidenceB, this project involves companies (Nathan, Daesign, Schoolab, Isograd, BlueFrog), two laboratories (LIP6 and Inria Bordeaux), the APMEP (French association of mathematics teachers), as well as researchers in cognitive psychology (E. Sander) and neuroscience (A. Knopf). It aims to build a pedagogical assistant for Cycle 2 mathematics (grades CP, CE1, CE2) relying on AI algorithms and on a set of exercises designed from advances in cognitive science. We are working on an AI component that proposes groupings of pupils (clustering), learned over all classes from criteria of mastery of mathematical skills. This clustering is then applied class by class at regular intervals to give teachers a view of how their groups of pupils evolve, so as to facilitate differentiated teaching strategies.

    Project leader : François Bouchet
    10/01/2019
    More details here
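    The clustering component described above groups pupils by skill mastery. A minimal sketch of the idea (one invented mastery score per pupil and plain one-dimensional k-means; this is not the project's actual algorithm):

```python
# Toy grouping of pupils by a single mastery score with 1-D, 2-cluster k-means.

def kmeans_1d(scores, rounds=10):
    """Split scores into two groups; returns (centers, clusters)."""
    centers = [min(scores), max(scores)]          # k = 2 initialization
    clusters = [[], []]
    for _ in range(rounds):
        clusters = [[], []]
        for s in scores:
            nearest = min((0, 1), key=lambda i: abs(centers[i] - s))
            clusters[nearest].append(s)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d([0.1, 0.2, 0.15, 0.8, 0.9, 0.85])
```

    The project version works on multi-skill mastery vectors across all classes, then re-applies the learned grouping per class over time.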
  • SmashMedicine

    EIT Health Campus project SmashMedicine, led by Oxford, on the analysis of medical databases to help generate good distractors for multiple-choice practice questions for medical students.

    Project leader : François Bouchet
    09/01/2019
  • MindMath

    MindMath (funded by BPI France and the Île-de-France region) is a project to build a gamified, adaptive platform for learning mathematics in middle school. It involves several industrial partners (Cabrilog, Tralalere, Domoscio and Bayard) and academic partners (LDAR - Université Paris Diderot and LIP6 - Sorbonne Université). We develop algorithms that decide, based on pupils' activity within mathematical problem-solving tasks, which feedback is best suited to help them progress in their learning. The decision relies both on an ontology built with experts in mathematics education and on machine-learning approaches. The search for optimal feedback is performed with reinforcement learning, using a reward signal based on pupils' success in the activities. These proposals are tested in various school and extracurricular settings.

    Project leader : Amel Yessad
    01/01/2019
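    The reinforcement-learning idea behind feedback selection can be sketched with a toy bandit-style learner (the feedback types, success probabilities and reward scheme below are invented, not the MindMath model):

```python
# Toy value learning for feedback selection: estimate each feedback type's
# success rate from simulated pupil outcomes (all numbers invented).
import random

FEEDBACKS = ["hint", "worked_example", "encouragement"]

def train(episodes=3000, seed=1):
    """Return estimated value (success rate) per feedback type."""
    rng = random.Random(seed)
    success = {"hint": 0.6, "worked_example": 0.8, "encouragement": 0.3}
    q = {f: 0.0 for f in FEEDBACKS}
    n = {f: 0 for f in FEEDBACKS}
    for _ in range(episodes):
        f = rng.choice(FEEDBACKS)                 # explore uniformly
        reward = 1.0 if rng.random() < success[f] else 0.0
        n[f] += 1
        q[f] += (reward - q[f]) / n[f]            # incremental mean update
    return q

q = train()
```

    A full system would condition these values on the pupil's state within the task; this sketch keeps a single state for brevity.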
  • IECARE

    IECARE is a research project funded by the ANR. It aims to produce fundamental and operational knowledge about computer science, its teaching and its learning in compulsory education. This multidisciplinary project brings together researchers in computer science and in the humanities and social sciences (educational sciences, didactics, psychology of learning, sociology). The research follows three structuring themes: analyzing the representations and practices of teachers and pupils; modeling, designing and adapting pedagogical scenarios and resources to support teaching and learning practices; and studying the support frameworks set up by and for teachers and computer science educators.

    Project leader : Mathieu Muratet
    01/01/2019
  • MAGAM - Multi-Aspect Generic Adaptation Model

    MAGAM is a Multi-Aspect (didactic, pedagogic, affective and motivational, gaming, etc.) Generic Adaptation Model based on matrix calculation that aims to adapt learning activities.

    Project leader : Vanda LUENGO and Baptiste MONTERRAT
    03/01/2016
    More details here
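    A hypothetical illustration of matrix-based adaptation in MAGAM's spirit (the activity names, aspect vectors and profiles are invented, not MAGAM's actual model): each activity is scored as the dot product of its aspect vector with the learner's profile, and the best-scoring activity is proposed:

```python
# Invented aspect vectors: (didactic, motivational, gaming) weight per activity.
activities = {
    "drill": (0.9, 0.2, 0.1),
    "quiz":  (0.6, 0.5, 0.4),
    "game":  (0.2, 0.8, 0.9),
}

def best_activity(profile):
    """Return the activity whose aspect vector best matches the profile."""
    score = lambda vec: sum(w * x for w, x in zip(profile, vec))
    return max(activities, key=lambda name: score(activities[name]))
```

    Casting the aspects as a matrix and the learner as a vector is what lets such a model stay generic: adding an aspect is adding a column, not rewriting the adaptation logic.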
  • LEA4PA - LEarning Analytics for Adaptation and Personnalisation

    This project aims to build a platform that assists teachers in adapting learning activities. Multiple indicators (cognitive, pedagogical, temporal, etc.) will be inferred from activity traces recorded and generated, automatically or manually, from learner activities. Visualization systems will be proposed to assist teachers in their activity-adaptation process.

    Project leader : Vanda LUENGO and Amel YESSAD
    01/01/2016
  • RecoMOOC - Recommending people to people in MOOCs

    RecoMOOC aims at reducing attrition and improving the learning experience for MOOC learners through interaction with other students they can talk to and work with, identified automatically through a peer recommender system analyzing each learner's individual factors and progression.

    Project leader : François BOUCHET
    01/01/2014
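    A minimal sketch of peer recommendation from individual factors (an assumption about the general approach, not RecoMOOC's actual system; the learner names and factor vectors are invented): each learner is a vector of factors, and peers are ranked by cosine similarity:

```python
# Toy peer recommender: rank other learners by cosine similarity of profiles.
import math

def cosine(u, v):
    """Cosine similarity between two factor vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def recommend_peers(learner, others, k=2):
    """Rank the other learners by similarity to `learner`; keep the top k."""
    ranked = sorted(others, key=lambda name: cosine(learner, others[name]),
                    reverse=True)
    return ranked[:k]

profiles = {"ana": (0.9, 0.1, 0.5), "bob": (0.8, 0.2, 0.6),
            "eve": (0.1, 0.9, 0.2)}
```

    In practice the factors would include progression and engagement signals updated as the MOOC unfolds, not a static profile.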

Team : MoVe

  • PARDI

    PARameterized DIstributed Systems Verification

    Project leader : Pascal Poizat
    Partners : INPT/IRIT (leader), Inria/LORIA, UPS/LRI, UPMC/LIP6
    01/10/2016
  • http://pnml.lip6.fr
    PNML Framework

    PNML Framework is a prototype implementation of ISO/IEC 15909 part 2, the International Standard on the Petri Net Markup Language. The primary purpose of PNML is to enable interoperability among Petri net tools. PNML Framework has thus been designed to support the Standard. It enables Petri net tool developers to seamlessly integrate PNML support into their tools. It provides an extensive and comprehensible API to create, save, load and browse PNML models.

    Project leader : Fabrice KORDON
    04/01/2005
    More details here
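    The kind of document the standard defines can be sketched with the Python standard library alone (this is not PNML Framework's API, just a hand-rolled minimal place/transition net in PNML-style XML):

```python
# Build a minimal PNML-style document: one place, one transition, one arc.
import xml.etree.ElementTree as ET

NS = "http://www.pnml.org/version-2009/grammar/pnml"

def minimal_net():
    pnml = ET.Element("pnml", xmlns=NS)
    net = ET.SubElement(pnml, "net", id="n1",
                        type="http://www.pnml.org/version-2009/grammar/ptnet")
    page = ET.SubElement(net, "page", id="page1")
    ET.SubElement(page, "place", id="p1")
    ET.SubElement(page, "transition", id="t1")
    ET.SubElement(page, "arc", id="a1", source="p1", target="t1")
    return ET.tostring(pnml, encoding="unicode")

doc = minimal_net()
```

    A conforming framework adds markings, labels and tool-specific extensions on top of this skeleton, plus round-trip loading and browsing.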
  • https://www.lip6.fr/cpn-ami
    CPN-AMI

    CPN-AMI is a Petri-net-based CASE environment. It offers a set of services to perform specification, validation, formal verification and model checking, compute structural properties (invariants, traps, siphons, etc.), simulate, and generate code. These services have been implemented either by members of our team or by university partners (Technical University of Helsinki, University of Torino, Technical University of Munich, Bell Laboratories). The second generation of CPN-AMI, built on top of FrameKit, has been available on the Internet since March 1997.

    Project leader : Fabrice KORDON
    12/01/1994
    More details here
  • http://spot.lip6.fr/
    SPOT - Spot Produces Our Traces

    SPOT (Spot Produces Our Traces) is an easily extensible model-checking library. Unlike existing model checkers, whose mode of operation is fixed, SPOT provides building blocks that users can combine to assemble a model checker tailored to their own needs. Such modularity makes it easy to experiment with different combinations and eases the development of new algorithms. Moreover, the library is centered on a particular kind of automaton that expresses the properties to be verified more compactly and that had never been used in a tool before.

    Project leader : Denis POITRENAUD

    More details here

Team : NPA

  • F-Interop - Remote interoperability testing services for IoT devices

    Remote interoperability testing services for IoT devices

    Project leader : Serge FDIDA
    01/01/2016
    More details here

Team : PEQUAN

  • https://www.lip6.fr/cadna/
    CADNA - Control of Accuracy and Debugging for Numerical Application

    CADNA is a library for performing scientific computations while estimating and controlling the propagation of round-off errors.

    Project leader : Jean-Marie CHESNEAUX
    01/10/1992
    More details here
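    The idea behind CADNA's stochastic approach can be illustrated in miniature (this toy is not CADNA itself, and the digit estimate is deliberately crude): run a computation several times under tiny random input perturbations and estimate how many decimal digits of the result survive:

```python
# Toy stochastic estimate of reliable digits: perturb the input, compare runs.
import math
import random

def significant_digits(f, x, samples=10, eps=1e-14):
    """Rough count of reliable decimal digits of f(x) under input noise."""
    rng = random.Random(0)
    results = [f(x * (1 + rng.uniform(-eps, eps))) for _ in range(samples)]
    mean = sum(results) / len(results)
    spread = max(results) - min(results)
    if spread == 0.0:
        return 15                                  # all samples agree exactly
    if mean == 0.0:
        return 0
    return max(0, int(-math.log10(spread / abs(mean))))

# Expanded form of (x - 1)**7: catastrophic cancellation near x = 1.
unstable = lambda x: (x**7 - 7*x**6 + 21*x**5 - 35*x**4
                      + 35*x**3 - 21*x**2 + 7*x - 1)
```

    A stable expression like x*x keeps most of its digits, while the expanded polynomial near x = 1 loses nearly all of them; CADNA's stochastic arithmetic detects such losses systematically, operation by operation.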

Team : QI

  • Photonics Quantum Sampling Machine

    Randomness is a resource that enables applications such as efficient probabilistic algorithms, numerical integration, simulation, and optimization. In the last few years it was realized that quantum devices can generate probability distributions that are inaccessible by classical means. Hybrid quantum computational models combine classical processing with these quantum sampling machines to obtain a computational advantage in some tasks. Moreover, NISQ (Noisy, Intermediate-Scale Quantum) technology may suffice to obtain this advantage in the near term, long before we can build large-scale, universal quantum computers. PHOQUSING aims to implement PHOtonic Quantum SamplING machines based on large, reconfigurable interferometers with active feedback, and state-of-the-art photon sources based both on quantum dots and parametric down-conversion. We will survey the different architectures enabling the generation of these hard-to-sample distributions using integrated photonics, optimizing the designs and studying their tolerance to errors. We will build two quantum sampling machines with different technologies, as a way to perform cross-checks while exploiting the advantages of each platform. These machines will set a new state of the art in photonic reconfigurability, system complexity, and integration. Finally, we plan to perform the first proof-of-principle demonstrations of hybrid quantum computation applications in optimization, machine learning, and graph theory. The PHOQUSING team includes long-term scientific collaborators who were among the first to demonstrate quantum photonic samplers; two of the leading European start-ups in the relevant quantum technologies; and theoretical experts in photonics and quantum information science. This project will help establish photonics as a leading new quantum computational technology in Europe, addressing the science-to-technology transition towards a new industrial sector with a large foreseeable economic impact.

    Project leader : Elham Kashefi
    09/01/2020
  • QUANGO

    The secure exchange of sensitive information is of crucial importance in our society, as it protects money transfers, commercial transactions, medical data, and the remote control of critical infrastructures (power grid, telecom, etc.). On the security side, Quantum Key Distribution (QKD) is a protocol based on the laws of quantum mechanics that guarantees unconditional security in the sharing of random keys to be used for secure communication. On the communication side, 5G is the fifth generation of wireless communication, which will soon become the standard for virtually every major telecommunication service provider, as defined by the standards organization 3GPP (3rd Generation Partnership Project). Both QKD and 5G technologies require a satellite network for global coverage of their services. QUANGO will provide a new paradigm to share the satellite infrastructure required for secure communication based on quantum technologies and for 5G communication.

    Project leader : Eleni Diamanti
    07/30/2020
  • QENCLAVE

    CNRS acts as intermediary between the Région (funder) and Sorbonne Université for the funding of the PHD2 project "QEnclave". €100,000 is allocated to hiring a doctoral student, plus travel expenses. Management fees will be charged to the associated collaboration contract (C19/0811) concluded with the company Veriqloud. Recruitment of Yao MA - S19JRCN003.

    Project leader : Elham Kashefi
    07/03/2019