Projects [Archives]

Team : APR

  • MOPSA - Modular Open Platform for Static Analysis

    The Mopsa project aims at creating methods and tools to make computer software more reliable. Programming errors are pervasive, with consequences ranging from user frustration to huge economic or human losses. Traditional test-based methods are insufficient to eliminate all errors. The project will develop static analyses able to detect at compile time whole classes of program defects, leveraging the theory of abstract interpretation to design analyses that are approximate (to scale up to large programs) and sound (no defect is missed); a toy interval-domain sketch follows this entry. Static analysis has enjoyed recent successes: Astrée, an industrial analyzer, was able to prove the absence of run-time errors in Airbus software. But such results are limited to the specific, well-controlled context of critical embedded systems. We wish to bring static analysis to the next level: target larger, more complex and heterogeneous software, and make it usable by engineers to improve general-purpose software. We focus on analyzing open-source software, which is readily available, complex, widespread, and important from an economic standpoint (it is used in many infrastructures and companies) but also from societal and educational ones (promoting the development of verified software for and by citizens). A major target we consider is the set of technologies at the core of the Internet, to which static analysis could be applied to ensure a safer Internet. The scientific challenges we must overcome include designing scalable analyses that produce relevant information, supporting newly popular languages (such as Python), and analyzing properties better adapted to the continuous development style common in open source. At the core of the project is the construction of an open-source static analysis platform. It will serve not only to implement and evaluate the results of the project, but also to create momentum encouraging research in static analysis and hasten its adoption in open-source development communities.

    Project leader : Antoine MINÉ
    06/01/2016 to 05/2021
    More details here …
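
    As a toy illustration of the "approximate but sound" idea above, here is a minimal sketch of the classic interval abstract domain (illustrative only, unrelated to Mopsa's actual code base):

        class Interval:
            """Over-approximates a set of values by bounds [lo, hi]."""
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi

            def join(self, other):
                # Least upper bound: the result covers both intervals (soundness).
                return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

            def add(self, other):
                # Abstract addition: may over-approximate, never under-approximates.
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def __repr__(self):
                return f"[{self.lo}, {self.hi}]"

        # x in [0, 10]; after "if c: x = x + 1 else: x = x + 2" the analyzer
        # joins both branches, covering every concrete execution: x in [1, 12].
        x = Interval(0, 10)
        one, two = Interval(1, 1), Interval(2, 2)
        print(x.add(one).join(x.add(two)))  # [1, 12]
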
  • COVERIF - Combining abstract interpretation and constraint programming to verify critical properties of embedded programs with floating point computations

    Verifying the correctness and robustness of programs and systems is a major challenge in a society that relies more and more on safety-critical systems controlled by embedded software. This issue is even more critical when the computations involve floating-point arithmetic, an arithmetic known for its quite unusual behaviors, and which is increasingly used in embedded software. Consider for example the "catastrophic cancellation" phenomenon, where most of the significant digits of a result are cancelled, or numerical sequences whose limit is very different over the real numbers and over the floating-point numbers. An even more important problem arises when we want to analyse the relationship between floating-point computations and the "idealized" computation that would be carried out with real numbers, the reference in the design of the program. The point is that for some input values, the control flow over the real numbers can go through one conditional branch while it goes through another one over the floating-point numbers (a small illustration follows this entry). Certifying that a program, despite some control-flow divergences, computes what it is actually expected to compute with a minimal error is the subject of robustness or continuity analysis. Providing a set of techniques and tools for verifying the accuracy, correctness and robustness of critical embedded software is a major challenge. The aim of this project is to address this challenge by exploring new methods based on a tight collaboration between abstract interpretation (AI) and constraint programming (CP). In other words, the goal is to push the limits of these two techniques to improve accuracy analysis, to enable a more complete verification of programs using floating-point computations, and thus to make critical decisions more robust. The cornerstone of this project is the combination of the two approaches to increase the accuracy of robustness proofs by using CP techniques and, where appropriate, to generate non-robust test cases. The goal is to benefit from the strengths of both techniques: CP provides powerful but computationally expensive algorithms to reduce domains to an arbitrary given precision, whereas AI does not provide fine control over domain precision but has developed many abstract domains that quickly capture program invariants of various forms. Incorporating some CP mechanisms (search tree, heuristics) into abstract domains would make it possible, in the presence of false alarms, to refine the abstract domain for better accuracy. The first problem to solve is to set the theoretical foundations of an analyser based on two substantially different paradigms. Once the interactions between CP and AI are well formalized, the next issue is to handle constraints of general forms and potentially non-linear abstract domains. Last but not least, an important issue concerns the robustness analysis of systems more general than programs, such as the hybrid systems that model control-command programs. Research results will be evaluated on realistic benchmarks coming from industrial companies, in order to determine their benefits and relevance. For the explored approaches, using realistic examples is a key point, since the proposed techniques often behave acceptably only on given subclasses of problems (considering worst-case computational complexity, all these problems are intractable). This is why many solutions are closely connected to the target problems.

    Project leader : Eric GOUBAULT
    10/01/2015 to 09/2020
    More details here …
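
    A minimal illustration of the two floating-point phenomena described above, using Python floats and exact rationals as a stand-in for the idealized real-number semantics:

        from fractions import Fraction  # exact rationals stand in for the "ideal" reals

        # Catastrophic cancellation: subtracting nearly equal floats destroys
        # most of the significant digits of the result.
        a, b = 1.0000001, 1.0
        print(a - b)  # close to 1e-07, but only a few digits are meaningful

        # Control-flow divergence: the same test branches differently over
        # floating-point numbers and over the real numbers.
        x_float = 0.1 + 0.2                      # 0.30000000000000004 in binary64
        x_real = Fraction(1, 10) + Fraction(2, 10)
        print(x_float > 0.3)                     # True:  the float run takes one branch
        print(x_real > Fraction(3, 10))          # False: the ideal run takes the other
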
  • LCHIP - Low Cost High Integrity Platform

    The LCHIP project aims at greatly facilitating the development of safe applications with a high level of criticality by providing: - a complete development environment able to generate, and mathematically and automatically prove, software with limited algorithmic complexity, - a secure, low-cost platform for executing these applications, in order to guarantee a maximum level of safety. Transparent integration with domain-specific languages (DSLs) and third-party code generators allows painless deployment into existing development processes and supports exploitation of the technology outside the consortium.

    Project leader : Thierry LECOMTE
    10/01/2016 to 06/2020
    More details here …
  • GDRI-ALEA-NETWORK - ALEA-NETWORK

    ALEA-Network is a GDRI project funded by CNRS and co-signed by four European universities (Vienna, Stockholm, Oxford, Munich). The project gathers researchers from Europe and other countries around the world who are interested in the study of discrete random structures and come from various domains: computer science, discrete mathematics, probability, statistical physics, and bioinformatics. In this multidisciplinary context, the objective is to elaborate methods for quantifying randomness and to analyze statistical properties of fundamental combinatorial structures.

    Project leader : Michèle SORIA & Jean-François MARCKERT
    01/01/2015 to 01/2019
    More details here …
  • ASSUME - Affordable Safe And Secure Mobility Evolution

    Future mobility solutions will increasingly rely on smart components that continuously monitor the environment and assume more and more responsibility for convenient, safe and reliable operation. Currently, the single most important roadblock for this market is the lack of an affordable, safe multi-core development methodology that would allow industry to deliver trustworthy new functions at competitive prices. ASSUME provides a seamless engineering methodology to overcome this roadblock. The problem is addressed on both the constructive and the analytic side. For efficient construction and synthesis of embedded systems, the project provides new tools, standards and methodologies to cover most of the challenges by design. In addition, ASSUME provides a well-integrated, sound static analysis solution that makes it possible to prove the absence of problems even in a multi-core environment. New algorithms will be integrated into exploitable tools. New interoperability and requirements-formalization standards will facilitate cooperation between different market players. The ASSUME consortium includes leading European industry partners for mobility solutions, tool and service providers for embedded system development, as well as leading research institutes in static analysis for model-driven and traditional embedded systems development.

    Project leader : Udo GLEICH
    09/01/2015 to 08/2018
    More details here …
  • UCF - Ubiquitous Contents Framework

    Project leader : Jonathan RIVALAN
    10/01/2014 to 11/2017
    More details here …
  • MAGNUM - Méthodes Algorithmiques pour la Génération aléatoire Non Uniforme: Modèles et applications

    The central theme of the MAGNUM project is the elaboration of complex discrete models that are of broad applicability in several areas of computer science. A major motivation for the development of such models is the design and analysis of efficient algorithms dedicated to the simulation of large discrete systems and the random generation of large combinatorial structures (a small random-generation sketch follows this entry). Another important motivation is to revisit the area of average-case complexity theory from the angle of realistic data models. The project proposes to develop the general theory of complex discrete models, devise new algorithms for random generation and simulation, and bridge the gap between theoretical analyses and practically meaningful data models.

    Project leader : Michèle SORIA
    06/30/2010 to 06/2015
    More details here …
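
    A small sketch of uniform random generation of combinatorial structures, here binary trees drawn by the classic recursive method using Catalan numbers (illustrative only, not a MAGNUM deliverable):

        import random
        from functools import lru_cache

        @lru_cache(maxsize=None)
        def catalan(n):
            # Number of binary trees with n internal nodes.
            return 1 if n == 0 else sum(catalan(k) * catalan(n - 1 - k) for k in range(n))

        def random_tree(n):
            """Uniform random binary tree with n internal nodes."""
            if n == 0:
                return None
            # Choose the left-subtree size k with probability C_k * C_{n-1-k} / C_n.
            r = random.randrange(catalan(n))
            for k in range(n):
                r -= catalan(k) * catalan(n - 1 - k)
                if r < 0:
                    return (random_tree(k), random_tree(n - 1 - k))

        print(random_tree(5))  # one of the 42 size-5 trees, each equally likely
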
  • CERCLES2 - Compositional Certification of Critical and Safe embedded Softwares

    The CERCLES2 project focuses on the certification of safety-critical embedded software, with the aim of reducing the cost of this activity. The certification of safety-critical embedded software is a costly industrial activity that has to follow a very long, rigorous and standardized process. For instance, the DO-178C standard of civil avionics describes the set of specification (High Level Requirements), design (Architecture and Low Level Requirements), coding and verification activities that must be fulfilled to obtain certification. Modern software development relies on assembling components. In such a case, the certification standard, in particular because testing is required in the verification activity, leads to covering the same ground twice: a first time during the development process of the components and a second time when the components are reused in another system. The broad thrust of the CERCLES2 project is to capitalize on component certification in order to avoid this double cost. The problem is then the existence of trustworthy methods and tools for assembling components that are suitable for industry and capable of designing system architectures amenable to certification. Programming languages have long proposed methods and tools (typing, objects, modules, contracts) to increase the safety of reusing software components, but they did not address the specific case of safety-critical embedded software, for which a verification activity that relies on testing (and its coverage measurement) is absolutely required. We must ensure that if two components A and B have been previously certified, then we have a way to verify the assembly A-B (a toy contract-based illustration follows this entry). This entails that we are able to describe faithfully the units A and B as "components" offering some guaranteed functional properties and some possibility of reuse, and the composition A-B as an assembly that preserves the guarantees acquired by the components while realizing some other specified functionalities. On this basis we can expect to be able to establish both the preservation of correctness of the individually reused components and the correctness of the composition with regard to its own requirements.

    Project leader : Philippe BAUFRETON
    04/01/2011 to 09/2014
    More details here …
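
    A toy sketch of the compositional idea above, phrased with assume/guarantee sets (a deliberately simplified stand-in for the project's actual certification formalism): an assembly A-B is accepted when everything B assumes is guaranteed by A, so each component's prior certification is reused.

        class Component:
            """A certified component with its assumptions and guarantees."""
            def __init__(self, name, assumes, guarantees):
                self.name, self.assumes, self.guarantees = name, assumes, guarantees

        def compatible(a, b):
            """The assembly A-B is well-formed if all of B's assumptions
            are discharged by A's guarantees (no re-testing of A or B)."""
            return b.assumes <= a.guarantees

        A = Component("A", assumes={"input_in_range"},
                           guarantees={"output_sorted", "no_overflow"})
        B = Component("B", assumes={"output_sorted"},
                           guarantees={"result_valid"})
        print(compatible(A, B))  # True: B's assumption is covered by A
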
  • PWD - Programming the Diffuse Web

    The goal of the PWD project is to propose new programming languages for developing new applications on the Web.

    Project leader : Manuel SERRANO
    11/01/2009 to 10/2013
    More details here …
  • OPENGPU - OPENGPU

    GPUs are becoming an increasingly promising solution to address the growing need for computing power in numerical applications, new design methods, and numerical simulations related to research work. Since 2005, the end of frequency scaling in general-purpose CPUs has led to the multiplication of cores, making these solutions more difficult to use. Meanwhile, GPUs have evolved into less specialized architectures, making their use for computation and data processing feasible. This evolution came with the emergence of unified GPU architectures and, in 2008, with the finalization of the OpenCL standard, which offers an interesting way to exploit these architectures (a minimal OpenCL example follows this entry). Since both the computing power and the performance-to-energy-consumption ratio of GPUs are superior to those of standard CPUs, using GPUs is an opportunity to obtain more power at a lower energy cost. The Open GPU project takes advantage of this opportunity with a triple goal: - building an integrated and open platform of open-source tools to assist code parallelization, - testing the parallelization gains on industrial and academic applications, - designing the appropriate hardware and software architectures to exploit this new computing power and improve energy consumption. The Open GPU project is composed of three complementary parts expected to synergistically feed each other.

    Project leader : Eric MAHE
    01/01/2010 to 01/2012
    More details here …
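
    A minimal example of the kind of data parallelism the project targets, written against the pyopencl bindings for the OpenCL standard (assumes pyopencl and an OpenCL runtime are installed; illustrative, not project code):

        import numpy as np
        import pyopencl as cl

        # Host data: two large vectors to add on the GPU.
        a = np.random.rand(1_000_000).astype(np.float32)
        b = np.random.rand(1_000_000).astype(np.float32)

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

        # The kernel runs once per element, in parallel across the GPU cores.
        prg = cl.Program(ctx, """
        __kernel void add(__global const float *a,
                          __global const float *b,
                          __global float *out) {
            int i = get_global_id(0);
            out[i] = a[i] + b[i];
        }
        """).build()
        prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

        out = np.empty_like(a)
        cl.enqueue_copy(queue, out, out_buf)
        print(np.allclose(out, a + b))  # True
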

Team : BD

  • EPIQUE - Reconstructing the long terms evolution of science through large scale analysis of science productions

    The evolution of scientific knowledge is directly related to the history of humanity. Document archives and bibliographic sources like the "Web Of Science" or PubMed are a valuable source for the analysis and reconstruction of this evolution. The proposed project takes as its starting point the contributions of D. Chavalarias and J.-P. Cointet on the analysis of the dynamics of evolving corpora and the automatic construction of "phylomemetic" topic lattices (in analogy with the genealogical trees of natural species). Currently existing tools are limited to the processing of medium-sized collections and to non-interactive usage. The expected project outcome is situated at the crossroads between computer science and the social sciences. Our goal is to develop new high-performance tools for building phylomemetic maps of science by exploiting recent technologies for parallelizing tasks and algorithms over complex and voluminous data (a toy alignment sketch follows this entry). These tools are designed and validated in collaboration with experts in the philosophy and history of science over large scientific archives. Keywords: quantitative epistemology, evolution of science, topic detection, temporal alignment, large-scale data processing, data science, big data, scientometrics

    Project leader : Bernd AMANN
    01/01/2017 to 06/2021
    More details here …
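
    A hedged sketch of the core phylomemetic step: topics extracted per time period are linked across consecutive periods when their term sets overlap enough, yielding a lineage graph of topics (a simplified stand-in for the Chavalarias-Cointet method, with made-up topics):

        def jaccard(s, t):
            # Overlap between two topics, each represented as a set of terms.
            return len(s & t) / len(s | t)

        periods = [
            [{"neural", "network", "perceptron"}, {"logic", "proof", "resolution"}],
            [{"neural", "network", "deep"},       {"logic", "proof", "smt"}],
        ]

        threshold = 0.3
        for t0 in periods[0]:
            for t1 in periods[1]:
                if jaccard(t0, t1) >= threshold:
                    print(sorted(t0), "->", sorted(t1))  # a parent-child topic link
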
  • ROSES - Really Open, Simple and Efficient Syndication

    Social web portals (Twitter, Facebook), web blogs, online journals, etc. produce continuous streams of text items (news, messages). User subscriptions to these streams can be modeled as queries that continuously filter and rank new information items. A major challenge is then to efficiently process millions of such subscription queries (a small top-k sketch follows this entry). Within the ANR project ROSES (2008-2012) on RSS feed aggregation and filtering, we worked on multi-query optimisation (PhD of J. Creus), on efficient refresh strategies for dynamic RSS feeds (PhD of R. Horincar), and on continuous top-k query processing (PhD of N. Vouzoukidou).

    Project leader : Bernd AMANN
    01/01/2008 to 12/2015
    More details here …
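
    A minimal sketch of continuous top-k processing, where each subscription incrementally maintains its k best-scoring items with a min-heap (illustrative only, not the ROSES implementation; the term-overlap score is a placeholder for a real ranking function):

        import heapq

        def score(query_terms, item_terms):
            # Placeholder relevance score: number of shared terms.
            return len(query_terms & item_terms)

        class Subscription:
            def __init__(self, terms, k=3):
                self.terms, self.k = terms, k
                self.topk = []  # min-heap of (score, item_id)

            def on_item(self, item_id, item_terms):
                # O(log k) update per incoming stream item.
                s = score(self.terms, item_terms)
                if len(self.topk) < self.k:
                    heapq.heappush(self.topk, (s, item_id))
                elif s > self.topk[0][0]:
                    heapq.heapreplace(self.topk, (s, item_id))

        sub = Subscription({"python", "streams"})
        sub.on_item("n1", {"python", "release"})
        sub.on_item("n2", {"streams", "python", "tutorial"})
        print(sorted(sub.topk, reverse=True))  # [(2, 'n2'), (1, 'n1')]
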

Team : LFI

  • AXA Research Funds JRI

    Interpretability for human-friendly machine learning models

    Project leader : Christophe Marsala
    03/01/2017 to 12/2023

Team : MOCAH

  • MacCoy Critical - Learning Critical situations with enriched Virtual Environment

    This project aims at studying and improving training systems that use simulation and virtual environments in two domains with high public-health and safety stakes (medicine and driving). More specifically, it aims at deepening (a) knowledge about the non-technical skills involved in managing critical situations and (b) knowledge and tools for designing simulation and virtual-reality environments that foster learning. In order to promote more generic tools, the project chose to work on two learning domains (medicine and driving).

    Project leader : Vanda LUENGO
    01/01/2015 to 12/2018
    More details here …
  • NeoPraEval - New tools for new practices in evaluating and teaching mathematics

    NeoPraEval is an ANR-funded research project that aims at providing teachers with tools to deal with heterogeneity in learning, through automatic diagnostic and evaluation systems they can use in their classes, as well as appropriate resources to answer the needs identified among their students.

    Project leader : François BOUCHET
    02/01/2014 to 01/2018
    More details here …
  • https://www.lip6.fr/combien
    COMBIEN? - HOW MANY?

    The "COMBIEN?" software (meaning "HOW MANY?" in French) is a pedagogical system to help students learn combinatorics. The student learns to solve combinatorics problems using mathematical language and concepts. The system offers several interfaces, each corresponding to a class of problems and a solving method. Each interface offers a set of exercises. For each exercise, the student builds a solution, guided by an interface with appropriate tools and an incremental detection of errors. You can find the details on our site (only in French, sorry).

    Project leader : HĂ©lène GIROIRE
    01/01/1992 to 12/2016
    More details here …

Team : MoVe

  • PARDI

    PARameterized DIstributed Systems Verification

    Project leader : Pascal Poizat
    Partners : INPT/IRIT (leader), Inria/LORIA, UPS/LRI, UPMC/LIP6
    01/10/2016 to 12/2021
  • REVaMP

    Round-trip Engineering and Variability Management Platform and Process (French cluster)

    Project leader : Tewfik Ziadi
    07/26/2016 to 12/2020
    More details here …
  • https://github.com/AdaCore/PolyORB
    PolyORB - PolyORB: A schizophrenic middleware

    PolyORB is a joint project between LIP6 and Telecom Paris

    Project leader : Fabrice KORDON
    01/01/2001 to 01/2020
    More details here …
  • Monoge

    Next-Generation Modeling Workbench (Atelier de Modélisation de Nouvelle Génération)

    Project leader : Reda Bendraou
    03/18/2013 to 12/2017
  • MODELPLEX - MODELing solution for comPLEX software systems

    The European IST MODELPLEX project has three major objectives:
    * Objective A: Develop an open solution for complex systems engineering improving quality and productivity;
    * Objective B: Lead its industrialisation;
    * Objective C: Ensure its successful adoption by the industry.

    Project leader : Marie-Pierre GERVAIS
    09/01/2006 to 02/2010
    More details here …
  • Flex-eWare - Platform to produce customized execution infrastructures

    This is an ANR funded project.

    Project leader : Fabrice KORDON
    02/01/2007 to 01/2010
  • ModFact

    ModFact is an Open Source project for the MDA(tm). More precisely, our goal is to provide a framework for building applications in accordance with the MDA(tm) concepts. ModFact is composed of several projects: the repository project supports the generation of MOF repositories; the QVT project supports the execution of model transformations; and the model bus project enables the interoperability of MDA-related tools via the concept of an MDA service.

    Project leader : Xavier BLANC
    09/01/2002 to 12/2005
    More details here …

Team : NPA

  • DataTweet - A public service for opportunistic communications

    The DataTweet project proposes to explore the idea of a ubiquitous public data service for transmitting short messages, in a way similar to Twitter. Any user or source of the service will be able to send a short message at a very low rate to some destination address over various access networks: open 802.11 hotspots, LTE base stations, 802.11p car-to-infrastructure stations. The user or device sends its message to the nearest wireless network at hand (for instance an 802.11 hotspot), which forwards it in a multihop way to the destination through the elements of the infrastructure network (a toy forwarding sketch follows this entry).

    Project leader : Marcelo DIAS DE AMORIM
    03/01/2013 to 09/2017
    More details here …
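
    A toy sketch of the multihop forwarding idea, routing a message from an access point to its destination over a small made-up infrastructure graph (illustrative, not the project's actual protocol):

        from collections import deque

        # Hypothetical infrastructure topology: node -> reachable next hops.
        links = {
            "hotspot": ["router1"],
            "router1": ["router2", "lte_bs"],
            "router2": ["destination"],
            "lte_bs": ["destination"],
        }

        def route(src, dst):
            """Breadth-first search: the hop-by-hop path a message follows."""
            queue, parent = deque([src]), {src: None}
            while queue:
                node = queue.popleft()
                if node == dst:
                    path = []
                    while node is not None:
                        path.append(node)
                        node = parent[node]
                    return path[::-1]
                for nxt in links.get(node, []):
                    if nxt not in parent:
                        parent[nxt] = node
                        queue.append(nxt)

        print(route("hotspot", "destination"))
        # ['hotspot', 'router1', 'router2', 'destination']
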
  • TCOC - TCP congestion online course

    TCP congestion-control code for an online course (a minimal AIMD sketch follows this entry)

    Project leader : Timur FRIEDMAN
    01/01/2013 to 12/2014
    More details here …
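
    A minimal sketch of the AIMD (additive-increase, multiplicative-decrease) congestion-window dynamics such a course typically covers (an illustrative model, not the course's TCP code):

        cwnd, ssthresh = 1.0, 16.0  # congestion window and slow-start threshold
        for rtt in range(12):
            loss = rtt == 7                 # pretend a loss is detected here
            if loss:
                ssthresh = cwnd / 2         # multiplicative decrease
                cwnd = ssthresh
            elif cwnd < ssthresh:
                cwnd *= 2                   # slow start: exponential growth
            else:
                cwnd += 1                   # congestion avoidance: additive increase
            print(f"rtt={rtt:2d} cwnd={cwnd:5.1f} ssthresh={ssthresh:4.1f}")
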

Team : QI

  • QUANGO

    The secure exchange of sensitive information is of crucial importance in our society, as it protects money transfers, commercial transactions, medical data, and the remote control of critical infrastructures (power grid, telecom, etc.). On the security side, Quantum Key Distribution (QKD) is a protocol based on the laws of quantum mechanics that guarantees unconditional security in the sharing of random keys to be used for secure communication (a toy model follows this entry). On the communication side, 5G is the fifth generation of wireless communication, which will soon become the standard for virtually every major telecommunication service provider, as defined by the standards organization 3GPP (3rd Generation Partnership Project). Both QKD and 5G technologies require a satellite network for global coverage of their services. QUANGO will provide a new paradigm to share the satellite infrastructure required for secure communication based on quantum technologies and for 5G communication.

    Project leader : Eleni Diamanti
    07/30/2020 to 12/2023
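
    A toy model of BB84, the canonical QKD protocol, showing only the basis-sifting step (no eavesdropper, channel noise, or error correction; illustrative, not QUANGO code):

        import random

        n = 16
        # Alice encodes random bits in randomly chosen bases ('+' or 'x').
        alice_bits  = [random.randint(0, 1) for _ in range(n)]
        alice_bases = [random.choice("+x") for _ in range(n)]
        # Bob measures each qubit in his own randomly chosen basis.
        bob_bases   = [random.choice("+x") for _ in range(n)]

        # Matching bases reproduce Alice's bit; mismatched ones give a random outcome.
        bob_bits = [b if ab == bb else random.randint(0, 1)
                    for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

        # Sifting: both parties keep only the rounds where the bases agree.
        key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
        print("sifted key:", key)  # on average, half of the rounds survive
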
  • PhD thesis of YAO MA, QENCLAVE project - DIM SIRTEQ

    CNRS acts as the intermediary between the Région (funder) and Sorbonne Université for the funding of the PHD2 project "QEnclave": €100,000 allocated to hiring a doctoral student, plus travel. Management fees are charged to the associated collaboration contract (C19/0811) concluded with the company Veriqloud. Recruitment of Yao MA - S19JRCN003.

    Project leader : Elham Kashefi
    07/03/2019 to 02/2023