Projects APR [Archives]

Team : APR

  • MOPSA - Modular Open Platform for Static Analysis

    The Mopsa project aims at creating methods and tools to make computer software more reliable. Programming errors are pervasive, with results ranging from user frustration to huge economic or human losses. Traditional test-based methods are insufficient to eliminate all errors. The project will develop static analyses able to detect at compile time whole classes of program defects, leveraging the theory of abstract interpretation to design analyses that are approximate (to scale up to large programs) and sound (no defect is missed). Static analysis has enjoyed recent successes: Astrée, an industrial analyzer, was able to prove the absence of run-time errors in Airbus software. But such results are limited to the specific, well-controlled context of critical embedded systems. We wish to bring static analysis to the next level: target larger, more complex and heterogeneous software, and make it usable by engineers to improve general-purpose software. We focus on analyzing open-source software, which is readily available, complex, widespread, and important from an economic standpoint (it is used in many infrastructures and companies) but also from societal and educational ones (promoting the development of verified software for and by citizens). A major target we consider is the set of technologies at the core of the Internet, on which static analysis could be applied to ensure a safer Internet. The scientific challenges we must overcome include designing scalable analyses producing relevant information, supporting popular novel languages (such as Python), and analyzing properties better adapted to the continuous development of software common in open source. At the core of the project is the construction of an open-source static analysis platform. It will serve not only to implement and evaluate the results of the project, but also to create momentum encouraging research in static analysis and hasten its adoption in open-source development communities.
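    The trade-off between soundness and approximation mentioned above can be illustrated with the simplest abstract domain, intervals. The Python sketch below is purely illustrative (the `Interval` class is a hypothetical toy, not part of the Mopsa code base): joining the post-states of two branches yields a single interval that covers every reachable value but may include spurious ones.

    ```python
    # Minimal interval abstract domain: a sound but approximate analysis.
    class Interval:
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            # Interval addition: [a,b] + [c,d] = [a+c, b+d]
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def join(self, other):
            # Over-approximate the union of two branch post-states.
            return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Analysing:  if cond: x = x + 1  else: x = x + 10   with x in [0, 5]
    x = Interval(0, 5)
    then_branch = x + Interval(1, 1)       # [1, 6]
    else_branch = x + Interval(10, 10)     # [10, 15]
    after = then_branch.join(else_branch)  # [1, 15]
    print(after)  # every reachable value is covered (soundness),
                  # but 7..9 are included spuriously (approximation)
    ```

    Soundness means no concrete behavior escapes the abstract result; the cost is that values such as 7, 8 and 9, unreachable in any execution, are reported as possible.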

    Project leader : Antoine MINÉ
    06/01/2016 to 05/2021
    More details here …
  • COVERIF - Combining abstract interpretation and constraint programming to verify critical properties of embedded programs with floating point computations

    Verifying the correctness and robustness of programs and systems is a major challenge in a society which relies more and more on safety-critical systems controlled by embedded software. This issue is even more critical when the computations involve floating-point arithmetic, an arithmetic known for its quite unusual behaviors, and which is increasingly used in embedded software. Note for example the "catastrophic cancellation" phenomenon, where most of the significant digits of a result are cancelled, or numerical sequences whose limit is very different over the real numbers and over the floating-point numbers. A more important problem arises when we want to analyse the relationship between floating-point computations and an "idealized" computation that would be carried out with real numbers, the reference in the design of the program. The point is that for some input values, the control flow over the real numbers can go through one conditional branch while it goes through another one over the floating-point numbers. Certifying that a program, despite some control-flow divergences, computes what it is actually expected to compute with a minimal error is the subject of robustness or continuity analysis. Providing a set of techniques and tools for verifying the accuracy, correctness and robustness of critical embedded software is a major challenge. The aim of this project is to address this challenge by exploring new methods based on a tight collaboration between abstract interpretation (AI) and constraint programming (CP). In other words, the goal is to push the limits of these two techniques to improve accuracy analysis, to enable a more complete verification of programs using floating-point computations, and thus to make critical decisions more robust.
The cornerstone of this project is the combination of the two approaches to increase the accuracy of robustness proofs by using CP techniques, and, where appropriate, to generate non-robust test cases. The goal is to benefit from the strengths of both techniques: CP provides powerful but computationally expensive algorithms to reduce domains to an arbitrary given precision, whereas AI does not provide fine control over domain precision, but has developed many abstract domains that quickly capture program invariants of various forms. Incorporating some CP mechanisms (search tree, heuristics) into abstract domains would make it possible, in the presence of false alarms, to refine the abstract domain with a better accuracy. The first problem to solve is to set the theoretical foundations of an analyser based on two substantially different paradigms. Once the interactions between CP and AI are well formalized, the next issue is to handle constraints of general forms and potentially non-linear abstract domains. Last but not least, an important issue concerns the robustness analysis of systems more general than programs, such as the hybrid systems which model control-command programs. Research results will be evaluated on realistic benchmarks coming from industrial companies, in order to determine their benefits and relevance. For the explored approaches, using realistic examples is a key point, since the proposed techniques often behave acceptably only on given subclasses of problems (if we consider worst-case computational complexity, all these problems are intractable). That is why many solutions are closely connected to the target problems.
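    The control-flow divergence between real and floating-point semantics described above is easy to reproduce. The following Python sketch (purely illustrative, not a COVERIF tool) compares an exact computation over the rationals with the same expression over IEEE-754 doubles, where a small input is absorbed by an addition and a guard takes the other branch:

    ```python
    from fractions import Fraction

    def guarded(x):
        """Return which branch a guard takes when computed in doubles."""
        y = (x + 1.0) - 1.0  # mathematically equal to x over the reals
        return "then" if y == x else "else"

    # Over the reals (exact rational arithmetic), (x + 1) - 1 == x always holds,
    # so the idealized program always takes the "then" branch.
    x_real = Fraction(1, 10**17)
    assert (x_real + 1) - 1 == x_real

    # Over doubles, 1e-17 is below half an ulp of 1.0, so 1e-17 + 1.0
    # rounds to exactly 1.0 and every significant digit of x is lost.
    print((1e-17 + 1.0) - 1.0)  # 0.0
    print(guarded(1e-17))       # "else": the float control flow diverges
    ```

    A robustness analysis must certify that, despite such divergences, the final result stays close to the idealized one; here the cancellation makes the two executions observably different.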

    Project leader : Eric GOUBAULT
    10/01/2015 to 09/2020
    More details here …
  • LCHIP - Low Cost High Integrity Platform

    The LCHIP project aims at greatly facilitating the development of dependable applications at high criticality levels by providing: a complete development environment making it possible to automatically generate and mathematically prove software of limited algorithmic complexity; and a secure, low-cost platform for the execution of these applications, in order to guarantee a maximum level of safety. Transparent integration with domain-specific languages (DSLs) and third-party code-generation chains allows painless deployment into existing development processes and supports exploitation of the technology outside the consortium.

    Project leader : Thierry LECOMTE
    10/01/2016 to 06/2020
    More details here …
  • GDRI-ALEA-NETWORK - ALEA-NETWORK

    ALEA-Network is a GDRI project funded by CNRS and co-signed by four European universities (Vienna, Stockholm, Oxford, Munich). This project gathers researchers from Europe and other countries around the world who are interested in the study of discrete random structures and come from various domains: Computer Science, Discrete Mathematics, Probability, Statistical Physics, Bio-Informatics. In this multidisciplinary context, the objective is to elaborate methods for quantifying randomness and to analyze the statistical properties of fundamental combinatorial structures.

    Project leader : Michèle SORIA & Jean-François MARCKET
    01/01/2015 to 01/2019
    More details here …
  • ASSUME - Affordable Safe And Secure Mobility Evolution

    Future mobility solutions will increasingly rely on smart components that continuously monitor the environment and assume more and more responsibility for convenient, safe and reliable operation. Currently the single most important roadblock for this market is the ability to come up with an affordable, safe multi-core development methodology that allows industry to deliver trustworthy new functions at competitive prices. ASSUME provides a seamless engineering methodology to overcome this roadblock. The problem is addressed on the constructive and on the analytic side. For efficient construction and synthesis of embedded systems, the project provides new tools, standards and methodologies to cover most of the challenges by design. In addition, ASSUME provides a well-integrated sound static analysis solution that allows proving the absence of problems even in a multi-core environment. New algorithms will be integrated into exploitable tools. New interoperability standards and requirements-formalization standards will facilitate cooperation between different market players. The ASSUME consortium includes leading European industry partners for mobility solutions, tool and service providers for embedded system development, as well as leading research institutes for static analysis for model-driven and traditional embedded systems development.

    Project leader : Udo GLEICH
    09/01/2015 to 08/2018
    More details here …
  • UCF - Ubiquitous Contents Framework

    Project leader : Jonathan RIVALAN
    10/01/2014 to 11/2017
    More details here …
  • MAGNUM - Méthodes Algorithmiques pour la Génération aléatoire Non Uniforme: Modèles et applications

    The central theme of the MAGNUM project is the elaboration of complex discrete models that are of broad applicability in several areas of computer science. A major motivation for the development of such models is the design and analysis of efficient algorithms dedicated to simulation of large discrete systems and random generation of large combinatorial structures. Another important motivation is to revisit the area of average-case complexity theory under the angle of realistic data models. The project proposes to develop the general theory of complex discrete models, devise new algorithms for random generation and simulation, as well as bridge the gap between theoretical analyses and practically meaningful data models.
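    As a toy illustration of random generation of combinatorial structures (an illustrative sketch, not a MAGNUM deliverable), the classical recursive method draws a uniformly random binary tree with n internal nodes by choosing the size of the left subtree with probability proportional to Catalan-number counts:

    ```python
    import random
    from math import comb

    def catalan(n):
        """Number of binary trees with n internal nodes."""
        return comb(2 * n, n) // (n + 1)

    def random_tree(n, rng):
        """Uniformly random binary tree with n internal nodes, as nested
        tuples; leaves are None (the recursive method)."""
        if n == 0:
            return None
        # P(left subtree has size k) = catalan(k) * catalan(n-1-k) / catalan(n)
        r = rng.randrange(catalan(n))
        k = 0
        while True:
            block = catalan(k) * catalan(n - 1 - k)
            if r < block:
                break
            r -= block
            k += 1
        return (random_tree(k, rng), random_tree(n - 1 - k, rng))

    rng = random.Random(0)
    t = random_tree(10, rng)  # one of the catalan(10) = 16796 trees, uniformly
    ```

    Since the counting sequence grows exponentially, such exact-counting samplers become expensive for large sizes, which is precisely where the project's interest in efficient (and non-uniform) generation algorithms comes in.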

    Project leader : Michèle SORIA
    06/30/2010 to 06/2015
    More details here …
  • CERCLES2 - Compositional Certification of Critical and Safe embedded Softwares

    The CERCLES project is focused on the certification of safety-critical embedded software. Its aim is to reduce the cost of this activity. The certification of safety-critical embedded software is a costly industrial process. It has to follow a very long, rigorous and standardized process. For instance, the DO-178C standard of civil avionics describes the set of specification (High Level Requirements), design (Architecture and Low Level Requirements), coding and verification activities that must be fulfilled to obtain certification. Modern software development relies on assembling components. In such a case the certification standard, in particular because testing is required in the verification activity, leads to covering the same ground twice: a first time during the development process of the components, and a second time when the components are reused in another system. The broad thrust of the CERCLES project is to capitalize on component certification in order to avoid this double cost. The problem is then the existence of trustworthy methods and tools for assembling components that are suitable for industry and capable of designing system architectures entailing their certification. Programming languages have already proposed methods and tools (typing, objects, modules, contracts) to increase the safety of reusing software components. But they did not address the specific point of safety-critical embedded software, for which a verification activity relying on testing (and its coverage measurement) is absolutely required. We must ensure that if two components A and B have been previously certified, then we have a way to verify the assembly A-B. This entails that we are able to describe faithfully the units A and B as "components" offering some guaranteed functional properties and some possibility of reuse, and the composition A-B as an assembly preserving the guarantees acquired by the components and realizing some other specified functionalities. 
On this basis we can expect to be able to establish both the preservation of correctness of the individually reused components and the correctness of the composition with regard to its own requirements.

    Project leader : Philippe BAUFRETON
    04/01/2011 to 09/2014
    More details here …
  • PWD - Programming the Diffuse Web

    The goal of the PWD project is to propose new programming languages for developing new applications on the Web.

    Project leader : Manuel SERRANO
    11/01/2009 to 10/2013
    More details here …
  • OPENGPU - OPENGPU

    GPUs are becoming an increasingly promising solution to address the growing need for computing power in numerical applications, new design methods and numerical simulations related to research work. Since 2005, the end of frequency scaling in general-purpose CPUs has led to the multiplication of cores, rendering these solutions more difficult to use. Meanwhile, GPUs have evolved into less specialized architectures, making their use for computation and data processing feasible. This evolution came with the emergence of unified GPU architectures and, in 2008, with the finalization of the OpenCL standard, offering an interesting perspective for exploiting these architectures. The computing power and performance-per-watt ratio of GPUs being superior to those of standard CPUs, the use of GPUs represents an opportunity to obtain more computing power at a lower energy cost. The Open GPU project takes advantage of this opportunity with a triple goal: building up an integrated and open platform of open-source tools in order to assist code parallelization; testing the parallelization gains on industrial and academic applications; and designing the appropriate hardware and software architectures for exploiting these new computing powers and improving energy consumption. The Open GPU project is composed of three complementary parts expected to synergistically feed each other.

    Project leader : Eric MAHE
    01/01/2010 to 01/2012
    More details here …