- Computer Science Laboratory

Projects [Archives]

Team : ALMASTY

Algebraic cryptanalysis for post-quantum cryptography

Project Leader : Charles Bouillaguet

11/01/2020 → 06/2025

https://anr.fr/Projet-ANR-20-ASTR-0011

AlmaCrypt - Algorithmic and Mathematical Cryptology

AlmaCrypt (Algorithmic and Mathematical Cryptology) is an ERC Advanced research project conducted by Antoine Joux from 2016 to 2022. It aimed at strengthening information security by challenging current assumptions in complexity theoretic cryptography through an advanced algorithmic analysis.

Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.

The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis. In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI helped to show that in some cases, the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring.

We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency. We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithmic tools.
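As a concrete instance of one of the assumptions mentioned above, the subset-sum problem asks for a sub-collection of integers reaching a target sum. The sketch below shows the classic meet-in-the-middle approach (a textbook baseline in roughly 2^(n/2) operations, not one of the project's own algorithms); the input values are invented for illustration.

```python
# Toy meet-in-the-middle subset-sum solver (Horowitz-Sahni style baseline).
from itertools import combinations

def subset_sum_mitm(weights, target):
    """Return a subset of `weights` summing to `target`, or None."""
    half = len(weights) // 2
    left, right = weights[:half], weights[half:]

    # Enumerate all subset sums of the left half once.
    left_sums = {}
    for r in range(len(left) + 1):
        for combo in combinations(left, r):
            left_sums.setdefault(sum(combo), combo)

    # For each subset of the right half, look up the missing complement.
    for r in range(len(right) + 1):
        for combo in combinations(right, r):
            rest = target - sum(combo)
            if rest in left_sums:
                return left_sums[rest] + combo
    return None

print(subset_sum_mitm([13, 5, 27, 8, 19, 2], 34))
```

The dictionary trades memory for time: both halves cost 2^(n/2) enumeration instead of 2^n for the naive search.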

Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.

In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.

Project Leader : Antoine Joux

01/01/2016 → 06/2022

https://www.almacrypt.eu

Team : APR

MOPSA - Modular Open Platform for Static Analysis

MOPSA (Modular Open Platform for Static Analysis) is an ERC Consolidator research project conducted by Antoine Miné from 2016 to 2021. It aimed at developing an open-source platform for computer program verification based on the latest developments in abstract interpretation-based static program analysis. In short, MOPSA's goal is to create methods and tools to make computer software more reliable.

Programming errors are pervasive with results ranging from user frustration to huge economical or human losses. Traditional test-based methods are insufficient to eliminate all errors. The project will develop static analyses able to detect at compile-time whole classes of program defects, leveraging the theory of abstract interpretation to design analyses that are approximate (to scale up to large programs) and sound (no defect is missed). Static analysis has enjoyed recent successes: Astrée, an industrial analyzer I have coauthored, was able to prove the absence of run-time error in Airbus software. But such results are limited to the specific, well-controlled context of critical embedded systems. I wish to bring static analysis to the next level: target larger, more complex and heterogeneous software, and make it usable by engineers to improve general-purpose software.
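As a toy illustration of sound-but-approximate analysis by abstract interpretation (not MOPSA's actual implementation; the program being "analyzed" is invented), the classic interval domain over-approximates each variable by a range and merges branches with a join:

```python
# Minimal interval abstract domain: each variable is tracked as [lo, hi].
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        # Sound abstraction of addition: add the bounds.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other: "Interval") -> "Interval":
        # Least upper bound: merges the two branches of an `if`.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

# x in [0, 10]; one branch computes x+1, the other x+2. After the join the
# analysis soundly concludes x' in [1, 12] -- no behavior is missed, at the
# price of an over-approximation (e.g. 1.5 is included but never occurs).
x = Interval(0, 10)
then_branch = x + Interval(1, 1)
else_branch = x + Interval(2, 2)
print(then_branch.join(else_branch))
```

The over-approximation is what makes the analysis scale; soundness means every concrete execution stays inside the computed ranges.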

We focus on analyzing open-source software, which is readily available, complex, widespread, and important from an economical standpoint (it is used in many infrastructures and companies) but also from societal and educational ones (promoting the development of verified software for and by citizens). A major target we consider is the set of technologies at the core of the Internet, to which static analysis could be applied to ensure a safer Internet. The scientific challenges we must overcome include designing scalable analyses producing relevant information, supporting novel popular languages (such as Python), and analyzing properties more adapted to the continuous development of software common in open source. At the core of the project is the construction of an open-source static analysis platform. It will serve not only to implement and evaluate the results of the project, but also to create momentum encouraging research in static analysis and hasten its adoption in open-source development communities.

Project Leader : Antoine Miné

06/01/2016 → 03/2024

https://mopsa.lip6.fr


COVERIF - Combining abstract interpretation and constraint programming to verify critical properties of embedded programs with floating point computations

Verifying correctness and robustness of programs and systems is a major challenge in a society which relies more and more on safety-critical systems controlled by embedded software. This issue is even more critical when the computations involve floating-point arithmetic, an arithmetic known for its quite unusual behaviors, and which is increasingly used in embedded software. Consider for example the "catastrophic cancellation" phenomenon, where most of the significant digits of a result are cancelled, or numerical sequences whose limit is very different over the real numbers and over the floating-point numbers. An even more important problem arises when we want to analyse the relationship between floating-point computations and an "idealized" computation that would be carried out with real numbers, the reference in the design of the program. The point is that for some input values, the control flow over the real numbers can go through one conditional branch while it goes through another one over the floating-point numbers. Certifying that a program, despite some control-flow divergences, computes what it is actually expected to compute with a minimal error is the subject of robustness or continuity analysis. Providing a set of techniques and tools for verifying the accuracy, correctness and robustness of critical embedded software is a major challenge. The aim of this project is to address this challenge by exploring new methods based on a tight collaboration between abstract interpretation (AI) and constraint programming (CP). In other words, the goal is to push the limits of these two techniques to improve accuracy analysis, to enable a more complete verification of programs using floating-point computations, and thus to make critical decisions more robust.
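The two floating-point phenomena described above can be reproduced in a few lines of double-precision arithmetic; this is an illustration of the pitfalls themselves, not of the project's analyses:

```python
# Two classic floating-point pitfalls, in IEEE 754 double precision.

def cancellation(a: float, b: float) -> float:
    """(a + b) - a equals b over the reals, but not always over floats."""
    return (a + b) - a

# Catastrophic cancellation: b is entirely absorbed by the much larger a,
# because the spacing between doubles near 1e17 is larger than 1.
print(cancellation(1e17, 1.0))  # 0.0, whereas the real result is 1.0

def which_branch(x: float, y: float, z: float) -> str:
    """Control flow may diverge between real and float semantics."""
    if x + y > z:  # false over the reals when x + y == z exactly
        return "then"
    return "else"

# Over the reals 0.1 + 0.2 == 0.3, so the `else` branch is taken; over
# doubles 0.1 + 0.2 == 0.30000000000000004 > 0.3, taking `then` instead.
print(which_branch(0.1, 0.2, 0.3))  # "then"
```

Proving that such divergences do not change what the program ultimately computes is exactly the robustness question the project studies.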
The cornerstone of this project is the combination of the two approaches to increase the accuracy of robustness proofs by using CP techniques and, where appropriate, to generate non-robust test cases. The goal is to benefit from the strengths of both techniques: CP provides powerful but computationally expensive algorithms to reduce domains to an arbitrary given precision, whereas AI does not provide fine control over domain precision but has developed many abstract domains that quickly capture program invariants of various forms. Incorporating some CP mechanisms (search trees, heuristics) into abstract domains would make it possible, in the presence of false alarms, to refine the abstract domain with better accuracy. The first problem to solve is to set the theoretical foundations of an analyser based on two substantially different paradigms. Once the interactions between CP and AI are well formalized, the next issue is to handle constraints of general forms and potentially non-linear abstract domains. Last but not least, an important issue concerns the robustness analysis of systems more general than programs, such as the hybrid systems which model control-command programs. Research results will be evaluated on realistic benchmarks coming from industrial companies, in order to determine their benefits and relevance. For the explored approaches, using realistic examples is a key point, since the proposed techniques often behave acceptably only on given subclasses of problems (in terms of worst-case computational complexity, all these problems are intractable). That is why many solutions are closely connected to the target problems.

Project Leader : Eric GOUBAULT

10/01/2015 → 09/2020

https://www.anr-coverif.fr/

LCHIP - Low Cost High Integrity Platform

Project LCHIP aims to greatly facilitate the development of safe applications at high criticality levels by providing:

- a complete development environment allowing software of limited algorithmic complexity to be generated and proved mathematically and automatically,

- a secure, low-cost platform for executing these applications, in order to guarantee a maximal level of safety.

Transparent integration with domain-specific languages (DSLs) and third-party code generators allows painless deployment into existing development processes and supports the exploitation of the technology outside the consortium.

Project Leader : Thierry LECOMTE

10/01/2016 → 06/2020

http://www.clearsy.com/2016/10/4260/

GDRI-ALEA-NETWORK - ALEA-NETWORK

ALEA-Network is a GDRI project funded by CNRS and co-signed by four European universities (Wien, Stockholm, Oxford, Munich). The project gathers researchers from Europe and other countries around the world who are interested in the study of discrete random structures and who come from various domains: computer science, discrete mathematics, probability, statistical physics, bio-informatics. In this multidisciplinary context, the objective is to elaborate methods for quantifying randomness and analyzing the statistical properties of fundamental combinatorial structures.

Project Leader : Michèle SORIA & Jean-François MARCKET

01/01/2015 → 01/2019

http://aleanetwork.net/

ASSUME - Affordable Safe And Secure Mobility Evolution

Future mobility solutions will increasingly rely on smart components that continuously monitor the environment and assume more and more responsibility for a convenient, safe and reliable operation. Currently the single most important roadblock for this market is the ability to come up with an affordable, safe multi-core development methodology that allows industry to deliver trustworthy new functions at competitive prices. ASSUME provides a seamless engineering methodology to overcome this roadblock. The problem is addressed on the constructive and on the analytic side. For efficient construction and synthesis of embedded systems, the project provides new tools, standards and methodologies to cover most of the challenges by design. In addition, ASSUME provides a well-integrated sound static analysis solution that allows proving the absence of problems even in a multi-core environment. New algorithms will be integrated in exploitable tools. New interoperability standards and requirements formalization standards will facilitate cooperation between different market players. The ASSUME consortium includes leading European industry partners for mobility solutions, tool and service providers for embedded system development as well as leading research institutes for static analysis for model-driven and traditional embedded systems development.

Project Leader : Udo GLEICH

09/01/2015 → 08/2018

http://assume-project.eu/

UCF - Ubiquitous Contents Framework

Project Leader : Jonathan RIVALAN

10/01/2014 → 11/2017

http://www.ubiquitus-content-framework.fr/

MAGNUM - Méthodes Algorithmiques pour la Génération aléatoire Non Uniforme: Modèles et applications

The central theme of the MAGNUM project is the elaboration of complex discrete models that are of broad applicability in several areas of computer science. A major motivation for the development of such models is the design and analysis of efficient algorithms dedicated to simulation of large discrete systems and random generation of large combinatorial structures.

Another important motivation is to revisit the area of average-case complexity theory under the angle of realistic data models. The project proposes to develop the general theory of complex discrete models, devise new algorithms for random generation and simulation, as well as bridge the gap between theoretical analyses and practically meaningful data models.
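One well-known technique in this area is Boltzmann sampling of combinatorial structures. The sketch below is illustrative only (the parameter value and tuning are ours, not the project's): it draws plane binary trees counted by total nodes, where the generating function satisfies B(x) = x + x·B(x)^2, emitting a leaf with probability x/B(x) and an internal node otherwise.

```python
# Illustrative Boltzmann sampler for plane binary trees (leaves + binary
# internal nodes), drawing a tree of size n with probability ~ x^n.
import math
import random

def B(x: float) -> float:
    # Generating function of binary trees counted by nodes: B = x + x*B^2,
    # solved as the smaller root; real for x <= 1/2.
    return (1 - math.sqrt(1 - 4 * x * x)) / (2 * x)

def sample(x: float, rng: random.Random):
    """With probability x/B(x) emit a leaf, otherwise an internal node
    with two independently sampled subtrees (note x/B + x*B = 1)."""
    if rng.random() < x / B(x):
        return "leaf"
    return (sample(x, rng), sample(x, rng))

def size(t) -> int:
    return 1 if t == "leaf" else 1 + size(t[0]) + size(t[1])

rng = random.Random(42)
t = sample(0.4, rng)  # subcritical x < 1/2 keeps the expected size finite
print(size(t))
```

Since every internal node has exactly two children, any sampled tree has one more leaf than internal nodes, so its total size is always odd.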

Project Leader : Michèle SORIA

06/30/2010 → 06/2015

http://www.lix.polytechnique.fr/~rossin/ANR/Magnum/www/

CERCLES2 - Compositional Certification of Critical and Safe embedded Softwares

The CERCLES project is focused on the certification of safety-critical embedded software. Its aim is to reduce the cost of this activity.

The certification of safety-critical embedded software is a costly industrial process. It has to follow a very long, rigorous and standardized process. For instance, the DO-178C standard of civil avionics describes the set of specification (High Level Requirements), design (Architecture and Low Level Requirements), coding and verification activities that must be fulfilled to obtain certification.

Modern software development relies on assembling components. In such a case, the certification standard, in particular because testing is required in the verification activity, leads to covering the same ground twice: a first time during the development process of the components and a second time when the components are reused in another system.

The broad thrust of the CERCLES project is to capitalize on component certification in order to avoid the cost of covering the same ground twice. The problem is then the existence of trustworthy methods and tools for assembling components that are suitable for industry and capable of designing system architectures amenable to certification.

Programming languages have long proposed methods and tools (typing, objects, modules, contracts) to increase the safety of reusing software components. But they did not address the specific case of safety-critical embedded software, for which a verification activity relying on testing (and its coverage measurement) is absolutely required. We must ensure that if two components A and B have been previously certified, then we have a way to verify the assembly A-B. This entails that we are able to describe faithfully:

- the units A and B as "components" offering some guaranteed functional properties and some possibility of reuse;

- the composition A-B as an assembly preserving the guarantees acquired by the components and realizing some other specified functionalities.

On this basis we can expect to be able to establish both the preservation of correctness of the reused components individually and the correctness of the composition with regard to its own requirements.

Project Leader : Philippe BAUFRETON

04/01/2011 → 09/2014

http://www.algo-prog.info/cercles

PWD - Programming the Diffuse Web

The goal of the PWD project is to propose new programming languages for developing new applications on the Web.

Project Leader : Manuel SERRANO

11/01/2009 → 10/2013

http://www-sop.inria.fr/indes/pwd/

OPENGPU - OPENGPU

GPUs are becoming a more and more promising solution to address the growing need for computing power in numerical applications, new design methods and numerical simulations related to research work. Since 2005, the stagnation of clock frequencies in general-purpose CPUs has led to the multiplication of cores, making these solutions more difficult to use. Meanwhile, GPUs have evolved into less specialized architectures, making their use for computation and data processing feasible. This evolution came with the emergence of the GPU's unified architecture and, in 2008, with the finalization of the OpenCL standard, offering an interesting perspective for exploiting this architecture. The computing power and the power/energy-consumption ratio of GPUs being superior to those of standard CPUs, the use of GPUs represents an opportunity to obtain more power at a lower energy cost.

The OpenGPU project takes advantage of this opportunity with a threefold goal:

- Building an integrated and open platform of open-source tools to assist code parallelization

- Testing the parallelization gains on industrial and academic applications

- Designing the appropriate hardware and software architectures for exploiting these new computing powers and improving energy consumption

The Open GPU project is composed of three complementary parts expected to synergistically feed each other.

Project Leader : Eric MAHE

01/01/2010 → 01/2012

http://opengpu.net/

Team : BD


EPIQUE - Reconstructing the long-term evolution of science through large-scale analysis of science productions

The evolution of scientific knowledge is directly related to the history of humanity. Document archives and bibliographic sources like the "Web Of Science" or PubMed constitute a valuable source for the analysis and reconstruction of this evolution. The proposed project takes as its starting point the contributions of D. Chavalarias and J.-P. Cointet on the analysis of the dynamics of evolving corpora and the automatic construction of "phylomemetic" topic lattices (by analogy with the genealogical trees of natural species). Currently existing tools are limited to the processing of medium-sized collections and to non-interactive usage. The expected project outcome is situated at the crossroads between computer science and the social sciences. Our goal is to develop new high-performance tools for building phylomemetic maps of science by exploiting recent technologies for parallelizing tasks and algorithms over complex and voluminous data. These tools are designed and validated in collaboration with experts in the philosophy and history of science over large scientific archives.

Keywords : quantitative epistemology, evolution of science, topic detection, temporal alignment, big data processing, data science, big data

Project Leader : Bernd AMANN

01/01/2017 → 06/2021

http://www-bd.lip6.fr/wiki/site/recherche/projets/epique/start

ROSES - Really Open, Simple and Efficient Syndication

Social web portals (Twitter, Facebook), web blogs, online journals, etc. produce continuous streams of text items (news, messages). User subscriptions to these streams can be modeled as queries which continuously filter and rank new information items. A major challenge is then to efficiently process millions of such subscription queries. Within the ROSES ANR project (2008-2012) on RSS feed aggregation and filtering, we worked on multi-query optimisation (PhD of J. Creus), on efficient refresh strategies for dynamic RSS feeds (PhD of R. Horincar), and on continuous top-k query processing (PhD of N. Vouzoukidou).
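The subscription-matching idea can be sketched as follows (an illustrative toy, not the ROSES implementation; the subscribers and keyword sets are invented): an inverted index maps keywords to subscribers, and each incoming item is dispatched to every subscriber whose query it matches.

```python
# Toy continuous filtering of a text-item stream against subscriptions.
from collections import defaultdict

# Hypothetical subscriptions: each subscriber follows a set of keywords.
subscriptions = {
    "alice": {"rss", "streams"},
    "bob": {"databases"},
}

# Inverted index: keyword -> subscribers who must receive matching items.
index = defaultdict(set)
for user, keywords in subscriptions.items():
    for kw in keywords:
        index[kw].add(user)

def dispatch(item: str) -> set:
    """Return the subscribers whose query matches a word of the item."""
    matched = set()
    for word in set(item.lower().split()):
        matched |= index.get(word, set())
    return matched

print(dispatch("New research on RSS streams"))  # {'alice'}
```

The point of the index is that each arriving item touches only the queries sharing one of its words, rather than being tested against millions of subscriptions one by one.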

Project Leader : Bernd AMANN

01/01/2008 → 12/2015

http://www-bd.lip6.fr/wiki/roses/start

Team : DECISION

Lemon - library for easily modeling and operating on networks

Lemon is a GUI toolkit written in C++ for manipulating graphical models. It relies on aGrUM both for graph theoretic algorithms and for computations within the graphical models (e.g., inference, learning).

Project Leader : Christophe GONZALES

→ 12/2021

http://www-desir.lip6.fr/~gonzales/research/lemon

Team : LFI

IFP-in-RL - Interpretable-by-design Fuzzy Policy in Reinforcement Learning

In the general context of the field of eXplainable Artificial Intelligence (XAI), the IFP-in-RL project aims to propose a method for the automatic construction of a controller for a system, such as a drone, that takes the interpretability constraint into account in its very design. This project takes place within the framework of systems based on fuzzy rules, which, since their introduction, aim to facilitate the expression of knowledge in a linguistic form that is natural for the user and easily understandable by a human. Such a knowledge representation is an excellent way to promote human interaction with the computer system and to improve users' understanding of how it works, thus offering the possibility of making its behavior transparent and easily validated. In the literature, different approaches exist to build or fine-tune a fuzzy rule base when designing a system, but they generally suffer from the drawback of not incorporating a specific optimization of interpretability.
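A minimal illustration of such an interpretable fuzzy rule base follows; the linguistic terms, membership functions and rule outputs are invented for the example, not taken from the project.

```python
# Two linguistic terms and two readable rules for a (hypothetical) drone
# controller mapping distance-to-obstacle (metres) to speed (m/s).

def ramp_down(a: float, b: float):
    """Membership function: fully true below a, fading to false at b."""
    return lambda x: 1.0 if x <= a else 0.0 if x >= b else (b - x) / (b - a)

def ramp_up(a: float, b: float):
    """Membership function: false below a, fully true above b."""
    return lambda x: 0.0 if x <= a else 1.0 if x >= b else (x - a) / (b - a)

near = ramp_down(2.0, 8.0)
far = ramp_up(2.0, 8.0)

# Interpretable rules, combined by a weighted average (Sugeno-style):
#   IF distance IS near THEN speed = 1.0 (slow)
#   IF distance IS far  THEN speed = 5.0 (fast)
def speed(distance: float) -> float:
    w_near, w_far = near(distance), far(distance)
    return (w_near * 1.0 + w_far * 5.0) / (w_near + w_far)

print(speed(5.0))  # both rules fire equally: 3.0
```

Each rule reads as a sentence a human can check, which is precisely the property the project wants to preserve while the rule base is being learned.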

In this project, an innovative methodology is introduced for the design of such systems. This methodology is based on the implementation of a reinforcement learning approach using interpretability metrics. The objective here is to integrate the consideration and optimization of the desired interpretability during the learning itself, and not a posteriori as many methods currently do in the field of XAI.

To achieve this, the IFP-in-RL project plans, upstream, a complete study, both theoretical and experimental, of interpretability metrics, covering existing numerical criteria as well as user needs. This will involve proposing a taxonomy of existing metrics and, if necessary, defining new measures to complete the previous ones and allow their exploitation in original reinforcement learning algorithms. An original feature of this project is to integrate a qualitative assessment, carried out on a human panel, of the proposed metrics but also of the rule bases obtained at the end of reinforcement learning.

In application terms, the objective of the IFP-in-RL project is to implement these proposals for piloting a drone, navigating in complete autonomy to ensure a mission consisting of flying over points of interest and taking pictures, from data provided by a simulator.

Project Leader : Christophe MARSALA

01/01/2023 → 06/2025

https://anr.fr/Projet-ANR-22-ASTR-0032

AXA Research Funds JRI

Interpretability for human-friendly machine learning models

Project Leader : Christophe Marsala

03/01/2017 → 12/2023

Team : MOCAH

Hubble - HUman oBservatory Based on anaLysis of e-LEarning traces

The objective of the Hubble ANR project is to create a national observatory for building and sharing massive data-analysis processes based on the traces left in e-learning environments. Hubble will make it possible to analyze and explain teaching and learning phenomena in these environments.

Project Leader : Vanda LUENGO and François BOUCHET

01/01/2015 → 01/2025

http://hubblelearn.imag.fr/?lang=fr

MindMath - MindMath

MindMath (funded by BPI and the Île-de-France region) is a gamified, adaptive platform for learning mathematics in middle school. The project involves several industrial partners (Cabrilog, Tralalere, Domoscio and Bayard) and academic partners (LDAR - Université Paris Diderot and LIP6 - Sorbonne Université).

We develop algorithms that decide, based on students' activities within mathematical problem-solving tasks, which feedback is best suited to help them progress in their learning. The decision relies both on an ontology built with experts in mathematics didactics and on machine-learning approaches. The search for optimal feedback is carried out by reinforcement learning, with a reward signal based on students' success in the activities. These proposals are being experimented with in various school and extracurricular contexts.
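One simple way to realize reward-driven feedback selection is sketched below for illustration; this is a generic epsilon-greedy bandit with invented feedback types, not necessarily the algorithm used in MindMath.

```python
# Epsilon-greedy selection of feedback types, rewarded by student success.
import random

class FeedbackSelector:
    def __init__(self, feedback_types, epsilon=0.1, rng=None):
        self.epsilon = epsilon
        self.rng = rng or random.Random()
        self.counts = {f: 0 for f in feedback_types}
        self.values = {f: 0.0 for f in feedback_types}  # mean reward so far

    def choose(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.counts))  # explore
        return max(self.values, key=self.values.get)   # exploit best so far

    def update(self, feedback, success: bool):
        # Incremental mean of the binary success signal for this feedback.
        self.counts[feedback] += 1
        n = self.counts[feedback]
        self.values[feedback] += (float(success) - self.values[feedback]) / n

selector = FeedbackSelector(["hint", "worked_example", "encouragement"],
                            rng=random.Random(0))
chosen = selector.choose()
selector.update(chosen, success=True)  # reward = did the student succeed?
```

In a real tutoring setting the reward would come from the student's subsequent performance, and the choice would typically also condition on the learner's state rather than being a single global bandit.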

Project Leader : Amel Yessad

01/01/2019 → 12/2024

https://www.mindmath.education

IECARE - IECARE

IECARE is a research project funded by the French National Research Agency (ANR). Its aim is to produce fundamental and operational knowledge on computer science and its teaching and learning in compulsory schooling. This multi-disciplinary project brings together researchers in computer science and the humanities and social sciences (educational sciences, didactics, learning psychology, sociology). The research follows three structuring themes: analyzing the representations and practices of teachers and students; modeling, designing and modifying pedagogical scenarios and resources to support teaching and learning practices; studying the support frameworks set up by and for IT teachers and trainers.

Project Leader : Mathieu Muratet

01/01/2019 → 12/2024

https://iecare.lip6.fr/

RecoMOOC - Recommending people to people in MOOCs

RecoMOOC aims at reducing attrition and improving the learning experience for MOOC learners through interaction with other students they can talk to and work with, identified automatically through a peer recommender system analyzing each learner's individual factors and progression.

Project Leader : François BOUCHET

01/01/2014 → 12/2020

MacCoy Critical - Learning Critical situations with enriched Virtual Environment

This project aims to study and improve training systems that use simulation and virtual environments in two domains with high public-health and safety stakes (medicine and car driving). More specifically, it aims to deepen (a) knowledge about the non-technical skills involved in managing critical situations and (b) the knowledge and tools needed to design simulation and virtual-reality environments that foster learning. In order to promote more generic tools, the project chose to work on two learning domains (medicine and car driving).

Project Leader : Vanda LUENGO

01/01/2015 → 12/2018

http://anrmaccoy.fr/

NeoPraEval - New tools for new practices in evaluating and teaching mathematics

NeoPraEval is an ANR-funded research project which aims at providing tools for teachers to deal with heterogeneity in learning, through automatic diagnostic and evaluation systems that they can use in their classes, as well as appropriate resources to address the needs identified in students.

Project Leader : François BOUCHET

02/01/2014 → 01/2018

http://eda.recherche.parisdescartes.fr/projet-anr-neopraeval/

COMBIEN? - HOW MANY?

The "COMBIEN?" software (meaning "HOW MANY?" in French) is a pedagogical system to help students learn combinatorics. The student learns to solve combinatorics problems using mathematical language and concepts. The system offers several interfaces, each corresponding to a class of problems and a solving method. Each interface offers a set of exercises. For each exercise, the student builds a solution, guided by an interface with appropriate tools and an incremental detection of errors. You can find the details on our site (only in French, sorry).

Project Leader : Hélène GIROIRE

01/01/1992 → 12/2016

https://www.lip6.fr/combien

Team : MoVe

PARDI

PARameterized DIstributed Systems Verification

Project Leader : Pascal Poizat

Partners : INPT/IRIT (leader), Inria/LORIA, UPS/LRI, UPMC/LIP6

01/10/2016 → 12/2021

REVaMP

Round-trip Engineering and Variability Management platform and process, French cluster

Project Leader : Tewfik Ziadi

07/26/2016 → 12/2020

https://itea3.org/project/revamp2.html

PolyORB - PolyORB: A schizophrenic middleware

PolyORB is a joint project between LIP6 and Télécom Paris.

Project Leader : Fabrice KORDON

01/01/2001 → 01/2020

https://github.com/AdaCore/PolyORB

Monoge

New-Generation Modeling Workbench

Project Leader : Reda Bendraou

03/18/2013 → 12/2017

MODELPLEX - MODELing solution for comPLEX software systems

The European IST MODELPLEX project has three major objectives:

- Objective A: Develop an open solution for complex systems engineering improving quality and productivity;

- Objective B: Lead its industrialisation;

- Objective C: Ensure its successful adoption by the industry.

Project Leader : Marie-Pierre GERVAIS

09/01/2006 → 02/2010

https://www.modelplex.org/

Flex-eWare - Platform to produce customized execution infrastructures

This is an ANR funded project.

Project Leader : Fabrice KORDON

02/01/2007 → 01/2010

ModFact

ModFact is an Open Source project for the MDA™. More precisely, our goal is to provide a framework for building applications according to MDA™ concepts. ModFact is composed of several subprojects: the Repository project generates MOF repositories; the QVT project executes model transformations; the Model Bus project enables interoperability of MDA-related tools via the concept of an MDA service.

Project Leader : Xavier BLANC

09/01/2002 → 12/2005

http://modfact.lip6.fr

Team : NPA

DataTweet - A public service for opportunistic communications

The DataTweet project proposes to explore the idea of a ubiquitous public data service for transmitting short messages in a similar way to Twitter. Any user or source of the service will be able to send a short message at a very low rate to some destination address over various access networks: open 802.11 hotspots, LTE base stations, car-to-infrastructure 802.11p stations. The user or a device sends its message to the nearest wireless network at hand (for instance an 802.11 hotspot), which then forwards it over multiple hops through the infrastructure network to the destination.
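The forwarding idea can be sketched as a breadth-first relay over an infrastructure graph; the node names and topology below are illustrative, not the project's actual protocol:

```python
from collections import deque

def forward(topology, entry_point, destination):
    """Return the hop-by-hop path from the entry access network to the
    destination, using breadth-first search over the infrastructure graph."""
    queue = deque([[entry_point]])
    visited = {entry_point}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == destination:
            return path
        for neighbor in topology.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # destination unreachable

# Example: a device's message enters at an 802.11 hotspot and is relayed
# through a backbone router to an LTE base station.
topology = {
    "hotspot": ["router1"],
    "router1": ["router2", "lte_bs"],
    "router2": ["lte_bs"],
}
print(forward(topology, "hotspot", "lte_bs"))  # ['hotspot', 'router1', 'lte_bs']
```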

Project Leader : Marcelo DIAS DE AMORIM

03/01/2013 → 09/2017

https://datatweet.imag.fr

TCOC - TCP congestion online course

TCP code for online course

Project Leader : Timur FRIEDMAN

01/01/2013 → 12/2014

http://www.forgebox.eu/fb/preview_course.php?course_id=60

Team : PolSys

Singularities and stability of sensor-based control (Singularités et stabilité des asservissements référencés capteurs)

Project Leader : Jean-Charles Faugere

10/01/2018 → 06/2025

https://anr.fr/Projet-ANR-18-CE33-0011

Team : QI

Applications and Hardware for Photonic Quantum Information Processing

AppQInfo will provide a world class training in photonic Quantum Information Processing (pQIP), and prepare an excellent cohort of Early Stage Researchers (ESRs) to become the future R&D staff of Europe’s emerging markets in this area. Quantum Information Processing (QIP) is a key ingredient in Europe’s future Quantum Communication Infrastructure; it underpins quantum communications and quantum simulations, the first two pillars of the H2020 Quantum Flagship. QIP will revolutionise information technology, providing higher quality, speed and unconditional security, not possible with classical technologies. AppQInfo focusses on QIP in state-of-the-art integrated photonics, one of five Key Enabling Technologies for European Industry. Integrated photonics permits the creation, manipulation and readout of photonic quantum states in a highly controlled manner, with high speeds and low losses.

The broad objectives of AppQInfo are: to create an excellent training of ESRs in the field of pQIP that is both interdisciplinary and intersectoral; to develop innovative, entrepreneurial ESRs with great career prospects; to maximise the exploitation and dissemination of our research; to engage the public through several outreach activities; to consolidate a wide expertise in the field of pQIP; to create a long-lasting collaboration network of top-class research units and industrial entities.

Together, our 15 interdisciplinary research projects will work towards feasible long-distance quantum communications from urban-scale networks to satellite-based systems using various data encoding; study quantum photonic circuits towards their quantum transport properties and quantum transforms they implement; exploit these platforms for machine learning applications, such as building all-optical artificial neural networks, and applying them for quantum simulations; develop enabling technologies of sources and detectors of multiphoton quantum states and polaritonic logic gates.

Project Leader : Eleni Diamanti

06/10/2021 → 02/2025

PHOQUSING - Photonics Quantum Sampling Machine - H2020

Randomness is a resource that enables applications such as efficient probabilistic algorithms, numerical integration, simulation, and optimization. In the last few years it was realized that quantum devices can generate probability distributions that are inaccessible with classical means. Hybrid Quantum Computational models combine classical processing with these quantum sampling machines to obtain computational advantage in some tasks. Moreover, NISQ (Noisy, Intermediate-Scale Quantum) technology may suffice to obtain this advantage in the near term, long before we can build large-scale, universal quantum computers.

PHOQUSING aims to implement PHOtonic Quantum SamplING machines based on large, reconfigurable interferometers with active feedback, and state-of-the-art photon sources based both on quantum dots and parametric down-conversion. We will overview the different architectures enabling the generation of these hard-to-sample distributions using integrated photonics, optimizing the designs and studying the tolerance to errors. We will build two quantum sampling machines with different technologies, as a way to do cross-checks while exploiting all advantages of each platform. These machines will establish a new state-of-the-art in photonic reconfigurability, system complexity, and integration. Finally, we plan to perform first, proof-of-principle demonstrations of Hybrid Quantum Computation applications in optimization, machine learning, and graph theory.

The PHOQUSING team includes long-term scientific collaborators who were among the first to demonstrate quantum photonic samplers; two of the leading European start-ups in the relevant quantum technologies; and theoretical experts in photonics and quantum information science. This project will help establish photonics as a leading new quantum computational technology in Europe, addressing the science-to-technology transition towards a new industrial sector with a large foreseeable economic impact.
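The classical hardness of photonic (boson) sampling rests on the fact that output probabilities are proportional to squared matrix permanents of submatrices of the interferometer's unitary, and the permanent is believed to be classically intractable. A minimal sketch of computing a permanent with Ryser's formula (an illustration of the underlying mathematics, not project code):

```python
from itertools import combinations

def permanent(matrix):
    """Matrix permanent via Ryser's inclusion-exclusion formula, O(2^n * n).
    Boson-sampling output probabilities scale as |perm|^2 of submatrices of
    the interferometer unitary, which is why the sampling task is believed
    to be classically hard."""
    n = len(matrix)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in matrix:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# Sanity check: the permanent of the all-ones 3x3 matrix is 3! = 6.
print(permanent([[1, 1, 1]] * 3))  # 6.0
```

Even this exponential-time formula is among the fastest known exact methods, which is the point: no efficient classical algorithm for the permanent is known.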

Project Leader : Elham Kashefi

09/01/2020 → 08/2024

QUSCO - Quantum Superiority with Coherent State

QUSCO (Quantum Superiority with Coherent State) is an ERC Starting research project conducted by Eleni Diamanti from 2018 to 2024. It aimed at developing and experimentally demonstrating a framework where current quantum technologies can be used to outperform their classical counterparts in the field of information communications.

Project Leader : Eleni Diamanti

01/01/2018 → 06/2024

https://cordis.europa.eu/project/id/758911/fr

QUANGO

The secure exchange of sensitive information is of crucial importance in our society, as it protects money transfer, commercial transactions, medical data, remote control of critical infrastructures (power grid, telecom etc.). On the security side, Quantum Key Distribution (QKD) is a protocol based on the laws of quantum mechanics that guarantees unconditional security in the sharing of random keys to be used for secure communication. On the communication side, 5G is the fifth generation of wireless communication, that will soon become the standard for virtually every major telecommunication service provider, as defined by the standards organization 3GPP (3rd Generation Partnership Project). Both QKD and 5G technologies require a satellite network for a global coverage of their services. QUANGO will provide a new paradigm to share the satellite infrastructure required for secure communication based on quantum technologies and for 5G communication.
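The basis-sifting step at the heart of a QKD protocol such as BB84 can be sketched as follows (a toy simulation with no quantum channel, eavesdropper, or error correction; the function name is illustrative):

```python
import random

def bb84_sifted_key(n_qubits, rng):
    """Alice encodes random bits in random bases; Bob measures in random
    bases; after public basis reconciliation they keep only the positions
    where the bases matched (the "sifted" key)."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0: rectilinear, 1: diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]
    # When bases match, Bob's measurement deterministically yields Alice's
    # bit; mismatched positions are discarded.
    return [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]

rng = random.Random(42)
key = bb84_sifted_key(16, rng)
print(len(key))  # roughly half of 16, since bases match with probability 1/2
```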

Project Leader : Eleni Diamanti

07/30/2020 → 12/2023

YAO MA's PhD thesis, QEnclave project - DIM SIRTEQ

CNRS acts as intermediary between the Région (funder) and Sorbonne Université for the funding of the PHD2 project "QEnclave". €100,000 allocated to hiring a doctoral student plus travel expenses. Management fees will be charged to the associated collaboration contract (C19/0811) concluded with the company Veriqloud.

Recruitment of Yao MA - S19JRCN003

Project Leader : Elham Kashefi

07/03/2019 → 02/2023

Team : SMA

Flow Machines

Flow Machines is an ERC Advanced research project conducted by François Pachet from 2012 to 2017, in partnership with Sony Europe. It aimed at developing cutting-edge algorithms for AI-assisted music composition. In 2014, it was granted a subsequent ERC Proof of Concept grant that led to the development of the Flow Machines software for public use.

Results:

Combinatorial Optimization and Music - I pioneered the following domains:

1.1. Fast Automatic Harmonization With my Ph.D. student Pierre Roy (now researcher at Sony), I developed a new approach to solving combinatorial problems on complex objects, leading to the fastest automatic harmonization system at the time (best technical paper award at the Conf. of the Brit. Comp. Society Group on Expert Systems; 1 patent).

1.2. Interactive Mixing With my Ph.D. student O. Delerue (IRCAM), I invented interactive mixing. We showed that constraint propagation techniques can faithfully represent mixing knowledge from sound engineers. “The demonstration of MusicSpace was the best Computer Music demo I had seen” said G. Assayag (IRCAM). MusicSpace is now exhibited at the Sony Science Museum in Tokyo (2 patents granted).

1.3. Playlist Generation I was the first to introduce automatic playlist generation, which became a subfield of MIR in its own right. To feed these systems I established links with Sony Music and produced the first reasoned ontology of musical genres (Pachet & Cazaly, 00), now widely cited.

1.4. Concatenative Synthesis With my student A. Zils, I contributed to audio synthesis by introducing the Musaicing method (Zils & Pachet, 01).

1.5. Controllable Sequence Generation With P. Roy I solved the difficult problem of Markov chain control by formulating Markov chains as a combinatorial optimization problem (Pachet & Roy, 11) (1 patent). This breakthrough will be further elaborated during the project.
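The idea behind Markov-chain control can be sketched as pruning, at each step, the successors from which the constraint can no longer be satisfied, then renormalizing before sampling. The toy walk below is a simplified illustration of that idea, not Pachet & Roy's actual constraint-satisfaction formulation; the chord names and probabilities are invented:

```python
import random

transitions = {  # toy chord-transition probabilities (illustrative)
    "C": {"F": 0.5, "G": 0.5},
    "F": {"C": 0.5, "G": 0.5},
    "G": {"C": 1.0},
}

def reachable(state, steps, target):
    """True if `target` is reachable from `state` in exactly `steps` steps."""
    frontier = {state}
    for _ in range(steps):
        frontier = {nxt for s in frontier for nxt in transitions[s]}
    return target in frontier

def constrained_walk(start, length, end, rng):
    """Random walk of `length` states that provably ends on `end`:
    at each step, prune successors from which `end` is unreachable in the
    remaining steps, then sample among the survivors by their weights."""
    seq = [start]
    for step in range(length - 1):
        remaining = length - 1 - (step + 1)
        options = [s for s in transitions[seq[-1]] if reachable(s, remaining, end)]
        weights = [transitions[seq[-1]][s] for s in options]
        seq.append(rng.choices(options, weights=weights)[0])
    return seq

rng = random.Random(0)
print(constrained_walk("C", 4, "C", rng))  # always ends on "C"
```

A plain Markov walk would sometimes end on the wrong chord; the pruning step guarantees the unary constraint holds while still sampling according to the (renormalized) transition weights.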

Project Leader : François Pachet

08/01/2012 → 07/2017

https://www.flow-machines.com