Abstract.
The decoding problem is fundamental in post-quantum cryptography. It can be broadly described as solving a linear system with a non-linear constraint on the solution. Phrased this way, the problem applies to both code-based and lattice-based cryptography. For example, the linear system may be defined over F_q, with the non-linear constraint being a condition on the Hamming weight of the solution (McEliece, BIKE, HQC, Wave, SDitH...) or even a condition requiring the entries of the solution to lie in a prescribed subgroup (CROSS). The system may also be defined over the extension field F_{q^m}, with the non-linear constraint being a condition on the rank, i.e. the dimension of the F_q-vector space spanned by the coordinates of the solution (Mirath, Ryde). In lattice-based cryptography, the system is generally defined over Z_q, with the non-linear constraint being a condition on the Euclidean length of the solution (Kyber, Falcon, Dilithium, Hawk...). The list above is far from exhaustive and extends to other areas such as Fully Homomorphic Encryption (FHE) or the blind code recognition problem (reverse engineering in telecommunications).
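To fix ideas, the two most common instantiations can be written schematically as follows; the notation (n, k, w, m, β) is generic and not tied to any particular scheme.

```latex
% Syndrome decoding in the Hamming metric: given a parity-check matrix H
% and a syndrome s, find a solution e of the linear system whose Hamming
% weight does not exceed a threshold w.
\[
  \mathbf{H}\,\mathbf{e}^{\top} = \mathbf{s}^{\top}
  \quad \text{with} \quad
  \mathbf{H} \in \mathbb{F}_q^{(n-k)\times n},\;
  \mathrm{wt}_H(\mathbf{e}) \le w .
\]
% LWE-style decoding in the Euclidean metric: given A and b, find (s, e)
% satisfying the system with e of small Euclidean norm.
\[
  \mathbf{A}\,\mathbf{s} + \mathbf{e} \equiv \mathbf{b} \pmod{q}
  \quad \text{with} \quad
  \mathbf{A} \in \mathbb{Z}_q^{m\times n},\;
  \lVert \mathbf{e} \rVert_2 \le \beta .
\]
```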
Combinatorial techniques for solving the decoding problem (in both codes and lattices) can be classified into two categories: primal attacks and dual attacks. Primal attacks aim to recover partial information about the solution, essentially a compressed description of it; exhaustive search can then be accelerated using techniques such as the birthday paradox (collision search). More recently, techniques based on nearest-neighbor search have emerged and have significantly sped up primal attacks. The nearest-neighbor problem consists in finding close pairs within a list, and the notion of closeness can be extended to match the non-linear constraint of the original decoding problem. This, however, requires adapting the known techniques to these different variants of the nearest-neighbor problem.
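As a concrete illustration of collision search, here is a minimal meet-in-the-middle sketch over F_2. The instance, the split of the support into two halves and the brute-force enumeration are deliberately naive and only meant to convey the birthday-paradox idea, not to reproduce any actual primal attack.

```python
# Toy meet-in-the-middle (birthday-style) search for a low-weight solution
# of H e^T = s over F_2: enumerate candidate left halves of the error,
# index them by their partial syndrome, then look for a collision while
# enumerating right halves.  Parameters and the planted instance are
# purely illustrative.
import itertools
import numpy as np

def mitm_decode(H, s, w):
    """Return e with Hamming weight <= w and H @ e == s (mod 2), or None."""
    n = H.shape[1]
    half = n // 2
    # Left halves, indexed by their partial syndrome.
    table = {}
    for wl in range(w + 1):
        for supp in itertools.combinations(range(half), wl):
            e_left = np.zeros(n, dtype=int)
            e_left[list(supp)] = 1
            table.setdefault(tuple(H @ e_left % 2), []).append(e_left)
    # Right halves: a collision on the syndrome yields a solution.
    for wr in range(w + 1):
        for supp in itertools.combinations(range(half, n), wr):
            e_right = np.zeros(n, dtype=int)
            e_right[list(supp)] = 1
            target = tuple((s - H @ e_right) % 2)
            for e_left in table.get(target, []):
                e = (e_left + e_right) % 2
                if e.sum() <= w:
                    return e
    return None

# Small random instance with a planted weight-4 error.
rng = np.random.default_rng(0)
H = rng.integers(0, 2, size=(10, 20))
e_true = np.zeros(20, dtype=int)
e_true[[1, 5, 12, 17]] = 1
s = H @ e_true % 2
print(mitm_decode(H, s, 4))
```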
It turns out that primal decoding techniques are often more effective when the system has few equations relative to the number of unknowns. A natural question is therefore: when the number of equations is close to the number of unknowns, can the problem be reduced to another decoding problem with fewer equations? One approach is to dualize the decoding problem: instead of searching for a solution in a given subspace, could we reduce the problem to finding a solution in the orthogonal/dual subspace? Some recent attacks, known as dual attacks, offer a first attempt at such a reduction. However, dual attacks remain controversial. In particular, the recent dual attack by Matzov, which claims to significantly lower the security level of Kyber, a lattice-based cryptosystem currently being standardized by NIST, has not been widely accepted: its analysis relies on a set of assumptions that, in certain scenarios, have been shown to contradict established theorems or well-tested heuristics. Nonetheless, we present new techniques for analyzing dual attacks. Specifically, we avoid the independence assumption made in Matzov's work, which allows us to reassess the complexity of dual attacks on Kyber and to show that its security levels fall below the NIST requirements.
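For context, the dual distinguisher at the heart of such attacks can be sketched as follows; the notation is generic and the score function F is a schematic rendering of the approach, not the exact statistic used by any specific attack.

```latex
% Given b = A s + e mod q and a set W of short vectors w satisfying
% A^T w = 0 mod q, each inner product <w, b> = <w, e> mod q is biased
% toward small values when e is short, and uniform when b is random.
% Dual attacks aggregate these biases into a score such as
\[
  F(\mathbf{b}) \;=\; \sum_{\mathbf{w} \in W}
    \cos\!\left( \frac{2\pi}{q}\,\langle \mathbf{w}, \mathbf{b} \rangle \right).
\]
% The delicate point is how the distribution of F is estimated under both
% hypotheses: treating the summands as independent is precisely the
% assumption that the analysis presented here avoids.
```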
Abstract.
Working at the intersection of data science and cognitive science, we formally characterize the rhetoric and representations mobilized in online comments about vaccination. The corpus (135,620 texts, 5,481,450 occurrences and 53,835 lexical forms) first undergoes a descending hierarchical classification (CDH). To interpret it, we rely on a sociocognitive processing model that characterizes the descriptive (informational value) versus evaluative (social value) orientation of the comments and the emergence of cognitive biases. We also use large language models (LLMs) to attempt to characterize the lexical classes and to identify conspiracy-related statements. In a second step, a more experimental approach analyzes the audio track of Hold-Up : Retour sur un chaos (2020), merges classes from the two CDHs, and performs a lexical correspondence analysis (AFC). We thereby identify four distinct argumentative forms, one of which opens the possibility of conspiracist radicalization.
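As a rough illustration of this kind of pipeline, the sketch below approximates a descending hierarchical classification by recursive bisection of a document-term matrix and follows it with a correspondence analysis of the resulting class/term table. The toy corpus, the clustering criterion and all parameters are hypothetical and much simpler than the method actually used on the vaccination corpus.

```python
# Illustrative stand-in for the pipeline: recursive bisection of a
# document-term matrix (a crude proxy for a descending hierarchical
# classification), then a correspondence analysis of the class x term
# contingency table via SVD of standardized residuals.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

docs = ["vaccines save lives", "the lab leak was hidden",
        "side effects are real", "they are lying to us"]  # toy corpus

X = CountVectorizer().fit_transform(docs).toarray()

def descending_classification(X, depth=2):
    """Recursively split the documents in two down to a fixed depth."""
    labels = np.zeros(X.shape[0], dtype=int)
    def split(idx, level, code):
        if level == depth or len(idx) < 2:
            labels[idx] = code
            return
        halves = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X[idx])
        split(idx[halves == 0], level + 1, 2 * code)
        split(idx[halves == 1], level + 1, 2 * code + 1)
    split(np.arange(X.shape[0]), 0, 1)
    return labels

labels = descending_classification(X)

# Correspondence analysis of the class x term table.
T = np.array([X[labels == c].sum(axis=0) for c in np.unique(labels)], dtype=float)
P = T / T.sum()
r, c = P.sum(axis=1, keepdims=True), P.sum(axis=0, keepdims=True)
S = (P - r @ c) / np.sqrt(r @ c)          # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = (U * sv) / np.sqrt(r)        # class coordinates on the factorial axes
print(labels)
print(row_coords[:, :2])
```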
Abstract.
Physical layer security aims to exploit the randomness of noisy channels in order to enhance security through coding and signal processing techniques. Unlike cryptography, it does not place any limitations on the adversary's computational power, but relies on an asymmetry in the channel quality between the legitimate users and the adversary. In this talk, we focus on the wiretap channel model, where a legitimate transmitter and receiver communicate in the presence of an eavesdropper who observes a degraded version of the receiver's outputs. For this model, secrecy can be measured in terms of mutual information leakage, or alternatively in terms of the average variational distance between output distributions corresponding to different confidential messages.
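In generic notation (M the confidential message drawn from a set of messages, Z^n the eavesdropper's observation over n channel uses), one common way to formalize these two secrecy metrics is the following.

```latex
% Mutual information leakage to the eavesdropper:
\[
  L_{\mathrm{info}} \;=\; I(M ; Z^{n}).
\]
% Average variational (total variation) distance between the eavesdropper's
% output distributions associated with pairs of confidential messages:
\[
  L_{\mathrm{var}} \;=\; \frac{1}{\lvert \mathcal{M} \rvert^{2}}
    \sum_{m,\, m' \in \mathcal{M}}
    \bigl\lVert P_{Z^{n} \mid M = m} - P_{Z^{n} \mid M = m'} \bigr\rVert_{\mathrm{TV}} .
\]
```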
Motivated by IoT applications that require short packets or low latency, we focus on the performance of wiretap codes in the finite blocklength regime. We consider a simple channel model where the main channel is noiseless and the eavesdropper's channel is a binary erasure channel, and we provide lower bounds on the achievable secrecy rates of polar and Reed-Muller codes. We show that, under a total variation secrecy metric, Reed-Muller codes can achieve secrecy rates very close to the optimal second-order coding rates.
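For intuition, the sketch below implements the textbook coset-coding scheme for this channel model: the secret message is the syndrome of the transmitted word, the main channel delivers the word intact, and the eavesdropper sees it through a binary erasure channel. The parity-check matrix and parameters are small illustrative choices, unrelated to the polar and Reed-Muller codes analyzed in the talk.

```python
# Toy coset (syndrome) coding for a noiseless main channel with a BEC
# eavesdropper: the message selects a coset of a linear code, the
# transmitted word is a random element of that coset, and the eavesdropper
# only observes an erased version of it.
import numpy as np

rng = np.random.default_rng(1)

H = np.array([[1, 0, 1, 1, 0, 1],
              [0, 1, 1, 0, 1, 1]])          # illustrative parity-check matrix
n, k_sec = H.shape[1], H.shape[0]           # blocklength, number of secret bits

def encode(message):
    """Pick a uniformly random word x with H x^T = message (mod 2)."""
    while True:
        x = rng.integers(0, 2, size=n)
        if np.array_equal(H @ x % 2, message):
            return x

def decode(x):
    """Main channel is noiseless: the receiver simply recomputes the syndrome."""
    return H @ x % 2

def eavesdrop(x, eps):
    """Binary erasure channel: each bit is erased (None) with probability eps."""
    return [None if rng.random() < eps else int(b) for b in x]

message = rng.integers(0, 2, size=k_sec)
x = encode(message)
assert np.array_equal(decode(x), message)
print("message:", message, "codeword:", x, "eavesdropper sees:", eavesdrop(x, 0.5))
```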
Abstract.
NIST recently released a publication on the transition to post-quantum cryptography which specifies that most classical public-key cryptosystems, in particular RSA, will be officially deprecated by 2030 and banned after 2035. In this talk, I will review the limits of the main cryptanalytic attacks on RSA and present two new variants of RSA based on the cubic Pell curve.
Abstract.
Cybercriminals, whether fraudsters or perpetrators of online intimidation and threats, constantly adapt their methods to technological change. Virtual environments, or "metaverses", illustrate this dynamic particularly well by opening the way to new forms of scams and malicious behavior. These digital spaces, which combine social interaction with dematerialized transactions, raise unprecedented challenges for security and investigation.
This presentation analyzes the issues raised by virtual environments, with an emphasis on exploiting digital traces to identify and counter these criminal activities. We will explore the opportunities offered by analyzing data from virtual reality headsets and connected devices, as well as data available from service providers, third-party applications, and cryptocurrency platforms. Concrete cases will illustrate the mechanisms underlying online fraud and intimidation, while highlighting methodological approaches suited to digital forensics and cybersecurity professionals.