
Publications

The publications of our faculty researchers are on the HAL platform:

The doctoral theses of LTCI PhD graduates are on the HAL platform:

Browse the publications in the HAL open archive by year:

2025

  • Efficient thermalization and universal quantum computing with quantum Gibbs samplers
    • Rouzé Cambyse
    • Stilck Franca Daniel
    • Alhambra Álvaro
    , 2024. The preparation of thermal states of matter is a crucial task in quantum simulation. In this work, we prove that a recently introduced, efficiently implementable dissipative evolution thermalizes to the Gibbs state in time scaling polynomially with system size at high enough temperatures for any Hamiltonian that satisfies a Lieb-Robinson bound, such as local Hamiltonians on a lattice. Furthermore, we show the efficient adiabatic preparation of the associated purifications or "thermofield double" states. To the best of our knowledge, these are the first results rigorously establishing the efficient preparation of high-temperature Gibbs states and their purifications. In the low-temperature regime, we show that implementing this family of dissipative evolutions for inverse temperatures polynomial in the system's size is computationally equivalent to standard quantum computations. On a technical level, for high temperatures, our proof makes use of the mapping of the generator of the evolution into a Hamiltonian, and then connecting its convergence to that of the infinite temperature limit. For low temperature, we instead perform a perturbation at zero temperature and resort to circuit-to-Hamiltonian mappings akin to the proof of universality of quantum adiabatic computing. Taken together, our results show that a family of quasi-local dissipative evolutions efficiently prepares a large class of quantum many-body states of interest, and has the potential to mirror the success of classical Monte Carlo methods for quantum many-body systems.
  • Extending InSAR2InSAR to Sentinel-1 Data
    • Geara Carla
    • Gelas Colette
    • De Vitry Louis
    • Colin Elise
    • Tupin Florence
    IEEE Geoscience and Remote Sensing Letters, IEEE - Institute of Electrical and Electronics Engineers, 2025. Interferometric SAR parameter estimation is an important and challenging problem. The previously proposed InSAR2InSAR method is one of the few self-supervised methods that aims to estimate InSAR parameters. It has been shown to outperform state-of-the-art methods on simulated synthetic data, but it remained to be extended to real data. In this letter, we demonstrate that Sentinel-1 images acquired in the Interferometric Wide Swath mode possess the necessary properties to train and apply InSAR2InSAR effectively, and that InSAR2InSAR can process across-track Sentinel-1 interferometric images with state-of-the-art performance.
  • Small Yet Configurable: Unveiling Null Variability in Software
    • Tërnava Xhevahire
    • Randrianaina Georges Aaron
    • Lesoil Luc
    • Acher Mathieu
    , 2025. Many small-scale software systems, that is, with limited codebase or binary size, are widely used in everyday tasks, yet their configurability remains largely unexplored. At the same time, studies on modern software systems show a trend toward increasing configurability, alongside growing interest in building immutable, specialized, and reproducible software. In this paper, we present the first empirical study on the extent of configurability in small-scale software systems. By analyzing 108 programs from GNU coreutils, we show that even small programs can exhibit significant compile-time and run-time variability, with up to 76 options per program. Moreover, there is a high correlation (0.78) between run-time variability and codebase size. Furthermore, an analysis of the 20 smallest programs across 85 releases reveals that variability tends to increase over time, primarily due to added compile-time variability. This suggests that shifting options between run-time and compile-time, removing unnecessary run-time variability, or resolving compile-time variability early, can help reduce codebase complexity and size. We also introduce, for the first time, the concept of a null-variable software system, one with no configurability beyond mandatory features. Our findings show that high configurability is not exclusive to large-scale systems and that reducing unnecessary variability can lead to lightweight, smaller, and more maintainable software. We hope this effort contributes to designing new software by understanding how to balance its configurability with codebase size.
  • Réélaboration des règles de sécurité en contextes interpersonnels : le cas du toucher en temps de Covid‑19
    • Héron Robin
    • Safin Stéphane
    • Baker Michael J
    • Zhang Zhuoming
    • Alvina Jessalyn
    • Lecolinet Éric
    • Détienne Françoise
    Activités, Association Recherches et Pratiques sur les ACTivités, 2025, 22-1. In this article, we study how the health and safety rules related to social touch, established during the pandemic, were re-elaborated in interpersonal interactions. Within the adaptive safety framework, rules are viewed as resources adapted according to context. To better understand the processes of re-elaborating safety rules in interpersonal contexts, we conducted a study comprising an online questionnaire on social touch habits before and after lockdown in Europe, followed by in-depth interviews with a selection of participants living in France. Our results highlight (1) reduced touch practices, particularly in semi-intimate relationships such as colleagues, friends, close friends, and extended family; (2) two processes of re-elaboration of pandemic-related safety rules, explicit deliberation and behavioral alignment, together with reflexive processes; and (3) two dimensions of the justifications given for re-elaborating these rules, preserving physical health / fear of vulnerability and maintaining social relationships, as well as their relative weight depending on the relationship between interactants (family versus friends). While the recommended health and safety rules were largely followed, leading to a sharp decrease in touch behaviors, it appears that people re-elaborate the rules according to the relationship they maintain. By re-elaborating the official health and safety rules, people strike a balance between prescribed health-safety measures and specific interaction situations in order to preserve their health and/or the relationship. (10.4000/13ra9)
    DOI : 10.4000/13ra9
  • Functional analysis of multivariate max-stable distributions
    • Costacèque-Cecchi Bruno
    • Decreusefond Laurent
    , 2025. We study the connections existing between max-infinitely divisible distributions and Poisson processes from the point of view of functional analysis. More precisely, we derive functional identities for the former by using well-known results of Poisson stochastic analysis. We also introduce a family of Markov semigroups whose stationary measures are the so-called multivariate max-stable distributions. Their generators thus provide a functional characterization of extreme valued distributions in any dimension. Additionally, we give a few functional identities associated to those semigroups, namely a Poincaré identity and commutation relations. Finally, we present a stochastic process whose semigroup corresponds to the one we introduced and that can be expressed using extremal stochastic integrals.
  • Invertibility of functionals of the Poisson process and applications
    • Coutin Laure
    • Decreusefond Laurent
    The Annals of Probability, Institute of Mathematical Statistics, 2025, 53 (5). Following previous investigations by Üstünel [22] about the invertibility of some transformations on the Wiener space, we find some entropic conditions under which a random change of time is invertible on the Poisson space. As a consequence, we provide a new construction of Hawkes processes. We also establish a new variational representation of the entropy. (10.1214/24-AOP1748)
    DOI : 10.1214/24-AOP1748
  • Joint despeckling and thermal noise compensation: application to Sentinel-1 images of the Arctic
    • Meraoumia Inès
    • Ratha Debanshu
    • Dalsasso Emanuele
    • Lohse Johannes
    • Tupin Florence
    • Marinoni Andrea
    • Denis Loïc
    IEEE Transactions on Geoscience and Remote Sensing, Institute of Electrical and Electronics Engineers, 2025, 63, pp.1-12. Synthetic Aperture Radar (SAR) images offer crucial information for studying and monitoring sea ice in the Arctic. Sentinel-1 captures images of the area using an extremely wide swath for reduced revisit time. The backscattered signal from sea ice and open water is often very weak, making it difficult to distinguish from the sensor thermal noise floor. Thermal noise impacts the images by generating a bias and increasing the fluctuations related to speckle phenomenon. Analyzing these images requires both correcting this bias and reducing fluctuations without blurring out the image content. The acquisition of several sub-swaths in a single pass using Terrain Observation with Progressive Scans (TOPS) produces images that exhibit, after compensation for antenna gains, a non-uniform thermal noise floor and strong discontinuities between sub-swaths. Denoising techniques must take these specificities into account to restore the images. This paper introduces a joint approach to remove the thermal noise offset and suppress fluctuations due to speckle and thermal noise. Compensating at once for all these effects largely reduces artifacts at the boundary between sub-swaths. We demonstrate using both numerical simulations and actual Sentinel-1 images that debiased polarimetric reflectivities can be recovered and fluctuations strongly reduced while preserving fine spatial structures. (10.1109/TGRS.2025.3610502)
    DOI : 10.1109/TGRS.2025.3610502
  • Learning and certification of local time-dependent quantum dynamics and noise
    • França Daniel Stilck
    • Möbus Tim
    • Rouzé Cambyse
    • Werner Albert
    , 2025. Hamiltonian learning protocols are essential tools to benchmark quantum computers and simulators. Yet rigorous methods for time-dependent Hamiltonians and Lindbladians remain scarce despite their wide use. We close this gap by learning the time-dependent evolution of a locally interacting $n$-qubit system on a graph of effective dimension $D$ using only preparation of product Pauli eigenstates, evolution under the time-dependent generator for given times, and measurements in product Pauli bases. We assume the time-dependent parameters are well approximated by functions in a known space of dimension $m$ admitting stable interpolation, e.g. by polynomials. Our protocol outputs functions approximating these coefficients to accuracy $ε$ on an interval with success probability $1-δ$, requiring only $O\big(ε^{-2}poly(m)\log(nδ^{-1})\big)$ samples and $poly(n,m)$ pre/postprocessing. Importantly, the scaling in $m$ is polynomial, whereas naive extensions of previous methods scale exponentially. The method estimates time derivatives of observable expectations via interpolation, yielding well-conditioned linear systems for the generator's coefficients. The main difficulty in the time-dependent setting is to evaluate these coefficients at finite times while preserving a controlled link between derivatives and dynamical parameters. Our innovation is to combine Lieb-Robinson bounds, process shadows, and semidefinite programs to recover the coefficients efficiently at constant times. Along the way, we extend state-of-the-art Lieb-Robinson bounds on general graphs to time-dependent, dissipative dynamics, a contribution of independent interest. These results provide a scalable tool to verify state-preparation procedures (e.g. adiabatic protocols) and characterize time-dependent noise in quantum devices.
  • Certifying and learning quantum Ising Hamiltonians
    • Bluhm Andreas
    • Caro Matthias
    • Gutiérrez Francisco Escudero
    • Oufkir Aadil
    • Rouzé Cambyse
    , 2025. In this work, we study the problems of certifying and learning quantum Ising Hamiltonians. Our main contributions are as follows: Certification of Ising Hamiltonians. We show that certifying an Ising Hamiltonian in normalized Frobenius norm via access to its time-evolution operator requires only $\widetilde O(1/\varepsilon)$ time evolution. This matches the Heisenberg-scaling lower bound of $Ω(1/\varepsilon)$ up to logarithmic factors. To our knowledge, this is the first nearly-optimal algorithm for testing a Hamiltonian property. A key ingredient in our analysis is the Bonami Lemma from Fourier analysis. Learning Ising Gibbs states. We design an algorithm for learning Ising Gibbs states in trace norm that is sample-efficient in all parameters. In contrast, previous approaches learned the underlying Hamiltonian (which implies learning the Gibbs state) but suffered from exponential sample complexity in the inverse temperature. Certification of Ising Gibbs states. We give an algorithm for certifying Ising Gibbs states in trace norm that is both sample and time-efficient, thereby solving a question posed by Anshu (Harvard Data Science Review, 2022). Finally, we extend our results on learning and certification of Gibbs states to general $k$-local Hamiltonians for any constant $k$.
  • Integrating Multi-Level Mixed-Criticality into MCTS for Robust Resource Management
    • Cordeiro Franco
    • Tardieu Samuel
    • Pautet Laurent
    Leibniz Transactions on Embedded Systems, 2025, 2 (10), pp.1-23. Managing actions with uncertain resource costs is a complex challenge, particularly in autonomous robot mission planning. Robots are often assigned multiple objectives with varying criticality levels, ranging from catastrophic to minor impacts, where failures can significantly affect system safety. Uncertainties in worst-case costs of resources, such as energy and operating time - the time it takes to carry out an action - further complicate mission planning and execution. Monte Carlo Tree Search (MCTS) is a powerful tool for online planning, yet it struggles to account for uncertainty in worst-case cost estimations. Optimistic estimates risk resource shortages, while pessimistic ones lead to inefficient allocation. The Mixed-Criticality (MC) approach, originally developed for real-time systems to schedule critical tasks by allocating processing resources under Worst-Case Execution Time (WCET) uncertainty, provides a framework of rules, models and design principles. We claim this framework can be adapted to autonomous robot mission planning, where critical objectives are met through analogous allocation of different kinds of resources such as energy and operating time despite uncertainties. We propose enhancing MCTS with MC principles to handle uncertainty in worst-case costs across multiple resources and criticality of objectives. High-critical objectives must always be completed, regardless of resource constraints, while low-critical objectives operate flexibly, consuming resources within optimistic estimates when possible or being discarded when resources become scarce. This ensures efficient resource reallocation and prioritization of high-critical objectives. To implement this, we present (MC)²TS, a novel variant of MCTS that integrates MC principles for dynamic resource management. It supports more than two criticality levels to ensure that the most critical components meet the most stringent safety and reliability requirements, while also enabling robust resource management. By enabling replanning and mode changes, (MC)²TS improves MCTS’s efficiency and enhances MC systems’ adaptability to both degrading and improving resource conditions. We evaluate (MC)²TS in an active perception scenario, where a drone retrieves data from distributed sensors under unpredictable environmental conditions. (MC)²TS outperforms MCTS by achieving more objectives, adapting plans when costs drop. It explores more objective sequences, minimizes oversizing, and enhances efficiency. Balancing safety and performance, it monitors robot battery, mission and objective resource constraints such as deadlines. Its robustness ensures low-critical objectives do not compromise high-critical objectives, making it a reliable solution for complex systems characterized by uncertain resource costs and critical objectives. (10.4230/LITES.10.2.1)
    DOI : 10.4230/LITES.10.2.1
  • Optimal quantum algorithm for Gibbs state preparation
    • Rouzé Cambyse
    • Stilck Franca Daniel
    • Alhambra Alvaro
    , 2024. It is of great interest to understand the thermalization of open quantum many-body systems, and how quantum computers are able to efficiently simulate that process. A recently introduced dissipative evolution, inspired by existing models of open system thermalization, has been shown to be efficiently implementable on a quantum computer. Here, we prove that, at high enough temperatures, this evolution reaches the Gibbs state in time scaling logarithmically with system size. The result holds for Hamiltonians that satisfy the Lieb-Robinson bound, such as local Hamiltonians on a lattice, and includes long-range systems. To the best of our knowledge, these are the first results rigorously establishing the rapid mixing property of high-temperature quantum Gibbs samplers, which is known to give the fastest possible speed for thermalization in the many-body setting. We then apply our result to the problem of estimating partition functions at high temperature, showing an improved performance over previous classical and quantum algorithms.
  • University Rents Enabling Corporate Innovation: Mapping Academic Researcher Coding and Discursive Labour in the R Language Ecosystem
    • Cai Xiaolan
    • O'Neil Mathieu
    • Zacchiroli Stefano
    Journal of Quantitative Description: Digital Media, University of Zurich, 2025, 5. This article explores the role of unrecognised labour in corporate innovation systems via an analysis of researcher coding and discursive contributions to R, one of the largest statistical software ecosystems. Studies of online platforms typically focus on how platform affordances constrain participants' actions, and profit from their labour. We innovate by connecting the labour performed inside digital platforms to the professional employment of participants. Our case study analyses 8,924 R package repositories on GitHub, examining commits and communications. Our quantitative findings show that researchers, alongside non-affiliated contributors, are the most frequent owners of R package repositories and their most active contributors. Researchers are more likely to hold official roles compared to the average, and to engage in collaborative problem-solving and support work during package development. This means there is, underneath the 'recognised' category of star researchers who transition between academia and industry and secure generous funding, an 'unrecognised' category of researchers who not only create and maintain key statistical infrastructure, but also provide support to industry employees, for no remuneration. Our qualitative findings show how this unrecognised labour affects practitioners. Finally, our analysis of the ideology and practice of free, libre and open source software (FLOSS) shows how this ideology and practice legitimate the use of 'university rents' by Big Tech. (10.51685/jqd.2025.025)
    DOI : 10.51685/jqd.2025.025
  • Long-time asymptotics of noisy SVGD outside the population limit
    • Priser Victor
    • Bianchi Pascal
    • Salim Adil
    ICLR International Conference on Learning Representations, 2025. Stein Variational Gradient Descent (SVGD) is a widely used sampling algorithm that has been successfully applied in several areas of Machine Learning. SVGD operates by iteratively moving a set of interacting particles (which represent the samples) to approximate the target distribution. Despite recent studies on the complexity of SVGD and its variants, their long-time asymptotic behavior (i.e., after numerous iterations) is still not understood in the finite-number-of-particles regime. We study the long-time asymptotic behavior of a noisy variant of SVGD. First, we establish that the limit set of noisy SVGD after a large number of iterations is well-defined. We then characterize this limit set, showing that it approaches the target distribution as the number of particles increases. In particular, noisy SVGD provably avoids the variance collapse observed for SVGD. Our approach involves demonstrating that the trajectories of noisy SVGD closely resemble those described by a McKean-Vlasov process.
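To make the setting concrete, here is a minimal sketch of a noisy SVGD iteration with an RBF kernel, targeting a standard Gaussian. This is an illustrative toy, not the paper's exact scheme or hyperparameters; the kernel bandwidth, step size, and noise level are arbitrary choices.

```python
import numpy as np

def rbf_updates(x, h=1.0):
    # k[i, j] = exp(-|x_i - x_j|^2 / (2h)); rep[i] = sum_j grad_{x_j} k(x_j, x_i)
    d = x[:, None, :] - x[None, :, :]
    k = np.exp(-(d ** 2).sum(-1) / (2 * h))
    rep = (d / h * k[:, :, None]).sum(axis=1)    # repulsive term, keeps particles spread
    return k, rep

def noisy_svgd_step(x, grad_logp, rng, step=0.1, noise=0.01):
    n = x.shape[0]
    k, rep = rbf_updates(x)
    phi = (k @ grad_logp(x) + rep) / n           # SVGD transport direction
    # "noisy" variant: add a small Gaussian perturbation on top of the SVGD update
    return x + step * phi + np.sqrt(2 * step * noise) * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
x = 3.0 * rng.standard_normal((50, 2))           # particles start over-dispersed
for _ in range(500):
    x = noisy_svgd_step(x, lambda z: -z, rng)    # target N(0, I): grad log p(z) = -z
```

After the loop the particle cloud has contracted toward the target while the injected noise prevents the variance collapse mentioned in the abstract.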
  • Mathematical Foundations for Side-Channel Analysis of Cryptographic Systems
    • Cheng Wei
    • Guilley Sylvain
    • Rioul Olivier
    , 2025, pp.X-411. This book offers the reader a formalization, characterization and quantification of the real threat level posed by side-channel leaks from devices implementing cryptography. It exploits the best mathematical tools for quantifying information leakage and characterizing leakage-based attacks. The two possible approaches are described in detail. This includes the optimal attack strategy that can be derived (in specific contexts) or generic bounds regarding data complexity that can be computed. The tone of this book is essentially mathematical. It aims to establish formal foundations for techniques that are otherwise used as engineering recipes in industrial laboratories or empirical intuitions for deriving security levels from practical implementations. It is a systematization of knowledge and a compilation of relevant tools relating to the practice of side-channel analysis on embedded systems. This book provides an up-to-date and improved analysis and understanding of embedded devices that conceal secrets that can be extracted by an attacker. Typical attacks involve measuring the device's power consumption or radiated electromagnetic field. These measurements are a source of noisy information correlated with the secrets, enabling those secrets to be retrieved. In some cases, the attacker can purchase a blank device from the same series and learn about its leakage, particularly how it relates to the secrets. This book also covers how such information can enhance hardware attacks deployed on another device. Researchers and engineers working in the field of side-channel security for embedded systems and related countermeasures, as well as hardware and software engineers focused on implementing cryptographic functionalities, will want to purchase this book as a reference. Advanced-level students majoring in computer science and electrical engineering will find this book valuable as a secondary textbook. (10.1007/978-3-031-64399-6)
    DOI : 10.1007/978-3-031-64399-6
  • On the complexity of sabotage games for network security
    • Raju Dhananjay
    • Bakirtzis Georgios
    • Topcu Ufuk
    IEEE Transactions on Networking, IEEE, 2025, 34, pp.2897-2910. (10.1109/TON.2025.3628015)
    DOI : 10.1109/TON.2025.3628015
  • White matter hyperintensities and their role in major depressive episodes: a cross-sectional study in adults under 65
    • Baudouin Édouard
    • Corruble Emmanuelle
    • Gori Pietro
    • Bloch Isabelle
    • Becquemont Laurent
    • Duron Emmanuelle
    • Colle Romain
    Brazilian Journal of Psychiatry, Brazilian Psychiatric Association, 2025.
  • Why honor heroes? The emergence of extreme altruistic behavior as a by-product of praisers' self-promotion
    • Dessalles Jean-Louis
    Evolution and Human Behavior, Elsevier, 2025, 46 (1), pp.106656. Heroes are people who perform costly altruistic acts. Few people turn out to be heroes, but many spontaneously honor heroes by commenting, applauding, or enthusiastically celebrating their deeds. The existence of a praising audience leads individuals to compete to attract the crowd's admiration. The outcome is a winner-take-all situation in which only one or a few individuals engage in extreme altruistic behavior. The more difficult part is to explain the crowd's propensity to pay tribute from an individual fitness optimization perspective. The model proposed here shows how heroic behavior and its celebration by a large audience may emerge together. This situation is possible if admirers use public praise as a social signal to promote their own commitment to the values displayed by the hero. (10.1016/j.evolhumbehav.2025.106656)
    DOI : 10.1016/j.evolhumbehav.2025.106656
  • Resilience for Regular Path Queries: Towards a Complexity Classification
    • Amarilli Antoine
    • Gatterbauer Wolfgang
    • Makhija Neha
    • Monet Mikaël
    Proceedings of the ACM on Management of Data, ACM, 2025, 3 (2), pp.1-18. The resilience problem for a query and an input set or bag database is to compute the minimum number of facts to remove from the database to make the query false. In this paper, we study how to compute the resilience of Regular Path Queries (RPQs) over graph databases. Our goal is to characterize the regular languages $L$ for which it is tractable to compute the resilience of the existentially-quantified RPQ built from $L$. We show that computing the resilience in this sense is tractable (even in combined complexity) for all RPQs defined from so-called local languages. By contrast, we show hardness in data complexity for RPQs defined from the following language classes (after reducing the languages to eliminate redundant words): all finite languages featuring a word containing a repeated letter, and all languages featuring a specific kind of counterexample to being local (which we call four-legged languages). The latter include in particular all languages that are not star-free. Our results also imply hardness for all non-local languages with a so-called neutral letter. We also highlight some remaining obstacles towards a full dichotomy. In particular, for the RPQ $abc|be$, resilience is tractable but the only PTIME algorithm that we know uses submodular function optimization. (10.1145/3725245)
    DOI : 10.1145/3725245
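As a toy illustration of the resilience problem itself (not of the paper's tractability results or algorithms), the minimum can be brute-forced on a tiny labeled graph for the single-word RPQ $ab$. The graph instance below is invented for illustration; the exhaustive search is exponential and only meant to make the definition concrete.

```python
from itertools import combinations, product

def query_holds(edges):
    # Boolean RPQ for the language {ab}: is there a path u -a-> v -b-> w?
    return any(l1 == "a" and l2 == "b" and v1 == u2
               for (u1, l1, v1), (u2, l2, v2) in product(edges, edges))

def resilience(edges):
    # minimum number of edge facts to delete so the query becomes false
    if not query_holds(edges):
        return 0
    for k in range(1, len(edges) + 1):
        for removed in combinations(edges, k):
            kept = [e for e in edges if e not in removed]
            if not query_holds(kept):
                return k
    return len(edges)

g = [(1, "a", 2), (2, "b", 3), (4, "a", 2), (2, "b", 5)]
print(resilience(g))  # every a-then-b path goes through node 2
```

On this instance the minimum is 2 (for example, deleting both $a$-labeled edges): no single edge lies on all four $ab$-paths.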
  • Computational aspects of the trace norm contraction coefficient
    • Delsol Idris
    • Fawzi Omar
    • Kochanowski Jan
    • Ramachandran Akshay
    , 2025. We show that approximating the trace norm contraction coefficient of a quantum channel within a constant factor is NP-hard. Equivalently, this shows that determining the optimal success probability for encoding a bit in a quantum system undergoing noise is NP-hard. This contrasts with the classical analogue of this problem, which can clearly be solved efficiently. We also establish the NP-hardness of deciding if the contraction coefficient is equal to 1, i.e., whether the channel can perfectly preserve a bit. As a consequence, deciding if a non-commutative graph has an independence number of at least 2 is NP-hard. In addition, we establish a converging hierarchy of semidefinite programming upper bounds on the contraction coefficient.
  • Additivity and chain rules for quantum entropies via multi-index Schatten norms
    • Fawzi Omar
    • Kochanowski Jan
    • Rouzé Cambyse
    • van Himbeeck Thomas
    , 2025. The primary entropic measures for quantum states are additive under the tensor product. In the analysis of quantum information processing tasks, the minimum entropy of a set of states, e.g., the minimum output entropy of a channel, often plays a crucial role. A fundamental question in quantum information and cryptography is whether the minimum output entropy remains additive under the tensor product of channels. Here, we establish a general additivity statement for the optimized sandwiched Rényi entropy of quantum channels. For that, we generalize the results of [Devetak, Junge, King, Ruskai, CMP 2006] to multi-index Schatten norms. As an application, we strengthen the additivity statement of [Van Himbeeck and Brown, 2025] thus allowing the analysis of time-adaptive quantum cryptographic protocols. In addition, we establish chain rules for Rényi conditional entropies that are similar to the ones used for the generalized entropy accumulation theorem of [Metger, Fawzi, Sutter, Renner, CMP 2024].
  • Device Independent Quantum Key Activation
    • Ulu Bora
    • Brunner Nicolas
    • Weilenmann Mirjam
    Physical Review Letters, American Physical Society, 2025, 135 (19), pp.190801. Device-independent quantum key distribution (DIQKD) allows two distant parties to establish a secret key, based only on the observed Bell nonlocal distribution. It remains unclear, however, what the minimal resources for enabling DIQKD are and how to maximize the key rate from a given distribution. In the present work, we consider a scenario where several copies of a given quantum distribution are jointly processed via a local and classical wiring operation. We find that, under few assumptions, it is possible to activate device-independent key. That is, starting from a distribution that is useless in a DIQKD protocol, we obtain a positive key rate by wiring several copies together. We coin this effect device-independent key activation. Our analysis focuses on the standard DIQKD protocol with one-way post-processing, and we resort to semi-definite programming techniques for computing lower bounds on the key rate. (10.1103/f8jc-q1kg)
    DOI : 10.1103/f8jc-q1kg
  • Routing Quantum Control of Causal Order
    • Grothus Maarten
    • Abbott Alastair A.
    • Vanrietvelde Augustin
    • Branciard Cyril
    , 2025. In recent years, various frameworks have been proposed for the study of quantum processes with indefinite causal order. In particular, quantum circuits with quantum control of causal order (QC-QCs) form a broad class of physical supermaps obtained from a bottom-up construction and are believed to represent all quantum processes physically realisable in a fixed spacetime. Complementarily, the formalism of routed quantum circuits introduces quantum operations constrained by "routes" to represent processes in terms of a more fine-grained routed circuit decomposition. This decomposition, formalised using a so-called routed graph, represents the information flow within the respective process. However, the existence of routed circuit decompositions has only been established for a small set of processes so far, including both certain specific QC-QCs and more exotic processes as examples. In this work, we remedy this fact by connecting these two frameworks. We prove that for any given $N$, one can use a single routed graph to systematically obtain a routed circuit decomposition for any QC-QC with $N$ parties. We detail this construction explicitly and contrast it with other routed circuit decompositions of QC-QCs, which we obtain from alternative routed graphs. We conclude by pointing out how this connection can be useful to tackle various open problems in the field of indefinite causal order, particularly establishing circuit representations of subclasses of QC-QCs.
  • Complexity of mixed Schatten norms of quantum maps
    • Kochanowski Jan
    • Fawzi Omar
    • Rouzé Cambyse
    , 2025. We study the complexity of computing the mixed Schatten $\|Φ\|_{q\to p}$ norms of linear maps $Φ$ between matrix spaces. When $Φ$ is completely positive, we show that $\| Φ\|_{q \to p}$ can be computed efficiently when $q \geq p$. The regime $q \geq p$ is known as the non-hypercontractive regime and is also known to be easy for the mixed vector norms $\ell_{q} \to \ell_{p}$ [Boyd, 1974]. However, even for entanglement-breaking completely-positive trace-preserving maps $Φ$, we show that computing $\| Φ\|_{1 \to p}$ is $\mathsf{NP}$-complete when $p>1$. Moving beyond the completely-positive case and considering $Φ$ to be a difference of entanglement-breaking completely-positive trace-preserving maps, we prove that computing $\| Φ\|^+_{1 \to 1}$ is $\mathsf{NP}$-complete. In contrast, for the completely-bounded (cb) case, we describe a polynomial-time algorithm to compute $\|Φ\|_{cb,1\to p}$ and $\|Φ\|^+_{cb,1\to p}$ for any linear map $Φ$ and $p\geq1$.
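For reference, the standard definitions behind these quantities (stated here in their usual textbook form, which may differ in conventions from the paper) are the Schatten $p$-norm and the induced mixed norm of a map:

```latex
\|X\|_p = \left(\operatorname{Tr}|X|^p\right)^{1/p},
\qquad
\|\Phi\|_{q \to p} = \sup_{X \neq 0} \frac{\|\Phi(X)\|_p}{\|X\|_q},
```

where, commonly, the superscript $+$ in $\|\Phi\|^+_{q\to p}$ denotes the same supremum restricted to positive semidefinite inputs $X$.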
  • Self-Supervision Enhances Instance-based Multiple Instance Learning Methods in Digital Pathology: A Benchmark Study
    • Mammadov Ali
    • Le Folgoc Loic
    • Adam Julien
    • Buronfosse Anne
    • Hayem Gilles
    • Hocquet Guillaume
    • Gori Pietro
    Journal of Medical Imaging, SPIE Digital Library, 2025. Multiple Instance Learning (MIL) has emerged as the best solution for Whole Slide Image (WSI) classification. It consists of dividing each slide into patches, which are treated as a bag of instances labeled with a global label. MIL includes two main approaches: instance-based and embedding-based. In the former, each patch is classified independently, and then the patch scores are aggregated to predict the bag label. In the latter, bag classification is performed after aggregating patch embeddings. Even if instance-based methods are naturally more interpretable, embedding-based MILs have usually been preferred in the past due to their robustness to poor feature extractors. However, recently, the quality of feature embeddings has drastically increased using self-supervised learning (SSL). Nevertheless, many authors continue to endorse the superiority of embedding-based MIL. To investigate this further, we conduct 710 experiments across 4 datasets, comparing 10 MIL strategies, 6 self-supervised methods with 4 backbones, 4 foundation models, and various pathology-adapted techniques. Furthermore, we introduce 4 instance-based MIL methods never used before in the pathology domain. Through these extensive experiments, we show that with a good SSL feature extractor, simple instance-based MILs, with very few parameters, obtain similar or better performance than complex, state-of-the-art (SOTA) embedding-based MIL methods, setting new SOTA results on the BRACS and Camelyon16 datasets. Since simple instance-based MIL methods are naturally more interpretable and explainable to clinicians, our results suggest that more effort should be put into well-adapted SSL methods for WSI rather than into complex embedding-based MIL methods.