
Publications


The publications of our faculty researchers are available on the HAL platform:


The publications from the theses of LTCI doctoral graduates are available on the HAL platform:


Browse the publications in the HAL open archive by year:

2017

  • Asymmetrical Length Biasing for Energy Efficient Digital Circuits
    • Veirano Francisco
    • Naviner Lirida
    • Silveira Fernando
    , 2017.
  • Métasurfaces et antennes : nouvelles perspectives pour l’aéronautique et le spatial
    • Lepage A. C.
    • Begaud Xavier
    , 2017.
  • Infliximab quantitation in human plasma by liquid chromatography-tandem mass spectrometry: towards a standardization of the methods?
    • Jourdil Jean-François
    • Lebert Dorothee
    • Gautier-Veyret Élodie
    • Lemaitre Florian
    • Bonaz Bruno
    • Picard Guillaume
    • Tonini Julia
    • Stanke-Labesque Françoise
    Analytical and Bioanalytical Chemistry, Springer Verlag, 2017, 409 (5), pp.1195-1205. Infliximab (IFX) is a chimeric monoclonal antibody targeting tumor necrosis factor-alpha. It is currently approved for the treatment of certain rheumatic diseases or inflammatory bowel diseases. Clinical studies have suggested that monitoring IFX concentrations could improve treatment response. However, in most studies, IFX was quantified using ELISA assays, and discrepancies between their results raised concerns about their reliability. Here, we describe the development and validation of a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for IFX quantification in human plasma. Full-length stable-isotope-labeled antibody (SIL-IFX) was added to plasma samples as an internal standard. Samples were then prepared using Mass Spectrometry Immuno Assay (MSIA™) followed by trypsin digestion and submitted to multiple reaction monitoring (MRM) for quantification of IFX. The chromatographic run lasted 13 min. The range of quantification was 1 to 26 mg/L. For two internal quality controls spiked with 6 and 12 mg/L of IFX, the method was reproducible (coefficients of variation (CV%): 12.7 and 2.1), repeatable (intra-day CV%: 5.5 and 5.0), and accurate (inter-day and intra-day deviations from nominal values: +6.4 to +3.7 % and 5.5 to 9.2 %, respectively). There was no cross-contamination effect. Samples from 45 patients treated with IFX were retrospectively analyzed by LC-MS/MS and the results were compared to those obtained with an in-house ELISA assay and the commercial Lisa Tracker® method. Good agreement was found between LC-MS/MS and the in-house ELISA (mean underestimation of 13 %), but a significant bias was found with the commercial ELISA (mean underestimation of 136 %). This method will make it possible to standardize IFX quantification between laboratories.
    Graphical abstract: interassay comparison of the three methods (LC-MS/MS vs. in-house ELISA and vs. Lisa Tracker® ELISA), with Passing & Bablok (a) and Bland & Altman (b) plots.
    DOI : 10.1007/s00216-016-0045-4
  • Challenges, Solutions and Implications for Large Scale Enterprise Systems: OR in the Age of Big Data
    • Hudry Olivier
    , 2017.
  • Low profile superstrate using Transformation Optics for semicircular radiation pattern of antenna
    • Joshi Chetan
    • Lepage A. C.
    • Begaud Xavier
    Applied physics. A, Materials science & processing, Springer Verlag, 2017, 123 (2). In this article, a dielectric superstrate inspired by transformation optics is presented. When placed over a patch antenna, this superstrate increases the half-power beam width (HPBW) of a classical patch antenna. An appropriate spatial transformation relation with spatial compression and refractive index shift factors has been used to derive an expression for a dielectric material profile. The wave front exiting from the transformed space is optimized for a semicylindrical shape. Then, a discretized version of this profile has been used to design a cuboidal superstrate. Full-wave simulations show a superstrate device capable of producing an HPBW of 297° in the H-plane with a peak directivity of 3.2 dBi at the design frequency. The derived solution can be realized using standard dielectric materials for real-world applications. (10.1007/s00339-017-0787-7)
    DOI : 10.1007/s00339-017-0787-7
  • Nonnegative Matrix Factorisation for multimodal data analysis
    • Essid Slim
    , 2017.
  • Privacy Preserving Biometric Identity Verification
    • Chollet Gérard
    • Jimenez Abelino
    • Petrovska-Delacrétaz Dijana
    • Raj Bhiksha
    , 2017.
  • Set of tuples expansion by example with reliability
    • Er Ngurah Agus Sanjaya
    • Ba Mouhamadou Lamine
    • Abdessalem Talel
    • Bressan Stéphane
    International Journal of Web Information Systems (IJWIS), 2017, 13 (4), pp.425-444. This paper focuses on the design of algorithms and techniques for effective set expansion. A tool that finds and extracts candidate sets of tuples from the World Wide Web was designed and implemented. For instance, when a user provides a few country-capital-currency tuples as seeds, the system returns tuples composed of countries with their corresponding capital cities and currency names, constructed from content extracted from the retrieved Web pages. The seeds are used to query a search engine and to retrieve relevant Web pages. The seeds are also used to infer wrappers from the retrieved pages. The wrappers, in turn, are used to extract candidates. The Web pages, wrappers, seeds and candidates, as well as their relationships, are the vertices and edges of a heterogeneous graph. Several options for ranking candidates, from PageRank to truth-finding algorithms, were evaluated and compared. Remarkably, all vertices are ranked, thus providing an integrated approach that not only answers direct set expansion questions but also finds the most relevant pages for expanding a given set of seeds. The experimental results show that leveraging the truth-finding algorithm can indeed improve the level of confidence in the extracted candidates and the sources. Current approaches to set expansion mostly support the expansion of sets of atomic data. This idea can be extended to sets of tuples, extracting relation instances from the Web given a handful of tuple seeds. A truth-finding algorithm is also incorporated into the approach, and it is shown to improve the confidence level in the ranking of both candidates and sources in set-of-tuples expansion. (10.1108/IJWIS-04-2017-0037)
    DOI : 10.1108/IJWIS-04-2017-0037
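The ranking idea in the abstract above (every vertex of the heterogeneous seed/page/wrapper/candidate graph is ranked, not just the candidates) can be illustrated with a plain PageRank power iteration. This is a generic sketch, not the authors' implementation: the toy graph, node names, and damping factor are invented for the example.

```python
def pagerank(edges, d=0.85, n_iter=50):
    """Power-iteration PageRank over a directed graph given as (u, v) edge pairs."""
    nodes = sorted({u for e in edges for u in e})
    out = {u: [] for u in nodes}
    for u, v in edges:
        out[u].append(v)
    rank = {u: 1.0 / len(nodes) for u in nodes}
    for _ in range(n_iter):
        new = {u: (1.0 - d) / len(nodes) for u in nodes}
        for u in nodes:
            if out[u]:
                for v in out[u]:
                    new[v] += d * rank[u] / len(out[u])
            else:  # dangling node: redistribute its mass uniformly
                for v in nodes:
                    new[v] += d * rank[u] / len(nodes)
        rank = new
    return rank

# Toy heterogeneous graph (hypothetical names): a seed retrieves pages,
# pages yield wrappers, wrappers extract candidate tuples.
edges = [
    ("seed", "page1"), ("seed", "page2"),
    ("page1", "wrapper1"), ("page2", "wrapper2"),
    ("wrapper1", "cand_a"), ("wrapper1", "cand_b"),
    ("wrapper2", "cand_a"),
]
ranks = pagerank(edges)
# cand_a is extracted by two wrappers, so it outranks cand_b
print(ranks["cand_a"] > ranks["cand_b"])
```

Because pages, wrappers, and candidates live in the same graph, the same score vector answers both "which candidates are reliable" and "which pages are worth expanding from", which is the integrated view the abstract describes.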
  • Beam steering in quantum cascade lasers with optical feedback
    • Jumpertz Louise
    • Ferré Simon
    • Carras Mathieu
    • Grillot Frédéric
    , 2017.
  • Practical metrics for evaluation of fault-tolerant logic design
    • Stempkovskiy Alexandre
    • Telpukhov Dmitry
    • Solovyev Roman
    • Balaka Ekaterina
    • Naviner Lirida
    , 2017, pp.569-573.
  • Optimal Distributed Channel Assignment in D2D Networks Using Learning in Noisy Potential Games
    • Coupechoux Marceau
    , 2017.
  • Security-Aware Modeling and Analysis for HW/SW Partitioning
    • Li Letitia W.
    • Lugou Florian
    • Apvrille Ludovic
    , 2017. The rising wave of attacks on communicating embedded systems has exposed their users to risks of information theft, monetary damage, and personal injury. Through improved modeling and analysis of security, we propose that these flaws could be mitigated. Since HW/SW partitioning, one of the first design phases, impacts the future integration of security into the system, this phase would benefit from supporting the modeling of security abstractions and security properties, providing designers with useful partitioning feedback obtained from a formal security analyzer. In this paper, we present how our toolkit supports security modeling, automated security integration, and formal analysis during the HW/SW partitioning phase for secure communications in embedded systems. We introduce “Cryptographic Configurations”, an abstract representation of security that allows us to verify security formally. Our toolkit further assists designers by automatically adding these security representations based on a mapping and security requirements.
  • Simulation and design of a multistage 10 W Thulium-doped double-clad silica fiber amplifier at 2050 nm
    • Romano Clément
    • Tench Robert E
    • Jaouën Yves
    • Williams Glen M
    , 2017. A careful comparison of experiment and theory is important both for basic research and for the systematic engineering design of Thulium fiber amplifiers operating in the 2 µm region, for applications such as LIDAR or spectroscopy (e.g. CO2 atmospheric absorption at 2051.4 nm). In this paper we report the design and performance of a multistage high-power PM Tm-doped fiber amplifier, cladding-pumped at 793 nm. The design is the result of a careful comparison of numerical simulation, based on a three-level model including ion-ion interactions, and experiment. Our simulation model is based on precise measurements of the cross sections and other parameters for both 6 and 10 µm core diameter fibers. Good agreement is demonstrated for several single and multistage amplifier topologies and operating conditions. The origins of the differences between theory and experiment are discussed, with emphasis on the accuracy of the cross sections and the cross-relaxation parameters. Finally, based on our simulation tool, we demonstrate a design with an output power greater than 10 W for a multistage amplifier with a single-frequency signal at 2050 nm. The power stage was constructed with a 6 µm active fiber showing a 64 % optical slope efficiency. The output power is found to be within 5 % of the simulated results and is limited only by the available launched pump power of ~24 W. No stimulated Brillouin scattering is observed at the highest output power level for a well-thermalized active fiber.
  • Un Modèle de Factorisation de Poisson pour la Recommandation de Points d'Intérêt
    • Griesner Jean-Benoît
    • Abdessalem Talel
    • Naacke Hubert
    , 2017, pp.411-416. The explosion of data volumes circulating on location-based social networks (LBSNs) makes it possible to extract users' preferences. In particular, these preferences can be used to recommend points of interest matching a user's profile. Today, point-of-interest recommendation has become an essential component of LBSNs. Unfortunately, traditional recommendation methods fail to adapt to the constraints specific to LBSNs, such as very high data sparsity, or to take geographical influence into account. In this paper, we present a recommendation model based on Poisson factorization that offers an effective answer to these constraints. We tested our model through experiments on a realistic dataset from the LBSN Foursquare. These experiments demonstrate better recommendation quality than three state-of-the-art models.
  • Formal Specification and Verification of Security Guidelines
    • Zhioua Zeineb
    • Roudier Yves
    • Ameur-Boulifa R.
    , 2017, pp.267--273. (10.1109/PRDC.2017.51)
    DOI : 10.1109/PRDC.2017.51
  • Self-Healing Umbrella Sampling: Convergence and efficiency
    • Fort Gersende
    • Jourdain Benjamin
    • Lelièvre Tony
    • Stoltz Gabriel
    Statistics and Computing, Springer Verlag (Germany), 2017, 27 (1), pp.147–168. The Self-Healing Umbrella Sampling (SHUS) algorithm is an adaptive biasing algorithm which has been proposed to efficiently sample a multimodal probability measure. We show that this method can be seen as a variant of the well-known Wang-Landau algorithm. Adapting results on the convergence of the Wang-Landau algorithm, we prove the convergence of the SHUS algorithm. We also compare the two methods in terms of efficiency. We finally propose a modification of the SHUS algorithm in order to increase its efficiency, and exhibit some similarities of SHUS with the well-tempered metadynamics method. (10.1007/s11222-015-9613-2)
    DOI : 10.1007/s11222-015-9613-2
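The adaptive biasing idea shared by SHUS and Wang-Landau in the abstract above (penalize states as they are visited, so the sampler escapes metastable wells and visits a multimodal target more evenly) can be sketched on a toy 1D bimodal target. Note this uses a constant penalty step gamma for simplicity, not the self-tuning SHUS schedule from the paper; the target, grid size, and step size are all invented for the illustration.

```python
import math
import random

def target(x):
    # Unnormalized bimodal target on {0, ..., 39}: two Gaussian wells at 5 and 34
    return math.exp(-0.5 * ((x - 5) / 2.0) ** 2) + math.exp(-0.5 * ((x - 34) / 2.0) ** 2)

def adaptive_biased_walk(n_steps=200_000, gamma=0.01, seed=0):
    rng = random.Random(seed)
    n = 40
    bias = [0.0] * n          # running log-penalty, grows where the walker lingers
    visits = [0] * n
    x = 5                     # start in the left well
    for _ in range(n_steps):
        y = x + rng.choice((-1, 1))
        if 0 <= y < n:
            # Metropolis ratio for the biased target pi(x) * exp(-bias(x))
            ratio = (target(y) * math.exp(-bias[y])) / (target(x) * math.exp(-bias[x]))
            if rng.random() < min(1.0, ratio):
                x = y
        bias[x] += gamma      # Wang-Landau-style update: penalize the current state
        visits[x] += 1
    return visits

visits = adaptive_biased_walk()
left, right = sum(visits[:20]), sum(visits[20:])
print(left, right)  # both wells end up visited: the bias flattens the landscape
```

An unbiased Metropolis walker started at x = 5 would almost never cross the low-probability barrier around x = 20; with the accumulating penalty, the occupied well is progressively "filled in" until the walker spills into the other mode, which is the mechanism both SHUS and Wang-Landau exploit.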
  • Versatile and efficient mixed-criticality scheduling for multi-core processors
    • Gratia Romain
    , 2017. This thesis focuses on the scheduling of mixed-criticality systems on multi-processors. The correctness of the execution of real-time applications is ensured by a scheduler and is checked during the design phase. The sizing of the execution platform aims at minimising the number of processors required to ensure this correct scheduling. This sizing is impacted by the safety requirements: these requirements tend to overestimate the execution times of the applications to guarantee their correct execution, and the resulting sizing is consequently costly. Mixed-criticality scheduling theory aims at proposing compromises on the guarantees given to the execution of the applications in order to reduce this over-sizing. Several models of mixed-criticality systems offering different compromises have been proposed, but all are based on the use of execution modes. Modes are ordered, and tasks have non-decreasing execution times in each successive mode. Yet, to reduce the sizing of the execution platform, only the execution of the most critical tasks is ensured; this model is called the discarding model. For simplicity reasons, most mixed-criticality scheduling algorithms are limited to this model. Besides, the most efficient scheduling policies for multi-processors entail too many preemptions and migrations to be actually used, and they are rarely generalised to handle different models of mixed-criticality systems. However, handling more than two execution modes, or tasks with elastic periods, would make such solutions more attractive for industry. The approach proposed in this thesis is based on a separation of concerns between handling the execution modes and scheduling the tasks on the multi-processors. With this approach, we design an efficient scheduling policy that can schedule different models of mixed-criticality systems. It consists in transforming a mixed-criticality task set into a non-mixed-criticality one, and then scheduling this task set with an optimal hard real-time scheduling algorithm that entails few preemptions and migrations: RUN. We first apply our approach to the discarding model with two execution modes; the results show the efficiency of our approach for this model. Then, we demonstrate the versatility of our approach by scheduling systems of the discarding model with more than two execution modes. Finally, by using a method based on the decomposition of task execution, our approach can also schedule systems based on elastic tasks.
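The discarding model described in the abstract above can be made concrete with a small sketch: each task carries a criticality level and per-mode execution budgets, and after a switch to the high-criticality mode only HI tasks keep their guarantees. The task names and numbers below are invented for illustration; this is the generic discarding model, not the thesis' RUN-based algorithm.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    crit: str       # "LO" or "HI" criticality level
    c_lo: float     # optimistic WCET budget used in LO mode
    c_hi: float     # pessimistic WCET budget used in HI mode
    period: float

def guaranteed(tasks, mode):
    # Discarding model: in HI mode, only HI-criticality tasks remain guaranteed
    return [t for t in tasks if mode == "LO" or t.crit == "HI"]

def utilization(tasks, mode):
    # Processor demand of the guaranteed tasks under the mode's WCET budgets
    return sum((t.c_lo if mode == "LO" else t.c_hi) / t.period
               for t in guaranteed(tasks, mode))

tasks = [
    Task("control", "HI", c_lo=2, c_hi=4, period=10),
    Task("logging", "LO", c_lo=3, c_hi=3, period=10),
    Task("sensing", "HI", c_lo=1, c_hi=2, period=5),
]
print(utilization(tasks, "LO"))  # all tasks, optimistic budgets (≈ 0.7)
print(utilization(tasks, "HI"))  # HI tasks only, pessimistic budgets (≈ 0.8)
```

The platform only needs to accommodate the larger of the two per-mode utilizations rather than the fully pessimistic budgets for every task, which is the over-sizing reduction the discarding model trades the LO tasks' guarantees for.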
  • Best of both worlds
    • Diamanti Eleni
    • Kashefi Elham
    Nature Physics, Nature Publishing Group, 2017, 13 (1), pp.3-4. Secure communication is emerging as a significant challenge for our hyper-connected, data-dependent society. The answer may lie in a clever combination of quantum and classical cryptographic techniques. (10.1038/nphys3972)
    DOI : 10.1038/nphys3972
  • Segmentation d’IRM de cerveaux de nouveau-nés en quelques secondes à l’aide d’un réseau de neurones convolutif pseudo-3D et de transfert d’apprentissage
    • Xu Yongchao
    • Géraud Thierry
    • Bloch Isabelle
    , 2017.
  • Détection de l’eau dans les images radar du futur satellite SWOT
    • Lobry Sylvain
    • Fjortoft Roger
    • Denis L.
    • Tupin Florence
    , 2017.
  • Organising Complexity: Hierarchies and Holarchies
    • Diaconescu Ada
    , 2017, pp.89-106.
  • Better Product Quality May Lead to Lower Product Price
    • Chenavaz Régis
    The B.E. Journal of Theoretical Economics, 2017, 17 (1), pp.1--22. This article analyzes the conditions under which better product quality implies a higher or lower product price. In an optimal control framework, I make the following assumptions: the firm sets the dynamic pricing and product innovation policies; product innovation raises quality, which drives production cost; and consumers are sensitive to price and quality. I derive a rule of the price-quality relationship that stresses the influence of quality on price through the effects of cost (positive), sales (negative), and markup (positive). This article shows that, while maximizing profit and despite quality and cost increases, the firm may decrease product prices because of the possibility of generating more sales by combining better quality with a lower price. This sales effect solves the puzzle of a negative price-quality relationship. More generally, the sales effect mitigates the ability of price to convey information about quality. (10.1515/bejte-2015-0062)
    DOI : 10.1515/bejte-2015-0062
  • A Novel Range-Free Jammer Localization Solution in Wireless Network by Using PSO Algorithm
    • Pang Liang
    • Chen Xiao
    • Xue Zhi
    • Khatoun Rida
    , 2017, pp.198-211. (10.1007/978-981-10-6388-6_17)
    DOI : 10.1007/978-981-10-6388-6_17
  • Visual Menu Techniques
    • Bailly Gilles
    • Lecolinet Éric
    • Nigay Laurence
    ACM Computing Surveys, Association for Computing Machinery, 2017, 49 (4), pp.60. Menus are used for exploring and selecting commands in interactive applications. They are widespread in current systems and used by a large variety of users. As a consequence, they have motivated many studies in Human-Computer Interaction (HCI). Facing the large variety of menus, it is difficult to have a clear understanding of the design possibilities and to ascertain their similarities and differences. In this article, we address a main challenge of menu design: the need to characterize the design space of menus. To do this, we propose a taxonomy of menu properties that structures existing work on visual menus. In order to highlight the impact of the properties on performance, we begin by refining performance through a list of quality criteria and by reviewing existing analytical and empirical methods for quality evaluation. The taxonomy of menu properties is an unavoidable step toward the elaboration of advanced predictive models of menu performance and the optimization of menus. A key point of this work is to focus both on menus and on the properties of menus, to enable a fine-grained analysis in terms of performance. (10.1145/3002171)
    DOI : 10.1145/3002171
  • Mobilité résidentielle et sentiment d'attachement
    • Griffond-Boitier Anne
    • Valentin Jérôme
    , 2017, pp.237-253.