


{"id":6020,"date":"2025-12-17T11:31:58","date_gmt":"2025-12-17T10:31:58","guid":{"rendered":"https:\/\/pepr-cloud.fr\/fr\/?p=6020"},"modified":"2025-12-17T14:00:19","modified_gmt":"2025-12-17T13:00:19","slug":"les-llm-au-service-de-la-decouverte-de-relations-causales-dans-la-surveillance-des-environnements-informatiques-une-etude-comparative","status":"publish","type":"post","link":"https:\/\/pepr-cloud.fr\/en\/francais-les-llm-au-service-de-la-decouverte-de-relations-causales-dans-la-surveillance-des-environnements-informatiques-une-etude-comparative\/","title":{"rendered":"(Fran\u00e7ais) Les LLM au service de la d\u00e9couverte de relations causales dans la surveillance des environnements informatiques : une \u00e9tude comparative"},"content":{"rendered":"<p class=\"qtranxs-available-languages-message qtranxs-available-languages-message-en\">Sorry, this entry is only available in <a href=\"https:\/\/pepr-cloud.fr\/fr\/wp-json\/wp\/v2\/posts\/6020\" class=\"qtranxs-available-language-link qtranxs-available-language-link-fr\" title=\"Fran\u00e7ais\">Fran\u00e7ais<\/a>.<\/p><div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n<div class=\"wp-block-image animate__animated animate__fadeInDown\">\n<figure class=\"aligncenter size-large is-resized\"><a href=\"https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2.jpg\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"381\" src=\"https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-1024x381.jpg\" alt=\"\" class=\"wp-image-6023\" style=\"width:1500px\" srcset=\"https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-1024x381.jpg 1024w, https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-300x112.jpg 300w, https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-768x285.jpg 768w, https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-1536x571.jpg 1536w, https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-150x56.jpg 150w, 
https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2-1320x491.jpg 1320w, https:\/\/pepr-cloud.fr\/files\/2025\/12\/2025_banner-2.jpg 1875w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/figure><\/div>\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p class=\"has-black-color has-text-color\"><strong><em>\u00c0 l\u2019\u00e8re des infrastructures informatiques toujours plus complexes, la surveillance efficace de ces environnements distribu\u00e9s et h\u00e9t\u00e9rog\u00e8nes reste un d\u00e9fi majeur. Une nouvelle \u00e9tude, pr\u00e9sent\u00e9e lors de la session sp\u00e9ciale <em>LLM-Net<\/em> de la conf\u00e9rence <strong>IEEE LCN 2025<\/strong> \u00e0 Sydney, explore le potentiel des <strong>mod\u00e8les de langage (LLM)<\/strong> \u2014 open source et commerciaux \u2014 pour identifier des relations causales entre les m\u00e9triques de surveillance. En comparant leurs performances avec des algorithmes traditionnels comme <strong>PCMCI+<\/strong>, cette recherche ouvre la voie \u00e0 une surveillance plus intelligente et \u00e9conome en ressources, notamment dans le cadre du projet <strong>PEPR Cloud SPIREC<\/strong>.<\/em><\/strong><\/p>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-cyan-bluish-gray-color has-css-opacity has-cyan-bluish-gray-background-color has-background aligncenter is-style-wide\"\/>\n\n\n\n<h2 class=\"wp-block-heading\" style=\"font-size:25px\"><strong><strong>1. Contexte et enjeux<\/strong><\/strong><\/h2>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p>La surveillance des environnements informatiques modernes, marqu\u00e9s par leur <strong>distribution, leur h\u00e9t\u00e9rog\u00e9n\u00e9it\u00e9 et leur \u00e9volutivit\u00e9<\/strong>, se heurte \u00e0 un obstacle de taille : l\u2019absence de connaissances pr\u00e9alables sur les m\u00e9triques les plus critiques. 
Yet these metrics are often linked by <strong>causal relations</strong> that are invisible to the naked eye. In this context, <strong>Large Language Models (LLMs)</strong> are emerging as a promising solution, able to analyse and reason over complex data thanks to their command of natural language.</p>

<h2>2. Goals of the study</h2>

<p>Presented in the <em>LLM-Net</em> session of the <strong>IEEE LCN 2025</strong> conference (Sydney, Australia), this publication evaluates the ability of LLMs, both <strong>open source and commercial</strong>, to discover causal links between monitoring metrics. The study compares their performance with that of a traditional algorithm (<strong>PCMCI+</strong>), testing different <em>prompt</em> styles to optimise the results.</p>

<h2>3. Methodology and results</h2>

<p>The researchers exploited the <strong>causal-reasoning</strong> capabilities of LLMs to generate scores indicating the relative importance of the metrics.</p>
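<p>To give a feel for what querying an LLM for causal scores can look like, here is a minimal sketch in Python. It is an illustration, not the protocol or prompt wording from the paper: the metric names, the prompt text, and the JSON reply format are all assumptions, and the model reply is stubbed rather than fetched from a real LLM API.</p>

```python
import json

def build_prompt(cause: str, effect: str) -> str:
    """Assemble a pairwise causal query for an LLM (illustrative wording)."""
    return (
        f"In an IT monitoring system, does a change in '{cause}' "
        f"cause a change in '{effect}'? Reply with JSON: "
        '{"score": <float between 0 and 1>}'
    )

def parse_score(reply: str) -> float:
    """Extract the causal-importance score from the model's JSON reply."""
    return float(json.loads(reply)["score"])

# In a real run, `reply` would come from an LLM API call; here it is stubbed.
prompt = build_prompt("cpu_usage", "request_latency")
reply = '{"score": 0.8}'
score = parse_score(reply)
assert 0.0 <= score <= 1.0
```

<p>Repeating such pairwise queries over all ordered metric pairs yields a score matrix, which can then be compared against the graph recovered by a constraint-based method such as PCMCI+.</p>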
<p>These scores, fed into a <strong>reinforcement-learning (RL) agent</strong>, make it possible to:</p>

<ul>
<li><strong>Reduce computational overhead</strong> by targeting the most informative metrics.</li>
<li><strong>Save resources</strong> by avoiding exhaustive, often redundant monitoring.</li>
</ul>

<h2>4. Applications and outlook</h2>

<p>This work is part of the <strong>PEPR Cloud SPIREC project (work package 1)</strong>, where monitoring is made more intelligent by supplying prior causal knowledge. In time, this approach could transform the management of IT infrastructures by making monitoring <strong>more agile, less costly, and more sustainable</strong>.</p>

<hr />

<p><em>More information: <a href="https://hal.science/hal-05343924">read the full paper here</a></em></p>
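<p>As a toy illustration of the selection step described in section 3, the sketch below keeps only the top-k metrics by causal-importance score, shrinking the set an RL monitoring agent has to observe. The metric names, score values, and choice of k are hypothetical, not taken from the paper.</p>

```python
# Hypothetical LLM-derived causal-importance scores for monitoring metrics.
causal_scores = {
    "cpu_usage": 0.9,
    "memory_usage": 0.7,
    "disk_io": 0.4,
    "network_errors": 0.85,
    "fan_speed": 0.1,
}

def select_metrics(scores: dict[str, float], k: int) -> list[str]:
    """Keep the k metrics with the highest causal-importance scores,
    reducing the observation space of an RL monitoring agent."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

selected = select_metrics(causal_scores, k=3)
print(selected)  # → ['cpu_usage', 'network_errors', 'memory_usage']
```

<p>Monitoring only the selected subset is what delivers the overhead and resource savings listed above, at the cost of trusting the prior scores.</p>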