{"id":2141,"date":"2026-03-29T10:36:12","date_gmt":"2026-03-29T08:36:12","guid":{"rendered":"https:\/\/askem.eu\/?p=2141"},"modified":"2026-03-29T10:36:16","modified_gmt":"2026-03-29T08:36:16","slug":"ollama-executer-des-llm-en-local","status":"publish","type":"post","link":"https:\/\/askem.eu\/en\/2026\/03\/29\/ollama-executer-des-llm-en-local\/","title":{"rendered":"Ollama : ex\u00e9cuter des LLM en local"},"content":{"rendered":"\n<h1 class=\"wp-block-heading\">Ollama&nbsp;: ex\u00e9cuter des LLM en local pour son infrastructure open source<\/h1>\n\n\n\n<p>Ollama est un outil open source qui permet d&rsquo;ex\u00e9cuter des grands mod\u00e8les de langage (LLM) directement sur sa propre machine ou son serveur, sans d\u00e9pendre d&rsquo;API tierces comme OpenAI ou Anthropic. Il expose une API HTTP compatible avec le format OpenAI, ce qui facilite son int\u00e9gration dans une architecture existante \u2014 que ce soit un pipeline RAG, un serveur MCP, ou une interface de chat.<\/p>\n\n\n\n<p>Pour les infrastructures auto-h\u00e9berg\u00e9es, Ollama repr\u00e9sente une brique cl\u00e9&nbsp;: confidentialit\u00e9 des donn\u00e9es, absence de co\u00fbt par requ\u00eate, fonctionnement hors ligne. Cet article pr\u00e9sente son installation, son fonctionnement, et les cas d&rsquo;usage concrets dans une stack open source.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Pourquoi ex\u00e9cuter un LLM en local&nbsp;?<\/h2>\n\n\n\n<p>Les raisons sont multiples selon le contexte. La souverainet\u00e9 des donn\u00e9es est souvent la premi\u00e8re motivation&nbsp;: dans les organisations publiques ou les projets sensibles, envoyer des donn\u00e9es \u00e0 un service cloud tiers pose des questions r\u00e9glementaires (RGPD, confidentialit\u00e9, h\u00e9bergement en dehors de l&rsquo;UE). L&rsquo;ex\u00e9cution locale \u00e9limine ce risque.<\/p>\n\n\n\n<p>La ma\u00eetrise des co\u00fbts est un autre argument. Les API commerciales facturent \u00e0 la requ\u00eate ou au token. 
For intensive usage (processing document corpora, automatic annotation, entity extraction), costs can quickly become significant. A local LLM runs at near-zero marginal cost once the hardware is amortised.<\/p>\n\n\n\n<p>Finally, integration into a self-hosted infrastructure is more natural. Ollama installs as a system service, exposes a REST API, and integrates directly with tools such as LangChain, Open WebUI, or a custom MCP server.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Installing Ollama on Linux<\/h2>\n\n\n\n<p>Installation is deliberately simple. The official script handles dependencies and installs Ollama as a systemd service:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">curl -fsSL https:\/\/ollama.com\/install.sh | sh\n<\/pre>\n\n\n\n<p>After installation, the service listens on <code>http:\/\/localhost:11434<\/code>. Models are downloaded with a single command:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">ollama pull llama3.2\nollama pull mistral\nollama pull nomic-embed-text\n<\/pre>\n\n\n\n<p><code>nomic-embed-text<\/code> is an embedding model, useful for RAG pipelines. <code>llama3.2<\/code> and <code>mistral<\/code> are general-purpose models with solid French-language performance that run on consumer hardware (8 to 16 GB of RAM is enough for the quantised 7B versions).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The OpenAI-compatible REST API<\/h2>\n\n\n\n<p>Ollama exposes two kinds of endpoints. 
The native endpoint <code>\/api\/generate<\/code> is simple and direct:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">curl http:\/\/localhost:11434\/api\/generate \\\n  -d '{\"model\":\"llama3.2\",\"prompt\":\"What is CKAN?\",\"stream\":false}'\n<\/pre>\n\n\n\n<p>The OpenAI-compatible endpoint (<code>\/v1\/chat\/completions<\/code>) lets you substitute Ollama for OpenAI in any application built on the official Python or TypeScript SDK. Only the base URL needs to change:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">from openai import OpenAI\n\nclient = OpenAI(base_url=\"http:\/\/localhost:11434\/v1\", api_key=\"ollama\")\nresponse = client.chat.completions.create(\n    model=\"llama3.2\",\n    messages=[{\"role\": \"user\", \"content\": \"Summarize this document: ...\"}]\n)\n<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Deployment with Docker<\/h2>\n\n\n\n<p>For a containerised infrastructure, Ollama provides an official Docker image. Here is a minimal <code>docker-compose.yml<\/code> for a CPU-only deployment (no GPU):<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">services:\n  ollama:\n    image: ollama\/ollama:latest\n    ports:\n      - \"11434:11434\"\n    volumes:\n      - ollama_data:\/root\/.ollama\n    restart: unless-stopped\n\nvolumes:\n  ollama_data:\n<\/pre>\n\n\n\n<p>To expose Ollama behind an Nginx reverse proxy with authentication, add a <code>proxy_pass<\/code> directive pointing at port 11434 and an <code>auth_basic<\/code> block to protect access. 
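A minimal location block for such a proxy might look like the sketch below (an illustration only: TLS is omitted, and the htpasswd path is an assumption to adapt to your setup):

```nginx
# Sketch: protect the Ollama API with HTTP basic auth behind Nginx
location / {
    auth_basic           "Ollama API";
    auth_basic_user_file /etc/nginx/.htpasswd;  # create with the htpasswd tool
    proxy_pass           http://127.0.0.1:11434;
    proxy_http_version   1.1;
    proxy_read_timeout   300s;  # long generations can exceed default timeouts
}
```

For chat-style streaming through the proxy, disabling response buffering with `proxy_buffering off;` is worth considering so tokens reach the client as they are generated.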
In production, exposing the Ollama API directly to the internet without protection is strongly discouraged.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Integration into a RAG pipeline<\/h2>\n\n\n\n<p>One of the most common use cases is building a fully local RAG (Retrieval-Augmented Generation) pipeline. The scheme is as follows: documents are indexed as embedding vectors (generated by <code>nomic-embed-text<\/code> through Ollama) and stored in a vector database such as Qdrant or ChromaDB; at each user query, the index is searched first and the enriched context is then sent to the LLM.<\/p>\n\n\n\n<p>With LangChain or LlamaIndex, integrating Ollama is straightforward: both libraries natively support Ollama as an LLM provider and as an embedding provider. This makes it possible to build a fully sovereign document assistant, able to index CKAN datasets, tender PDFs, or regulatory corpora.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Integration with an MCP server<\/h2>\n\n\n\n<p>The Model Context Protocol (MCP) lets an LLM call external tools during inference. Ollama can serve as the LLM backend in MCP setups through frameworks such as <code>mcp-agent<\/code> or custom implementations. 
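The embedding and retrieval steps of the RAG pipeline described above can be sketched with Ollama's embeddings endpoint (a minimal illustration, not a full pipeline; it assumes a local Ollama with `nomic-embed-text` already pulled):

```python
import json
import math
import urllib.request

OLLAMA = "http://localhost:11434"

def embed(text: str) -> list[float]:
    """Fetch an embedding vector from Ollama's /api/embeddings endpoint."""
    req = urllib.request.Request(
        f"{OLLAMA}/api/embeddings",
        data=json.dumps({"model": "nomic-embed-text", "prompt": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity, the usual ranking metric for retrieval."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query: str, corpus: dict[str, list[float]], k: int = 3) -> list[str]:
    """Rank pre-embedded documents against the query embedding."""
    qv = embed(query)
    return sorted(corpus, key=lambda doc: cosine(qv, corpus[doc]), reverse=True)[:k]

# Usage (requires a running Ollama):
#   corpus = {doc: embed(doc) for doc in documents}
#   context = retrieve("open data catalogue", corpus)
```

In a real deployment the `corpus` dict would be replaced by a vector store such as Qdrant or ChromaDB, which handle persistence and approximate nearest-neighbour search.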
The typical pattern: an MCP server exposes tools (SQL queries, API calls, file reads), and Ollama receives the results of those tools to build its final answer.<\/p>\n\n\n\n<p>This architecture makes it possible to build autonomous agents that run entirely locally, without sending any data to a third-party cloud, a major advantage for use cases involving sensitive data.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">A web interface with Open WebUI<\/h2>\n\n\n\n<p>Open WebUI is an open source chat interface designed to work with Ollama. It offers an experience close to ChatGPT (conversation management, model selection, document upload) while being entirely self-hosted. Deployment is done via Docker:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">docker run -d \\\n  -p 3000:8080 \\\n  --add-host=host.docker.internal:host-gateway \\\n  -e OLLAMA_BASE_URL=http:\/\/host.docker.internal:11434 \\\n  ghcr.io\/open-webui\/open-webui:main\n<\/pre>\n\n\n\n<p>Open WebUI can also connect to third-party OpenAI-compatible APIs, making it possible to switch between a local model and a cloud model as needed, from the same interface.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Choosing a model for the available hardware<\/h2>\n\n\n\n<p>The choice of model depends directly on the available hardware. On a standard server with 8 GB of RAM and no dedicated GPU, quantised 7B models (GGUF Q4 format) run smoothly. With 16 GB of RAM, 13B models become usable. 
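These RAM figures can be checked with a back-of-envelope estimate (our own rule of thumb, not an official figure): Q4 quantisation needs roughly 0.6 bytes per parameter for weights, plus headroom for the KV cache and runtime.

```python
def estimated_ram_gb(params_billions: float, bytes_per_param: float = 0.6,
                     overhead_gb: float = 1.5) -> float:
    """Back-of-envelope RAM estimate for a Q4-quantised model.

    bytes_per_param ~0.6 approximates Q4 GGUF (4-bit weights plus
    quantisation metadata); overhead_gb covers KV cache and runtime.
    """
    weights_gb = params_billions * 1e9 * bytes_per_param / 1e9
    return weights_gb + overhead_gb

print(f"7B  -> ~{estimated_ram_gb(7):.1f} GB")   # fits in 8 GB of RAM
print(f"13B -> ~{estimated_ram_gb(13):.1f} GB")  # hence the 16 GB figure
```

Actual usage also grows with the context length, since the KV cache scales with the number of tokens kept in context.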
For fast inference and production use cases, an NVIDIA GPU with CUDA support shortens response times considerably; Ollama detects and uses the GPU automatically when one is available.<\/p>\n\n\n\n<p>The model library at <a href=\"https:\/\/ollama.com\/library\" target=\"_blank\" rel=\"noreferrer noopener\">ollama.com\/library<\/a> covers a broad spectrum: general-purpose models (Llama, Mistral, Gemma), code models (Codestral, DeepSeek Coder), embedding models, and multimodal models able to analyse images.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Points to watch in production<\/h2>\n\n\n\n<p>Several points deserve attention before deploying to production. API security comes first: Ollama implements no native authentication, so it must sit behind a reverse proxy with access control. Managing the models stored on disk (several GB per model) requires a clear volume strategy. 
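For a GPU host, the CPU-only compose file shown earlier can be extended with a device reservation (a sketch, assuming the NVIDIA Container Toolkit is installed on the host):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    restart: unless-stopped
    deploy:
      resources:
        reservations:
          devices:            # requires the NVIDIA Container Toolkit
            - driver: nvidia
              count: all      # or a specific number of GPUs
              capabilities: [gpu]

volumes:
  ollama_data:
```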
Finally, CPU inference times remain significantly slower than on a GPU: evaluate them against the latency constraints of the application.<\/p>\n\n\n\n<p>In Kubernetes environments, Ollama can be deployed as a Deployment with a PersistentVolumeClaim for model storage, and exposed as an internal Service reachable by the other pods in the stack.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Ollama: running LLMs locally in your open source infrastructure Ollama is an open source tool for running large language models (LLMs) directly on your own machine or server, without depending on third-party APIs such as OpenAI or Anthropic. It exposes an HTTP API compatible with the OpenAI format, which makes it easy to [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2142,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ocean_post_layout":"","ocean_both_sidebars_style":"","ocean_both_sidebars_content_width":0,"ocean_both_sidebars_sidebars_width":0,"ocean_sidebar":"","ocean_second_sidebar":"","ocean_disable_margins":"enable","ocean_add_body_class":"","ocean_shortcode_before_top_bar":"","ocean_shortcode_after_top_bar":"","ocean_shortcode_before_header":"","ocean_shortcode_after_header":"","ocean_has_shortcode":"","ocean_shortcode_after_title":"","ocean_shortcode_before_footer_widgets":"","ocean_shortcode_after_footer_widgets":"","ocean_shortcode_before_footer_bottom":"","ocean_shortcode_after_footer_bottom":"","ocean_display_top_bar":"default","ocean_display_header":"default","ocean_header_style":"","ocean_center_header_left_menu":"","ocean_custom_header_template":"","ocean_custom_logo":0,"ocean_custom_retina_logo":0,"ocean_custom_logo_max_width":0,"ocean_custom_logo_tablet_max_width":0,"ocean_custom_logo_mobile_max_width":0,"ocean_custom_logo_max_height"
:0,"ocean_custom_logo_tablet_max_height":0,"ocean_custom_logo_mobile_max_height":0,"ocean_header_custom_menu":"","ocean_menu_typo_font_family":"","ocean_menu_typo_font_subset":"","ocean_menu_typo_font_size":0,"ocean_menu_typo_font_size_tablet":0,"ocean_menu_typo_font_size_mobile":0,"ocean_menu_typo_font_size_unit":"px","ocean_menu_typo_font_weight":"","ocean_menu_typo_font_weight_tablet":"","ocean_menu_typo_font_weight_mobile":"","ocean_menu_typo_transform":"","ocean_menu_typo_transform_tablet":"","ocean_menu_typo_transform_mobile":"","ocean_menu_typo_line_height":0,"ocean_menu_typo_line_height_tablet":0,"ocean_menu_typo_line_height_mobile":0,"ocean_menu_typo_line_height_unit":"","ocean_menu_typo_spacing":0,"ocean_menu_typo_spacing_tablet":0,"ocean_menu_typo_spacing_mobile":0,"ocean_menu_typo_spacing_unit":"","ocean_menu_link_color":"","ocean_menu_link_color_hover":"","ocean_menu_link_color_active":"","ocean_menu_link_background":"","ocean_menu_link_hover_background":"","ocean_menu_link_active_background":"","ocean_menu_social_links_bg":"","ocean_menu_social_hover_links_bg":"","ocean_menu_social_links_color":"","ocean_menu_social_hover_links_color":"","ocean_disable_title":"default","ocean_disable_heading":"default","ocean_post_title":"","ocean_post_subheading":"","ocean_post_title_style":"","ocean_post_title_background_color":"","ocean_post_title_background":0,"ocean_post_title_bg_image_position":"","ocean_post_title_bg_image_attachment":"","ocean_post_title_bg_image_repeat":"","ocean_post_title_bg_image_size":"","ocean_post_title_height":0,"ocean_post_title_bg_overlay":0.5,"ocean_post_title_bg_overlay_color":"","ocean_disable_breadcrumbs":"default","ocean_breadcrumbs_color":"","ocean_breadcrumbs_separator_color":"","ocean_breadcrumbs_links_color":"","ocean_breadcrumbs_links_hover_color":"","ocean_display_footer_widgets":"default","ocean_display_footer_bottom":"default","ocean_custom_footer_template":"","osh_disable_topbar_sticky":"default","osh_disable_header_stic
ky":"default","osh_sticky_header_style":"default","osh_sticky_header_effect":"","osh_custom_sticky_logo":0,"osh_custom_retina_sticky_logo":0,"osh_custom_sticky_logo_height":0,"osh_background_color":"","osh_links_color":"","osh_links_hover_color":"","osh_links_active_color":"","osh_links_bg_color":"","osh_links_hover_bg_color":"","osh_links_active_bg_color":"","osh_menu_social_links_color":"","osh_menu_social_hover_links_color":"","ocean_post_oembed":"","ocean_post_self_hosted_media":"","ocean_post_video_embed":"","ocean_link_format":"","ocean_link_format_target":"self","ocean_quote_format":"","ocean_quote_format_link":"post","ocean_gallery_link_images":"on","ocean_gallery_id":[],"footnotes":""},"categories":[16,18],"tags":[],"class_list":["post-2141","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai","category-devops","entry","has-media"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Ollama : ex\u00e9cuter des LLM en local - askem<\/title>\n<meta name=\"description\" content=\"ASKEM BUREAU D&#039;\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/askem.eu\/en\/2026\/03\/29\/ollama-executer-des-llm-en-local\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Ollama : ex\u00e9cuter des LLM en local - askem\" \/>\n<meta property=\"og:description\" content=\"ASKEM BUREAU D&#039;\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. 
Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/askem.eu\/en\/2026\/03\/29\/ollama-executer-des-llm-en-local\/\" \/>\n<meta property=\"og:site_name\" content=\"askem\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/fb.me\/askem.eu\" \/>\n<meta property=\"article:published_time\" content=\"2026-03-29T08:36:12+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-03-29T08:36:16+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"1200\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"askemadmin\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"askemadmin\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/\"},\"author\":{\"name\":\"askemadmin\",\"@id\":\"https:\\\/\\\/askem.eu\\\/#\\\/schema\\\/person\\\/8bbee74ab9a977d56bf4826662e9d2e9\"},\"headline\":\"Ollama : ex\u00e9cuter des LLM en local\",\"datePublished\":\"2026-03-29T08:36:12+00:00\",\"dateModified\":\"2026-03-29T08:36:16+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/\"},\"wordCount\":1094,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2026\\/03\\/sujet-askem-2026-03-29.png\",\"articleSection\":[\"AI\",\"devops\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/\",\"url\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/\",\"name\":\"Ollama : ex\u00e9cuter des LLM en local - 
askem\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2026\\/03\\/sujet-askem-2026-03-29.png\",\"datePublished\":\"2026-03-29T08:36:12+00:00\",\"dateModified\":\"2026-03-29T08:36:16+00:00\",\"description\":\"ASKEM BUREAU D'\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#primaryimage\",\"url\":\"https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2026\\/03\\/sujet-askem-2026-03-29.png\",\"contentUrl\":\"https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2026\\/03\\/sujet-askem-2026-03-29.png\",\"width\":1200,\"height\":1200},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/askem.eu\\\/2026\\\/03\\\/29\\\/ollama-executer-des-llm-en-local\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Accueil\",\"item\":\"https:\\\/\\\/askem.eu\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Ollama : ex\u00e9cuter des LLM en 
local\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/askem.eu\\\/#website\",\"url\":\"https:\\\/\\\/askem.eu\\\/\",\"name\":\"askem\",\"description\":\"\",\"publisher\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/askem.eu\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/askem.eu\\\/#organization\",\"name\":\"Askem\",\"url\":\"https:\\\/\\\/askem.eu\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/askem.eu\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\/\\/mlpi0fxo3sth.i.optimole.com\\/cb:3obA.c61\\/w:760\\/h:480\\/q:mauto\\/f:best\\/https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2020\\/10\\/logoGalaxieAskem3.png\",\"contentUrl\":\"https:\\/\\/mlpi0fxo3sth.i.optimole.com\\/cb:3obA.c61\\/w:760\\/h:480\\/q:mauto\\/f:best\\/https:\\/\\/askem.eu\\/wp-content\\/uploads\\/2020\\/10\\/logoGalaxieAskem3.png\",\"width\":760,\"height\":480,\"caption\":\"Askem\"},\"image\":{\"@id\":\"https:\\\/\\\/askem.eu\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/fb.me\\\/askem.eu\",\"https:\\\/\\\/linkedin.com\\\/company\\\/askem-eu\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/askem.eu\\\/#\\\/schema\\\/person\\\/8bbee74ab9a977d56bf4826662e9d2e9\",\"name\":\"askemadmin\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a276662270d8d2fb7842e3f?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a276662270d8d2fb7842e3f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a
276662270d8d2fb7842e3f?s=96&d=mm&r=g\",\"caption\":\"askemadmin\"},\"sameAs\":[\"https:\\\/\\\/askem.eu\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Ollama : ex\u00e9cuter des LLM en local - askem","description":"ASKEM BUREAU D'\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/askem.eu\/en\/2026\/03\/29\/ollama-executer-des-llm-en-local\/","og_locale":"en_US","og_type":"article","og_title":"Ollama : ex\u00e9cuter des LLM en local - askem","og_description":"ASKEM BUREAU D'\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.","og_url":"https:\/\/askem.eu\/en\/2026\/03\/29\/ollama-executer-des-llm-en-local\/","og_site_name":"askem","article_publisher":"https:\/\/fb.me\/askem.eu","article_published_time":"2026-03-29T08:36:12+00:00","article_modified_time":"2026-03-29T08:36:16+00:00","og_image":[{"width":1200,"height":1200,"url":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png","type":"image\/png"}],"author":"askemadmin","twitter_card":"summary_large_image","twitter_misc":{"Written by":"askemadmin","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#article","isPartOf":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/"},"author":{"name":"askemadmin","@id":"https:\/\/askem.eu\/#\/schema\/person\/8bbee74ab9a977d56bf4826662e9d2e9"},"headline":"Ollama : ex\u00e9cuter des LLM en local","datePublished":"2026-03-29T08:36:12+00:00","dateModified":"2026-03-29T08:36:16+00:00","mainEntityOfPage":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/"},"wordCount":1094,"commentCount":0,"publisher":{"@id":"https:\/\/askem.eu\/#organization"},"image":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#primaryimage"},"thumbnailUrl":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png","articleSection":["AI","devops"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/","url":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/","name":"Ollama : ex\u00e9cuter des LLM en local - askem","isPartOf":{"@id":"https:\/\/askem.eu\/#website"},"primaryImageOfPage":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#primaryimage"},"image":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#primaryimage"},"thumbnailUrl":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png","datePublished":"2026-03-29T08:36:12+00:00","dateModified":"2026-03-29T08:36:16+00:00","description":"ASKEM BUREAU 
D'\u00c9TUDES ET DE FORMATION NUM\u00c9RIQUE. Nous vous assistons dans la transformation num\u00e9rique de vos outils, services et organisations tout en pla\u00e7ant l\u2019humain au c\u0153ur de notre d\u00e9marche d\u2019accompagnement.","breadcrumb":{"@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#primaryimage","url":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png","contentUrl":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:auto\/h:auto\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2026\/03\/sujet-askem-2026-03-29.png","width":1200,"height":1200},{"@type":"BreadcrumbList","@id":"https:\/\/askem.eu\/2026\/03\/29\/ollama-executer-des-llm-en-local\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Accueil","item":"https:\/\/askem.eu\/"},{"@type":"ListItem","position":2,"name":"Ollama : ex\u00e9cuter des LLM en 
local"}]},{"@type":"WebSite","@id":"https:\/\/askem.eu\/#website","url":"https:\/\/askem.eu\/","name":"askem","description":"","publisher":{"@id":"https:\/\/askem.eu\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/askem.eu\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/askem.eu\/#organization","name":"Askem","url":"https:\/\/askem.eu\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/askem.eu\/#\/schema\/logo\/image\/","url":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:760\/h:480\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2020\/10\/logoGalaxieAskem3.png","contentUrl":"https:\/\/mlpi0fxo3sth.i.optimole.com\/cb:3obA.c61\/w:760\/h:480\/q:mauto\/f:best\/https:\/\/askem.eu\/wp-content\/uploads\/2020\/10\/logoGalaxieAskem3.png","width":760,"height":480,"caption":"Askem"},"image":{"@id":"https:\/\/askem.eu\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/fb.me\/askem.eu","https:\/\/linkedin.com\/company\/askem-eu"]},{"@type":"Person","@id":"https:\/\/askem.eu\/#\/schema\/person\/8bbee74ab9a977d56bf4826662e9d2e9","name":"askemadmin","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a276662270d8d2fb7842e3f?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a276662270d8d2fb7842e3f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a202f744ee3a4b6fdbe2ceb57fd84c72559337791a276662270d8d2fb7842e3f?s=96&d=mm&r=g","caption":"askemadmin"},"sameAs":["https:\/\/askem.eu"]}]}},"_links":{"self":[{"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/posts\/2141","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/posts"}],"about":
[{"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/comments?post=2141"}],"version-history":[{"count":1,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/posts\/2141\/revisions"}],"predecessor-version":[{"id":2143,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/posts\/2141\/revisions\/2143"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/media\/2142"}],"wp:attachment":[{"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/media?parent=2141"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/categories?post=2141"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/askem.eu\/en\/wp-json\/wp\/v2\/tags?post=2141"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}