From f0425d3de96b04069273ab21c183a6c1142c40bd Mon Sep 17 00:00:00 2001
From: Jeffrey Morgan
Date: Tue, 20 Feb 2024 20:44:45 -0500
Subject: [PATCH] Update faq.md

---
 docs/faq.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/faq.md b/docs/faq.md
index b1883ce2..805f3fa4 100644
--- a/docs/faq.md
+++ b/docs/faq.md
@@ -113,9 +113,9 @@ If a different directory needs to be used, set the environment variable `OLLAMA_

 Refer to the section [above](#how-do-i-configure-ollama-server) for how to set environment variables on your platform.

-## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?
+## Does Ollama send my prompts and answers back to ollama.com?

-No, Ollama runs entirely locally, and conversation data will never leave your machine.
+No. Ollama runs locally, and conversation data does not leave your machine.

 ## How can I use Ollama in Visual Studio Code?
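The hunk context above mentions setting an environment variable to relocate the model directory, though the context line truncates the variable name to `OLLAMA_`. A minimal sketch of launching the local server with a custom model path, assuming the variable is `OLLAMA_MODELS` and using a hypothetical directory:

```python
import os
import subprocess

# Copy the current environment and point Ollama at a custom model directory.
# OLLAMA_MODELS is assumed here (the hunk context only shows `OLLAMA_`),
# and /data/ollama/models is a hypothetical path.
env = os.environ.copy()
env["OLLAMA_MODELS"] = "/data/ollama/models"

# Start the local server with the modified environment.
subprocess.run(["ollama", "serve"], env=env)
```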