From 94e1d96b2997bb2af4e729cd7217853234be31db Mon Sep 17 00:00:00 2001
From: James Braza
Date: Mon, 18 Sep 2023 12:22:50 -0700
Subject: [PATCH] Updated README section on community projects for table (#550)

---
 README.md | 20 +++++++++++++-------
 1 file changed, 13 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index dce37413..00610d83 100644
--- a/README.md
+++ b/README.md
@@ -206,10 +206,16 @@ curl -X POST http://localhost:11434/api/generate -d '{
 
 ## Community Projects using Ollama
 
-- [LangChain](https://python.langchain.com/docs/integrations/llms/ollama) and [LangChain.js](https://js.langchain.com/docs/modules/model_io/models/llms/integrations/ollama) with a question-answering [example](https://js.langchain.com/docs/use_cases/question_answering/local_retrieval_qa).
-- [Continue](https://github.com/continuedev/continue) - embeds Ollama inside Visual Studio Code. The extension lets you highlight code to add to the prompt, ask questions in the sidebar, and generate code inline.
-- [LiteLLM](https://github.com/BerriAI/litellm) a lightweight python package to simplify LLM API calls
-- [Discord AI Bot](https://github.com/mekb-turtle/discord-ai-bot) - interact with Ollama as a chatbot on Discord.
-- [Raycast Ollama](https://github.com/MassimilianoPasquini97/raycast_ollama) - Raycast extension to use Ollama for local llama inference on Raycast.
-- [Simple HTML UI for Ollama](https://github.com/rtcfirefly/ollama-ui)
-- [Emacs client](https://github.com/zweifisch/ollama) for Ollama
+| Project                                                                    | Description                                                                                                                                                  |
+| -------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ |
+| [LangChain][1] and [LangChain.js][2]                                       | Also, there is a question-answering [example][3].                                                                                                            |
+| [Continue](https://github.com/continuedev/continue)                        | Embeds Ollama inside Visual Studio Code. The extension lets you highlight code to add to the prompt, ask questions in the sidebar, and generate code inline. |
+| [LiteLLM](https://github.com/BerriAI/litellm)                              | Lightweight Python package to simplify LLM API calls.                                                                                                        |
+| [Discord AI Bot](https://github.com/mekb-turtle/discord-ai-bot)            | Interact with Ollama as a chatbot on Discord.                                                                                                                |
+| [Raycast Ollama](https://github.com/MassimilianoPasquini97/raycast_ollama) | Raycast extension to use Ollama for local llama inference on Raycast.                                                                                        |
+| [Simple HTML UI](https://github.com/rtcfirefly/ollama-ui)                  | Also, there is a Chrome extension.                                                                                                                           |
+| [Emacs client](https://github.com/zweifisch/ollama)                        |                                                                                                                                                              |
+
+[1]: https://python.langchain.com/docs/integrations/llms/ollama
+[2]: https://js.langchain.com/docs/modules/model_io/models/llms/integrations/ollama
+[3]: https://js.langchain.com/docs/use_cases/question_answering/local_retrieval_qa