Desktop

The Ollama desktop experience. This is an experimental, easy-to-use app for running models with ollama.

Download

  • macOS (Apple Silicon)
  • macOS (Intel) (Coming soon)
  • Windows (Coming soon)
  • Linux (Coming soon)

Running

In the background, run the ollama server with ollama.py serve:

python ../ollama.py serve --port 7734

Then run the desktop app with npm start:

npm install
npm start
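
Putting the two steps together, a minimal end-to-end sketch run from the desktop app directory (this assumes ollama.py lives one level up, as in the command above, and uses & to background the server, which is one way to keep it running while the app starts):

# start the ollama server in the background on port 7734
python ../ollama.py serve --port 7734 &

# install dependencies and launch the desktop app
npm install
npm start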

Coming soon

  • Browse the latest available models on Hugging Face and other sources
  • Keep track of previous conversations with models
  • Switch between models
  • Connect to remote Ollama servers to run models