Ollama
The `ollama-configuration` resource configures access to Ollama models for embeddings and completions via the Ollama REST API.

configuration.yaml
```yaml
configuration:
  resources:
    - type: "ollama-configuration"
      name: "ollama"
      configuration:
        url: "${secrets.ollama.url}"
```
secrets.yaml
```yaml
secrets:
  - name: ollama
    id: ollama
    data:
      url: "${OLLAMA_URL:-http://host.docker.internal:11434}"
```
This example uses `host.docker.internal` with the default Ollama port 11434, which works well when Ollama runs on your local machine and LangStream runs in Docker. The `${OLLAMA_URL:-...}` syntax falls back to that default when the `OLLAMA_URL` environment variable is not set.
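Once the resource is configured, agents in a pipeline can use it for embeddings or completions. Below is a minimal pipeline sketch using the `compute-ai-embeddings` agent; the topic names, the `value.text`/`value.embeddings` fields, and the `llama2` model name are illustrative assumptions (use any model you have already pulled in Ollama), not verbatim from this page.

```yaml
topics:
  - name: "input-topic"
    creation-mode: create-if-not-exists
  - name: "output-topic"
    creation-mode: create-if-not-exists
pipeline:
  - name: "Compute embeddings via Ollama"
    type: "compute-ai-embeddings"
    input: "input-topic"
    output: "output-topic"
    configuration:
      model: "llama2"                      # assumed: any model already pulled in Ollama
      embeddings-field: "value.embeddings" # where the resulting vector is written
      text: "{{ value.text }}"             # the record field to embed
```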