Ollama credentials#
You can use these credentials to authenticate n8n's Ollama nodes, including the Ollama Chat Model and Ollama Model nodes.
Prerequisites#
Create and run an Ollama instance with one user. Refer to the Ollama Quick Start for more information.
Supported authentication methods#
- Instance URL
Related resources#
Refer to Ollama's API documentation for more information about the service.
View n8n's Advanced AI documentation.
Using instance URL#
To configure this credential, you'll need:
- The Base URL of your local Ollama instance, or of a remote, authenticated Ollama instance.
- (Optional) The API Key for Bearer token authentication if connecting to a remote, authenticated proxy.
The default Base URL is `http://localhost:11434`, but if you've set the `OLLAMA_HOST` environment variable, enter that value instead. If you have issues connecting with a local n8n server, try `127.0.0.1` instead of `localhost`.
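If you're not sure which value to enter, you can verify the Base URL before saving the credential. The following is a minimal Python sketch, using only the standard library, that resolves the URL as described above and calls Ollama's `/api/tags` endpoint (which lists installed models); the fallback address and timeout are illustrative choices, not values n8n requires.

```python
import json
import os
import urllib.request

# Resolve the Base URL the same way you'd fill in the credential:
# use OLLAMA_HOST if it's set, otherwise fall back to the local default.
# 127.0.0.1 avoids the localhost resolution issues mentioned above.
base_url = os.environ.get("OLLAMA_HOST", "http://127.0.0.1:11434").rstrip("/")
if not base_url.startswith("http"):
    # OLLAMA_HOST is often set without a scheme, for example 0.0.0.0:11434.
    base_url = f"http://{base_url}"

# /api/tags lists locally installed models; a successful response confirms
# that the Base URL you plan to enter in the credential is reachable.
with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as response:
    models = json.load(response).get("models", [])

print(f"Ollama reachable at {base_url} with {len(models)} model(s) installed")
```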
If you're connecting to Ollama through an authenticated proxy service (such as Open WebUI), you must include an API key. If you don't need authentication, leave this field empty. When provided, the API key is sent as a Bearer token in the `Authorization` header of requests to the Ollama API.
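To confirm that a key works before entering it in the credential, you can reproduce that request yourself. The sketch below is a hedged example: the proxy URL and API key are placeholders, and the exact endpoint path depends on how your proxy exposes the Ollama API, so it may differ from a direct `/api/tags` call.

```python
import json
import urllib.request

# Placeholder values: substitute the URL of your authenticated proxy
# (for example, an Open WebUI deployment) and the API key it issued.
base_url = "https://ollama-proxy.example.com"
api_key = "YOUR_API_KEY"

# The credential sends the key as a Bearer token in the Authorization
# header; this request mirrors that behaviour.
request = urllib.request.Request(
    f"{base_url}/api/tags",
    headers={"Authorization": f"Bearer {api_key}"},
)

with urllib.request.urlopen(request, timeout=5) as response:
    print(json.load(response))
```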
Refer to How do I configure Ollama server? for more information.
Ollama and self-hosted n8n#
If you're self-hosting n8n on the same machine as Ollama, you may run into issues if they're running in different containers.
For this setup, open a specific port for n8n to communicate with Ollama by setting the `OLLAMA_ORIGINS` variable or adjusting `OLLAMA_HOST` to an address the other container can access.
Refer to Ollama's How can I allow additional web origins to access Ollama? for more information.
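One way to find an address that works is to run a short reachability check from inside the n8n container (for example, with `docker exec`). The sketch below is illustrative only: the candidate hostnames are assumptions about common Docker setups (an `ollama` Docker Compose service name, `host.docker.internal` on Docker Desktop, and the default Linux bridge gateway), and Ollama must also be listening on an interface those addresses can reach via `OLLAMA_HOST`.

```python
import urllib.request

# Candidate addresses to try from inside the n8n container. These are
# assumptions about common setups, not defaults required by n8n:
# "ollama" would be a Docker Compose service name, host.docker.internal
# resolves to the host machine on Docker Desktop, and 172.17.0.1 is the
# default Docker bridge gateway on Linux. Adjust the list to match your
# own network.
candidates = [
    "http://ollama:11434",
    "http://host.docker.internal:11434",
    "http://172.17.0.1:11434",
]

for base_url in candidates:
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3):
            print(f"Reachable: {base_url} (use this as the Base URL)")
            break
    except OSError as error:  # covers URLError, timeouts, DNS failures
        print(f"Not reachable: {base_url} ({error})")
```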