As a developer, you may encounter errors when running your application on Windows Subsystem for Linux (WSL). One such error is httpx.ConnectError: [WinError 31, 2024]. In this blog post, we will explore the cause of this error and provide a step-by-step guide to fix it.
The Cause of the Error
The error occurs because the ollama systemd service is not configured to listen on all interfaces. By default, ollama listens only on IPv6 interfaces, which can cause connectivity issues for clients connecting over IPv4.
Step-by-Step Guide to Fix the Error
To fix this error, you need to create a custom configuration file for the ollama service. This file will override
the default settings and allow ollama to listen on all interfaces.
Here are the steps:
- Create a new directory: Run the command sudo mkdir /etc/systemd/system/ollama.service.d to create a new directory called ollama.service.d.
- Create a new configuration file: Run the command sudo vim /etc/systemd/system/ollama.service.d/http-host.conf to create a new configuration file called http-host.conf. This file will contain the custom settings for ollama.
- Add the custom settings: In the http-host.conf file, add the following lines:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
This setting tells ollama to listen on all interfaces (IPv4 and IPv6).
- Reload the systemd daemon: Run the command sudo systemctl daemon-reload so systemd picks up the new drop-in file.
- Restart the ollama service: Run the command sudo systemctl restart ollama to restart ollama with the new setting.
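The steps above can be condensed into a single script. This is a sketch, not the definitive procedure: the TARGET variable is an illustrative convenience that defaults to a scratch directory so the file-writing part can be dry-run without root; for the real change, set TARGET=/etc/systemd/system and run the daemon-reload and restart commands as root.

```shell
# Write the systemd drop-in file for ollama (dry-run-friendly sketch).
# TARGET is an assumption for illustration; use /etc/systemd/system for real.
TARGET="${TARGET:-./ollama-dryrun}"
mkdir -p "$TARGET/ollama.service.d"
printf '[Service]\nEnvironment="OLLAMA_HOST=0.0.0.0"\n' \
  > "$TARGET/ollama.service.d/http-host.conf"
# Show what was written
cat "$TARGET/ollama.service.d/http-host.conf"
# Against the real directory, follow up with:
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
```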
Verification
To verify that ollama is now running with the new setting, run the command sudo systemctl status ollama and confirm the service is active. You can also check the listening socket directly with a command such as ss -tlnp | grep 11434 (or netstat -tlnp | grep 11434), which should show ollama bound to all interfaces, e.g. a line like tcp6 0 0 :::11434.
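Beyond reading logs, you can also confirm programmatically that the port is reachable before sending requests. Below is a minimal sketch using only the Python standard library; the helper name port_open and the default port 11434 are illustrative assumptions, not part of any ollama API:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe ollama's default port before constructing a client.
if port_open("localhost", 11434):
    print("ollama port reachable")
else:
    print("ollama port closed - check OLLAMA_HOST and the service status")
```

If the probe fails while systemctl reports the service as active, the bind address is the first thing to re-check.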
Additional Tips
If you are running into this error and using WSL, take a look at Issue #1431. It turns out that despite tcp6 0 0 :::11434 being reported when binding to 0.0.0.0, it was still actually bound to eth0 on IPv4 as well (a 17x.x.x.x address for the WSL VM), at least in my case.
You can also set the environment variable OLLAMA_HOST=0.0.0.0:11434 in the override .conf file, including the port.
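Putting that together, the drop-in file with an explicit port would look like this (11434 is ollama's default port):

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"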
Custom Client Solution
If you are experiencing issues connecting to ollama through httpx, you can create a custom client that points at an explicit host instead of relying on defaults. Here is an example:
from ollama import Client

# Create a custom client pointed at an explicit host
client = Client(host='http://localhost:11434')

# Use the client to send a chat request
response = client.chat(model='llama3', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
This custom client can be used to connect to ollama and send chat requests, avoiding the httpx.ConnectError: [WinError 31, 2024] error.