Reputation: 21
I created a server that runs Ollama using ngrok and Google Colab:
!pip install aiohttp pyngrok

import os
import asyncio
from aiohttp import ClientSession

os.environ.update({'LD_LIBRARY_PATH': '/usr/lib64-nvidia'})

async def run(cmd):
    '''
    run is a helper function to run subcommands asynchronously.
    '''
    print('>>> starting', *cmd)
    p = await asyncio.subprocess.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )

    async def pipe(lines):
        async for line in lines:
            print(line.strip().decode('utf-8'))

    await asyncio.gather(
        pipe(p.stdout),
        pipe(p.stderr),
    )

await asyncio.gather(
    run(['ollama', 'serve']),
    run(['ngrok', 'http', '--log', 'stderr', '11434', '--authtoken', '2h8g2xEIRTbSaXraFIRdsdpbACT_6HnpzGYxvEUFcbYyWPhYo', '--host-header="localhost:11434"']),
)
I tried to connect to it using a Windows client. The connection is fine, but the problem is with the response: the server always returns a 403 error:
t=2024-05-29T23:40:22+0000 lvl=info msg="join connections" obj=join id=d124d435c3b1 l=127.0.0.1:11434 r=ip@ [GIN] 2024/05/29 - 23:40:22 | 403 | 295.834µs | ip@ | HEAD "/"
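The log line above corresponds to a plain HEAD request against the tunnel root; a minimal sketch of what the Windows side sends (using the requests package, with a placeholder instead of my real ngrok URL) looks like this:

import requests

# Placeholder for the forwarding URL that ngrok prints (not my real tunnel address)
NGROK_URL = 'https://<your-ngrok-subdomain>.ngrok-free.app'

resp = requests.head(NGROK_URL + '/')
print(resp.status_code)  # 403, same as in the server log above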
I looked for some solutions, like changing OLLAMA_ORIGIN="", but it didn't work. Does Ollama serve require some sort of authentication?
Upvotes: 2
Views: 1223
Reputation: 1
Under the [Service] section of the .service file, add:
Environment="OLLAMA_ORIGINS=*"
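So that the section ends up looking roughly like this (only the Environment line is new; whatever directives your install already has stay as they are):

[Service]
# ...existing directives from your install (ExecStart, User, ...) stay unchanged
Environment="OLLAMA_ORIGINS=*"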
You mentioned OLLAMA_ORIGIN, but it is OLLAMA_ORIGINS, and you set it to an empty string, but you need to set it to "*".
Just add the code given below to set OLLAMA_ORIGINS:
import os

# File path
file_path = '/etc/systemd/system/ollama.service'

# Read the existing file content
with open(file_path, 'r') as file:
    lines = file.readlines()

# Flag to check if [Service] section has been processed
in_service_section = False

# New content to be added
new_lines = []
environment_added = False

for line in lines:
    if line.strip() == '[Service]':
        in_service_section = True
        new_lines.append(line)
        continue
    if in_service_section and line.strip() == '':
        if not environment_added:
            new_lines.append('Environment="OLLAMA_ORIGINS=*"\n')
            environment_added = True
    new_lines.append(line)

# If [Service] section is not found or Environment variable is not added, add it
if not environment_added:
    new_lines.append('\n[Service]\n')
    new_lines.append('Environment="OLLAMA_ORIGINS=*"\n')

# Write the changes back to the file
with open(file_path, 'w') as file:
    file.writelines(new_lines)

print('Environment variable added successfully.')
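On a regular Linux machine where Ollama runs as a systemd service, you would also reload systemd and restart the service afterwards so the new environment variable is picked up:

sudo systemctl daemon-reload
sudo systemctl restart ollama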
Add the code just above the cell you shared and execute it; it will set the OLLAMA_ORIGINS value. Don't forget to set OLLAMA_HOST in the Windows CLI (cmd):
set OLLAMA_HOST=<Link generated by ngrok>
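After that, a quick check from the same cmd window is to run the Ollama CLI, since it talks to whatever OLLAMA_HOST points at:

ollama list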
Upvotes: 0