I downloaded ollama on a Kaggle notebook (Linux) and want to interact with it from a Python script. Following the instructions in the GitHub repository, I ran:
ollama run llama3
and got the output: Error: could not connect to ollama app, is it running?
It seems I need to run
ollama serve
before running llama3. However, ollama serve blocks the main thread, so nothing else can run after it.
Workarounds I have tried:
ollama serve &
returns OSError: Background processes not supported.
subprocess.run('ollama', 'serve')
running it from Python returns TypeError: bufsize must be an integer
Full log of the second attempt:
TypeError Traceback (most recent call last)
Cell In[29], line 1
----> 1 subprocess.run('ollama', 'serve')
File /opt/conda/lib/python3.10/subprocess.py:503, in run(input, capture_output, timeout, check, *popenargs, **kwargs)
500 kwargs['stdout'] = PIPE
501 kwargs['stderr'] = PIPE
--> 503 with Popen(*popenargs, **kwargs) as process:
504 try:
505 stdout, stderr = process.communicate(input, timeout=timeout)
File /opt/conda/lib/python3.10/subprocess.py:780, in Popen.__init__(self, args, bufsize, executable, stdin, stdout, stderr, preexec_fn, close_fds, shell, cwd, env, universal_newlines, startupinfo, creationflags, restore_signals, start_new_session, pass_fds, user, group, extra_groups, encoding, errors, text, umask, pipesize)
778 bufsize = -1 # Restore default
779 if not isinstance(bufsize, int):
--> 780 raise TypeError("bufsize must be an integer")
782 if pipesize is None:
783 pipesize = -1 # Restore default
TypeError: bufsize must be an integer
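The traceback above happens because subprocess.run forwards its extra positional arguments straight to Popen, whose second parameter is bufsize, so the string 'serve' ends up there. The command and its arguments must be passed as a single list (or as one string with shell=True). A minimal sketch of the correct call shape, using echo as a harmless stand-in since the ollama binary may not be on the PATH:

```python
import subprocess

# Wrong: subprocess.run('ollama', 'serve') -> 'serve' becomes bufsize.
# Right: pass the command and its arguments as one list.
result = subprocess.run(['echo', 'hello'], capture_output=True, text=True)
print(result.stdout.strip())  # → hello
```

The same list form (['ollama', 'serve']) would avoid the TypeError, though the call would still block until the server exits, which is why the answer below uses Popen instead.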
I chose ollama because setup is simple (just one command). I am open to alternatives, but I want something I can drive from Python without much setup, since it has to run on Kaggle.
Here is what I did to get it working:
#Download ollama
!curl -fsSL https://ollama.com/install.sh | sh
import subprocess
# Start the server in a separate process so the notebook stays usable
process = subprocess.Popen("ollama serve", shell=True)
#Download model
!ollama pull llama3
!pip install ollama
import ollama
#Then, every time you want to chat:
response = ollama.chat(model='llama3', messages=[
{
'role': 'user',
'content': 'Why is the sky blue?',
},
])
print(response['message']['content'])
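One caveat with this setup: Popen returns immediately, so the !ollama pull llama3 cell can race the server startup and hit the same "could not connect" error. A minimal readiness check, as a sketch, polls Ollama's default endpoint (http://127.0.0.1:11434, whose root path returns HTTP 200 once the server is up):

```python
import time
import urllib.error
import urllib.request

def wait_for_ollama(url='http://127.0.0.1:11434', timeout=30.0):
    """Poll the Ollama HTTP endpoint until it responds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:  # server is up and answering
                    return True
        except (urllib.error.URLError, ConnectionResetError):
            time.sleep(0.5)  # not listening yet; retry shortly
    return False
```

Calling wait_for_ollama() right after the Popen line, and before ollama pull, removes the race.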