
I am following the steps shown here, however I am unable to get the output:

>>> starting ollama serve
Couldn't find '/root/.ollama/id_ed25519'. Generating new private key.
Your new public key is:

ssh-ed25519 {some key}

2024/01/16 20:19:11 images.go:808: total blobs: 0
2024/01/16 20:19:11 images.go:815: total unused blobs removed: 0
2024/01/16 20:19:11 routes.go:930: Listening on 127.0.0.1:11434 (version 0.1.20)

But I am only getting the following output:

>>> starting ollama serve 
time=2024-05-03T07:49:48.426Z level=INFO source=images.go:828 msg="total blobs: 0"
time=2024-05-03T07:49:48.426Z level=INFO source=images.go:835 msg="total unused blobs removed: 0"
time=2024-05-03T07:49:48.426Z level=INFO source=routes.go:1071 msg="Listening on 127.0.0.1:11434 (version 0.1.33)"
time=2024-05-03T07:49:48.427Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama3694758245/runners

Also when running ollama in the terminal I am getting this error: Error: something went wrong, please see the ollama server logs for details

Any idea what might be wrong?

PS: I tried both examples shown here.

1 Answer

Since Google Colab cannot run multiple cells simultaneously, we need to run `ollama serve` as a background process within the Colab environment.

This can be achieved with a subprocess and asynchronous execution.

import asyncio
import subprocess

async def run_ollama():
    # Start the server as a background child process
    process = subprocess.Popen(
        "ollama serve",
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    # Keep the coroutine alive so the server keeps running
    while True:
        await asyncio.sleep(1)

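In a notebook the event loop is already running, so you would schedule the coroutine with `asyncio.create_task(run_ollama())` rather than `asyncio.run()`. Here is a minimal, self-contained sketch of the same launch-in-background pattern; `run_in_background` is a hypothetical helper, and `echo ready` stands in for `ollama serve` so the snippet runs anywhere:

```python
import asyncio
import subprocess

async def run_in_background(cmd: str) -> subprocess.Popen:
    """Start `cmd` as a child process and return without waiting for it."""
    process = subprocess.Popen(
        cmd,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    # Yield control once so the calling cell can finish while the child runs
    await asyncio.sleep(0)
    return process

# In a plain script we can use asyncio.run(); in Colab/Jupyter, where a loop
# is already running, use `proc = await run_in_background("ollama serve")`.
proc = asyncio.run(run_in_background("echo ready"))
out, _ = proc.communicate()
print(out.decode().strip())  # ready
```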
See this Colab file and the additional resources.
