ai:ollama-openwebui [2024/08/08 17:31] (current) by Wulf Rajek
====== Ollama Open-Webui ======

Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
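
As a quick check that the host meets these figures, total and available memory can be inspected on a Linux docker host:

```shell
# Show total/available memory in gibibytes (Linux)
free -g
# Raw total straight from the kernel, in kB
grep MemTotal /proc/meminfo
```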
  
Notes only for now:
    environment:
      - WEBUI_NAME=CustomGPTName
      - TZ=Europe/London
      - RAG_EMBEDDING_MODEL_TRUST_REMOTE_CODE=True # allow sentence-transformers to execute remote code, e.g. for alibaba-nlp/gte-large-en-v1.5
</code>
  
            capabilities: [gpu]
</code>
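
For context, the capabilities: [gpu] line above is the innermost key of a compose GPU reservation; a sketch of the surrounding block (following Docker's GPU access documentation; driver and count values may differ per setup):

<code yaml>
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
</code>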

https://zohaib.me/extending-openwebui-using-pipelines/
  
Under Settings -> Connections, set:
<code>
docker exec -ti ollama ollama pull modelname:tag
</code>

To update all previously pulled ollama models, use this bash script:
<code bash update-ollama-models.sh>
#!/bin/bash

docker exec -ti ollama ollama list | tail -n +2 | awk '{print $1}' | while read -r model; do
  echo "Updating model: $model..."
  docker exec -t ollama ollama pull "$model"
  echo "--"
done
echo "All models updated."
</code>
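
An alternative sketch that enumerates models over Ollama's REST API (default port 11434) rather than parsing ollama list output; the grep/cut extraction assumes the default /api/tags JSON shape:

```shell
#!/bin/bash
# List model names from the /api/tags endpoint and pull each one
curl -s http://localhost:11434/api/tags \
  | grep -o '"name":"[^"]*"' \
  | cut -d'"' -f4 \
  | while read -r model; do
      echo "Updating model: $model..."
      docker exec -t ollama ollama pull "$model"
    done
```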
  
</code>
  
Create the respective docker volumes folder:
<code>
# p/Docker_Volumes = P:\Docker_Volumes
mkdir P:\Docker_Volumes
</code>

docker install - choose the WSL2 backend
cmd line:
<code>
docker compose -f docker-openwebui.yml up -d
</code>
  
To update all ollama models on Windows, use this powershell command - adjust for the hostname/IP ollama is running on:
<code powershell>
(Invoke-RestMethod http://localhost:11434/api/tags).Models.Name.ForEach{ ollama pull $_ }

# or if ollama runs in docker
(Invoke-RestMethod http://localhost:11434/api/tags).Models.Name.ForEach{ docker exec -t ollama ollama pull $_ }
</code>
====== Curl OpenAI API test ======
<code>
curl http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama3",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
{"id":"chatcmpl-957","object":"chat.completion","created":1722601457,"model":"llama3","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Hi there! It's great to meet you! I'm here to help with any questions or tasks you might have. What brings you to this virtual space today? Are you looking for recommendations, seeking answers to a specific question, or maybe looking for some inspiration? Let me know, and I'll do my best to assist you."},"finish_reason":"stop"}],"usage":{"prompt_tokens":23,"completion_tokens":68,"total_tokens":91}}
</code>
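
To pull only the assistant text out of that response, the same call can be piped through a small JSON parser; a sketch using python3 (jq works equally well):

```shell
# Ask a question and print just the assistant message content
curl -s http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello!"}]}' \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["choices"][0]["message"]["content"])'
```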
ai/ollama-openwebui.1720701068.txt.gz · Last modified: 2024/07/11 13:31 by Wulf Rajek