====== Ollama Open-Webui ======

Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
  
Notes only for now:
    environment:
      - WEBUI_NAME=CustomGPTName
      - TZ=Europe/London
      - RAG_EMBEDDING_MODEL_TRUST_REMOTE_CODE=True # allow sentence-transformers to execute remote code, e.g. for alibaba-nlp/gte-large-en-v1.5
</code>
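The environment lines above are only a fragment of docker-openwebui.yml. A minimal sketch of the service they might belong to, assuming the stock ghcr.io/open-webui/open-webui image and its default port mapping (everything except the environment entries is an assumption, not taken from this page):
<code>
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main   # assumed image/tag
    ports:
      - "3000:8080"                             # assumed host port
    volumes:
      - open-webui:/app/backend/data            # assumed data volume
    environment:
      - WEBUI_NAME=CustomGPTName
      - TZ=Europe/London
      - RAG_EMBEDDING_MODEL_TRUST_REMOTE_CODE=True
    restart: always

volumes:
  open-webui:
</code>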
  
            capabilities: [gpu]
</code>
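That capabilities line is the tail of the compose stanza that passes an NVIDIA GPU through to the container. For context, the usual deploy block it belongs to looks like this (a sketch, assumed rather than copied from this page):
<code>
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or an explicit GPU count
              capabilities: [gpu]
</code>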
https://zohaib.me/extending-openwebui-using-pipelines/
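The link above covers extending Open WebUI with Pipelines. The pipelines server runs as its own container next to Open WebUI; the command below follows the open-webui/pipelines README (treat it as a sketch and verify against the README):
<code>
docker run -d -p 9099:9099 \
  --add-host=host.docker.internal:host-gateway \
  -v pipelines:/app/pipelines \
  --name pipelines --restart always \
  ghcr.io/open-webui/pipelines:main
</code>
Once running, it is added under settings->connections as an OpenAI API connection pointing at http://localhost:9099, using the default API key from the README.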
  
Under settings->connections set:
</code>
  
Create the respective Docker volumes folder:
<code>
# p/Docker_Volumes = P:\Docker_Volumes
mkdir P:\Docker_Volumes
</code>
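The comment above suggests the Windows folder P:\Docker_Volumes is referenced as /p/Docker_Volumes from the compose files. A hypothetical bind mount using it (paths assumed for illustration only, not from this page):
<code>
    volumes:
      - /p/Docker_Volumes/ollama:/root/.ollama   # assumed path; /root/.ollama is where the ollama image keeps models
</code>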

Install Docker Desktop with the WSL2 backend, then from the command line:
<code>
docker compose -f docker-openwebui.yml up -d
</code>
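To check that the stack came up, something like the following should work (the container name is assumed, adjust to the compose files):
<code>
docker compose -f docker-openwebui.yml ps
docker logs -f open-webui   # assumed container name
</code>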
  
  
To update all Ollama models on Windows, use this PowerShell command - adjust for the hostname/IP Ollama is running on:
<code>
(Invoke-RestMethod http://localhost:11434/api/tags).Models.Name.ForEach{ docker exec -t ollama ollama pull $_ }
</code>
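A rough Linux/macOS equivalent of the same loop against Ollama's /api/tags endpoint, assuming jq is installed and the container is named ollama:
<code>
curl -s http://localhost:11434/api/tags \
  | jq -r '.models[].name' \
  | xargs -n1 docker exec -t ollama ollama pull
</code>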
====== Curl OpenAI API test ======

<code>
curl http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama3",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
{"id":"chatcmpl-957","object":"chat.completion","created":1722601457,"model":"llama3","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Hi there! It's great to meet you! I'm here to help with any questions or tasks you might have. What brings you to this virtual space today? Are you looking for recommendations, seeking answers to a specific question, or maybe looking for some inspiration? Let me know, and I'll do my best to assist you."},"finish_reason":"stop"}],"usage":{"prompt_tokens":23,"completion_tokens":68,"total_tokens":91}}
</code>
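The same OpenAI-compatible endpoint also exposes the model list, which is a quick way to check which names can go in the "model" field:
<code>
curl http://localhost:11434/v1/models
</code>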