Mirror of https://github.com/open-webui/open-webui.git
feat: enable buildtime API_ENDPOINT env var
committed by AJ ONeal
parent f4f1283cd5
commit 86395a8c1f
README.md (18 changed lines)
@@ -7,15 +7,25 @@ ChatGPT-Style Web Interface for Ollama 🦙
## Features ⭐
- 🖥️ **Intuitive Interface**: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.
- 📱 **Responsive Design**: Enjoy a seamless experience on both desktop and mobile devices.
- ⚡ **Swift Responsiveness**: Enjoy fast and responsive performance.
- 🚀 **Effortless Setup**: Install seamlessly using Docker for a hassle-free experience.
- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.
- 📜 **Chat History**: Effortlessly access and manage your conversation history.
- 📤📥 **Import/Export Chat History**: Seamlessly move your chat data in and out of the platform.
- ⚙️ **Fine-Tuned Control with Advanced Parameters**: Gain a deeper level of control by adjusting parameters such as temperature and defining your system prompts to tailor the conversation to your specific preferences and needs.
- 💻 **Code Syntax Highlighting**: Enjoy enhanced code readability with our syntax highlighting feature.
-- 🔗 **External Ollama Server Connection**: Link to the model when Ollama is hosted on a different server via the environment variable `-e OLLAMA_ENDPOINT="http://[insert your Ollama address]"`.
+- 🔗 **External Ollama Server Connection**: Connect to an Ollama server hosted at a different address by setting the endpoint during the Docker build. Run the following command to bake the Ollama API endpoint into the image (a concrete example follows this list): `docker build --build-arg OLLAMA_API_ENDPOINT="http://[Your Ollama URL]/api" -t ollama-webui .`
- 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.
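Because the endpoint is now supplied at build time rather than at run time, it is baked into the image, and changing the Ollama address means rebuilding. A minimal sketch of the new workflow, assuming a reachable Ollama server (the address below is a placeholder):

```bash
# The endpoint is fixed at build time; point the build arg at your Ollama
# server (192.168.1.100:11434 is a placeholder) and rebuild when it changes.
docker build --build-arg OLLAMA_API_ENDPOINT="http://192.168.1.100:11434/api" -t ollama-webui .

# No endpoint flag at run time: the image already knows where Ollama lives.
docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```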
## How to Install 🚀
@@ -40,7 +50,7 @@ OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```bash
docker build -t ollama-webui .
-docker run -d -p 3000:3000 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ollama-webui
+docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```
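Note the new mapping `-p 3000:8080`: the host still listens on port 3000, but the container now serves the UI on port 8080. Only the left-hand side of the mapping needs to change to expose a different host port; a hedged example (8888 is an arbitrary choice, not from the original instructions):

```bash
# Serve the UI on host port 8888 instead of 3000; the container-side
# port 8080 is fixed by the image.
docker run -d -p 8888:8080 --name ollama-webui --restart always ollama-webui
```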
Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
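If the page does not load, an optional sanity check (not part of the original instructions) is to confirm the container is running and answering on the host port:

```bash
# The container should be listed as running, and the UI should answer
# with an HTTP status line on the mapped host port.
docker ps --filter name=ollama-webui
curl -I http://localhost:3000
```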
@@ -50,8 +60,8 @@ Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localh
If Ollama is hosted on a server other than your local machine, you can connect to it by setting the Ollama API endpoint at build time:
```bash
-docker build -t ollama-webui .
-docker run -d -p 3000:3000 --add-host=host.docker.internal:host-gateway -e OLLAMA_ENDPOINT="http://[insert your ollama url]" --name ollama-webui --restart always ollama-webui
+docker build --build-arg OLLAMA_API_ENDPOINT="http://[Your Ollama URL]/api" -t ollama-webui .
+docker run -d -p 3000:8080 --name ollama-webui --restart always ollama-webui
```
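Before baking a remote endpoint into the image, it can help to confirm the Ollama server is actually reachable from the machine you build on. A hedged check, assuming a default Ollama install (192.168.1.100 is a placeholder; 11434 is Ollama's default port):

```bash
# Ask the remote server for its model list; a JSON reply means the
# endpoint is reachable and safe to pass as OLLAMA_API_ENDPOINT.
curl http://192.168.1.100:11434/api/tags
```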
## What's Next? 🚀