How To Set Up an Ollama LLM AI Chatbot With Open WebUI, an Open-source AI User Interface, on an Ubuntu VPS
Introduction
In this article, we'll guide you through the process of setting up the Ollama LLM AI chatbot using Open WebUI, an open-source AI user interface, on an Ubuntu VPS. We will use Digital Ocean as our server host provider, utilizing their one-click droplet installation marketplace to simplify the installation of Ollama and Open WebUI. The entire setup process will be nearly plug-and-play once we create our droplet.
Once we’re up and running, we will explore how to add users to your self-hosted Open WebUI instance, allowing friends and family to access it. Additionally, we will cover how to incorporate community models from Open WebUI's website and how to integrate Ollama models into your instance.
Setting Up Your VPS
Create a Digital Ocean Account: Navigate to this Digital Ocean link, which offers $200 in free credits for 60 days. You can create an account using your email, GitHub, or Google account.
Create a Droplet:
- After signing in, go to your Digital Ocean project dashboard.
- Click on the green "Create" button and then choose “Droplets”.
- Select your preferred region. For this demonstration, we will choose "Singapore".
- Scroll to "Choose an image", and switch to the "Marketplace" tab. Search for and select "Ollama with Open WebUI".
Select Droplet Size:
- Choose the "Shared CPU" droplet type and select a plan. We recommend the premium AMD plan with:
  - 4 GB RAM
  - 2 AMD CPUs
  - 80 GB NVMe SSD
  - 4 TB bandwidth
Configure Authentication:
- Set your authentication method to "Password" and create a secure root password.
Name Your Droplet:
- Give your droplet a unique name like "Ollama_Open_WebUI" and click on "Create Droplet".
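If you prefer the command line, the same droplet can be created with Digital Ocean's doctl CLI. This is a sketch, not the article's method: the image and size slugs below are assumptions, so verify them against doctl's own listings before running.

```shell
# Authenticate once with a Digital Ocean API token
doctl auth init

# Find the marketplace image slug for "Ollama with Open WebUI"
# (the exact slug may differ; grep the public image list)
doctl compute image list --public | grep -i ollama

# Create the droplet; "s-2vcpu-4gb-amd" is assumed to be the
# premium AMD 2 vCPU / 4 GB size slug -- confirm with
# `doctl compute size list`
doctl compute droplet create Ollama_Open_WebUI \
  --region sgp1 \
  --image <ollama-image-slug> \
  --size s-2vcpu-4gb-amd \
  --ssh-keys <your-ssh-key-fingerprint> \
  --wait
```

The `--wait` flag blocks until the droplet is active, at which point doctl prints its public IP address.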
Once the droplet is set up, you will be able to access your Open WebUI dashboard.
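Before opening the dashboard, you can optionally SSH into the droplet and confirm that Ollama itself is up. Ollama listens on port 11434 by default; how the marketplace image supervises the services is not documented here, so treat this as a quick sanity check rather than a full health check.

```shell
ssh root@<your_droplet_ip>

# Ollama's API listens on localhost:11434 by default; the root
# endpoint replies with a short status string when healthy
curl http://localhost:11434

# List the models already pulled onto the droplet
ollama list
```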
Accessing Open WebUI
Access the Dashboard:
- Copy the IP address of your droplet.
- Open a new browser tab and enter http://<your_droplet_ip> (ensure it's HTTP, not HTTPS).
- You'll be greeted by the Open WebUI sign-in page.
Sign Up:
- Click on the "Sign Up" option and enter your details. Note that the first account created automatically becomes the admin account.
Interacting with the Pre-installed Model
After signing up, you can interact with the pre-installed model (TinyLlama). Choose the model from the dropdown menu and send it a prompt. For instance, typing "Give me 10 pasta dish recipes" will yield the model's response.
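The same prompt can also be sent straight to Ollama's REST API from the droplet's shell, which is a handy way to check that the model responds independently of the web UI. This sketch assumes the pre-installed model is tagged "tinyllama"; adjust the name to whatever `ollama list` reports.

```shell
# /api/generate is Ollama's completion endpoint;
# "stream": false returns one JSON object instead of a token stream
curl http://localhost:11434/api/generate -d '{
  "model": "tinyllama",
  "prompt": "Give me 10 pasta dish recipes",
  "stream": false
}'
```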
Adding Users
To enable others to use your self-hosted Open WebUI instance:
- Go to your profile and click on the "Admin Panel".
- Click on "Add User", choose a role, and fill in the user details (name, email, password).
- Alternatively, users can sign up themselves, and you can manage their roles from the admin panel.
Adding Community Models
You can enhance your experience by integrating community models:
- Click on "Workspace" at the top left corner and look for the option to discover community models.
- Search through the available models and select one to add.
Adding Ollama Base Models
- Go back to the admin panel and click on "Settings".
- Under "Models", you'll find an option to pull a model from Ollama.com. Follow the provided link to browse available models.
- Select a lightweight model (for instance, "qwen2:0.5b" or a similarly small tag), copy the model tag, and paste it into the input field before pulling the model.
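If you'd rather skip the admin panel, the same pull can be done over SSH with the Ollama CLI. The "qwen2:0.5b" tag below is just one example of a lightweight model; substitute any tag from ollama.com.

```shell
# Pull a small model by its tag from ollama.com
ollama pull qwen2:0.5b

# Confirm it appears alongside the pre-installed model;
# newly pulled models also show up in Open WebUI's dropdown
ollama list

# Optional: a quick smoke test from the terminal
ollama run qwen2:0.5b "Say hello in one sentence"
```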
Final Interactions
You can now interact with all added models. Set up new chats and engage with your models. Use the three-dot menu to delete chats when necessary.
By following these structured steps, you can successfully set up your Ollama LLM AI chatbot and enjoy the flexibility of a self-hosted AI environment.
Keywords
- Ollama
- LLM
- AI Chatbot
- Open WebUI
- Ubuntu VPS
- Digital Ocean
- Self-hosted
- Community Models
- User Management
- Admin Panel
FAQ
Q1: What is Ollama LLM?
A1: Ollama is an open-source tool for running large language models (LLMs) locally; with it you can serve and chat with models such as TinyLlama on your own server.
Q2: Why should I use Digital Ocean for hosting?
A2: Digital Ocean provides user-friendly interfaces, one-click installations, and an extensive support community, making it ideal for hosting applications like Ollama.
Q3: Can multiple users access the self-hosted instance?
A3: Yes, you can create accounts for multiple users and manage their access through the admin panel.
Q4: What types of models can I add to Open WebUI?
A4: You can add both community models and Ollama base models to enrich your chatbot’s capabilities.
Q5: How do I troubleshoot model errors?
A5: If you encounter an error like "Ollama 500 internal server error", it usually indicates that the model you’re trying to use is too resource-intensive, and you may need to choose a lightweight alternative.
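A quick way to confirm a memory bottleneck is to check RAM and loaded models while reproducing the error. These commands assume you are SSH'd into the droplet; the journalctl line further assumes the marketplace image runs Ollama under systemd, which may not be the case on your instance.

```shell
# Check total and available memory; a 4 GB droplet cannot
# hold most 7b+ models in RAM
free -h

# Show which models Ollama currently has loaded and their size
ollama ps

# If Ollama runs as a systemd service (an assumption for this
# image), recent server errors appear in its journal
journalctl -u ollama --since "10 minutes ago"
```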