Reply To: Set up a Local AI like ChatGPT on your own machine!


#8418
thumbtak
Moderator

Here is a bash script to auto-start the server from the terminal:
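Before the first run, the script assumes the external drive already holds the model/data directories and a conda env with Open WebUI installed. A hedged one-time setup sketch (the mount point and env-creation commands are examples, not part of this post):

```shell
# One-time prerequisites (illustrative; substitute your real mount point).
DRIVE="${DRIVE:-$(mktemp -d)}"   # e.g. /mnt/657e2d3e-...; temp dir used here as a stand-in
mkdir -p "$DRIVE/ollama_models" "$DRIVE/openwebui_data"
echo "Layout ready under $DRIVE"
# Then create the env the script activates (commands assumed, run once):
# conda create -y --prefix "$DRIVE/openwebui_env" python=3.11
# conda activate "$DRIVE/openwebui_env"
# pip install open-webui
```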

#!/bin/bash

# =================================================================
# LLM STARTUP SCRIPT (Ollama + Open WebUI)
# =================================================================
# This script manages the startup of your LLM server from an
# external drive. It handles path configuration, process management,
# and automatic updates.
# =================================================================

CONFIG_FILE="$HOME/.llm_server_config"

# --- 1. SET OR RETRIEVE INSTALL_LOCATION ---
if [ ! -f "$CONFIG_FILE" ] || [ -z "$(grep 'INSTALL_LOCATION=' "$CONFIG_FILE" | cut -d'=' -f2)" ]; then
    echo "-------------------------------------------------------"
    echo "FIRST TIME SETUP"
    echo "-------------------------------------------------------"
    echo "Please enter the full path to your external drive install location."
    echo "Example: /mnt/657e2d3e-e162-46df-b259-bbbbc37807df"
    read -p "Location: " USER_INPUT

    # Strip trailing slashes and surrounding whitespace
    USER_INPUT=$(echo "$USER_INPUT" | sed 's:/*$::' | xargs)

    if [ -z "$USER_INPUT" ]; then
        echo "Error: No path entered. Exiting."
        exit 1
    fi

    echo "INSTALL_LOCATION=$USER_INPUT" > "$CONFIG_FILE"
    INSTALL_LOCATION="$USER_INPUT"
    echo "Successfully saved location to $CONFIG_FILE"
else
    INSTALL_LOCATION=$(grep 'INSTALL_LOCATION=' "$CONFIG_FILE" | cut -d'=' -f2)
fi

if [ -z "$INSTALL_LOCATION" ]; then
    echo "Error: INSTALL_LOCATION could not be determined. Delete $CONFIG_FILE and try again."
    exit 1
fi

echo "Using Install Location: $INSTALL_LOCATION"

# --- 2. DEFINE PATHS ---
OLLAMA_PATH="$INSTALL_LOCATION/ollama_models"
DATA_PATH="$INSTALL_LOCATION/openwebui_data"
ENV_PATH="$INSTALL_LOCATION/openwebui_env"

# --- 3. EXPORT ENVIRONMENT VARIABLES ---
export OLLAMA_MODELS="$OLLAMA_PATH"
export DATA_DIR="$DATA_PATH"

# --- 4. START OLLAMA ---
echo "Checking if Ollama is already running..."
if pgrep -x "ollama" > /dev/null; then
    echo "Ollama is already active."
else
    echo "Starting Ollama service (pointing to $OLLAMA_PATH)..."
    ollama serve > /dev/null 2>&1 &
    sleep 2
fi

# --- 5. SETUP CONDA FOR SCRIPT ---
CONDA_BASE=$(conda info --base 2>/dev/null || echo "$HOME/miniconda3")
CONDA_SH="$CONDA_BASE/etc/profile.d/conda.sh"

if [ -f "$CONDA_SH" ]; then
    source "$CONDA_SH"
    conda activate "$ENV_PATH"
else
    echo "Warning: Could not find conda.sh. Attempting PATH fallback..."
    export PATH="$ENV_PATH/bin:$PATH"
fi

# --- 6. AUTO-UPDATE CHECK ---
# This ensures you are always on the latest version of Open WebUI
echo "-------------------------------------------------------"
echo "Checking for Open WebUI updates..."
pip install --upgrade open-webui
echo "-------------------------------------------------------"

# Final check: Does the open-webui command exist?
if ! command -v open-webui &> /dev/null; then
    echo "Error: 'open-webui' command not found."
    echo "Verify it is installed in: $ENV_PATH"
    exit 1
fi

echo "Launching Open WebUI server..."
echo "Access your server at: http://localhost:8080"
echo "-------------------------------------------------------"
echo "Press Ctrl+C to stop the server."
echo "-------------------------------------------------------"

# Launch Open WebUI
open-webui serve
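For reference, the config file written in step 1 is a single KEY=VALUE line, read back with the same grep/cut pipeline the script uses. A minimal round-trip sketch with an illustrative path and a temp file standing in for ~/.llm_server_config:

```shell
# Write and re-read a one-line config, mirroring step 1's logic.
CONFIG_FILE=$(mktemp)                      # stand-in for ~/.llm_server_config
echo "INSTALL_LOCATION=/mnt/drive" > "$CONFIG_FILE"
INSTALL_LOCATION=$(grep 'INSTALL_LOCATION=' "$CONFIG_FILE" | cut -d'=' -f2)
echo "$INSTALL_LOCATION"                   # prints /mnt/drive
rm -f "$CONFIG_FILE"
```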

If the saved location doesn’t work, the stored path may have a trailing /. To reset it and enter a new location, run $ rm ~/.llm_server_config before running the script again.
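Newly typed paths are already cleaned up by the script: sed strips trailing slashes and xargs trims surrounding whitespace. The same pipeline can be tried on its own (paths are illustrative):

```shell
# Same sanitization pipeline as the read prompt in the script.
trim_path() { echo "$1" | sed 's:/*$::' | xargs; }
trim_path "/mnt/drive///"       # -> /mnt/drive
trim_path "  /mnt/drive  "      # -> /mnt/drive
```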
