ollama¶
Description¶
Get up and running with Llama 3.1, Mistral, Gemma 2, and other large language models.
Environment Modules¶
Run module spider ollama to find out what environment modules are available for this application.
Environment Variables¶
- HPC_OLLAMA_DIR - installation directory
Additional Usage Information¶
HiPerGator users should be aware that the State of Florida prohibits the use of DeepSeek models, like R1. Please consult the list of prohibited applications. This prohibition extends to HiPerGator and all state-owned devices.
Interactive OLLAMA use¶
Users need to start an interactive HiPerGator Desktop session on a GPU node via Open OnDemand and launch two terminals: one to start the ollama server and the other to chat with LLMs. Alternatively, screen or tmux may be used to provide the two terminals within a single session.
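If using tmux, a minimal sketch for getting two terminals inside the desktop session, assuming tmux is available on the node (the key bindings below are standard tmux defaults):
$ tmux new-session -s ollama
# press Ctrl-b then " to split the window into two panes
# press Ctrl-b then o to move between the panes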
In terminal 1, load the ollama module and start the server with either default or custom environment settings:
- Default settings:
$ ml ollama
$ ollama serve
- Custom settings (pass environment variables to the server):
$ ml ollama
$ env {options} ollama serve
For example, set a custom path for the LLM models:
$ env OLLAMA_MODELS=/blue/group/$USER/ollama/models ollama serve
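Other server options can be passed the same way. As one possible sketch, the standard ollama variable OLLAMA_HOST changes the address and port the server listens on; if it is changed, set the same variable in terminal 2 so the ollama client can reach the server:
$ env OLLAMA_HOST=127.0.0.1:11500 ollama serve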
In terminal 2, pull a model and start chatting. For example, llama3.2:
$ ml ollama
$ ollama pull llama3.2
$ ollama run llama3.2
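The running server can also be queried directly over its REST API. A minimal sketch using curl, assuming the server from terminal 1 is listening on the default address 127.0.0.1:11434:
$ curl http://127.0.0.1:11434/api/generate -d '{"model": "llama3.2", "prompt": "why is the sky blue", "stream": false}'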
OLLAMA as a Slurm job¶
Example sbatch script:¶
- start the ollama server with a custom model path
- download the 'mistral' LLM
- use a personal conda environment with langchain to run a Python script
#!/bin/bash
#SBATCH --job-name=ollama
#SBATCH --output=ollama_%j.log
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
#SBATCH --mem=20gb
#SBATCH --partition=gpu
#SBATCH --gpus=a100:1
#SBATCH --time=01:00:00
date;hostname;pwd
module load ollama
# add a personal conda environment with langchain to the PATH
env_path=/my/conda/env/bin
export PATH=$env_path:$PATH
# start the ollama server in the background with a custom model path
env OLLAMA_MODELS=/blue/group/$USER/ollama/models ollama serve &
# give the server a few seconds to start, then download the mistral model
sleep 5
ollama pull mistral
# run python script
python my_ollama_python_script.py >> my_ollama_output.txt
Example Python script:¶
from langchain_ollama import OllamaLLM
llm = OllamaLLM(model="mistral")
print(llm.invoke("why is the sky blue"))
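As an alternative to langchain, the official ollama Python client (the 'ollama' pip package) can talk to the same server. A minimal sketch, assuming that package is installed in the personal environment and the server started in the sbatch script is still running:
import ollama
# send a single chat message to the local ollama server
response = ollama.chat(model="mistral", messages=[{"role": "user", "content": "why is the sky blue"}])
print(response["message"]["content"])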
Categories¶
LLM