Project:LLMQueryService POC
From MaRDI portal
Revision as of 19:49, 10 July 2024
This page describes how to install the proof-of-concept LLM-based query service.
Using an OpenStack VM
- Create a new instance (for Debian, use version 12 or later)
- Install the necessary packages
- apt-get update
- apt-get install git python3-pip python3-venv
- Clone the repository and follow the rest of the manual ( https://git.zib.de/bzfconra/mardi_llm_bottest )
- Install Ollama ( https://ollama.com/download/linux )
  - Check the logs: journalctl -u ollama -f
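The steps above can be sketched as a single provisioning script for a fresh Debian 12 VM. The repository URL and the log command are taken from the list above; the Ollama install command is the official one-liner from the Linux download page linked above. This is a sketch, not a tested installer; run it as root (or prefix the commands with sudo), and note that the repository's own manual covers the remaining setup.

```shell
#!/bin/sh
# Provisioning sketch for the LLM query service POC on a fresh Debian >= 12 VM.
set -eu

# Base packages
apt-get update
apt-get install -y git python3-pip python3-venv

# Clone the POC repository; follow the rest of its manual from here
git clone https://git.zib.de/bzfconra/mardi_llm_bottest
cd mardi_llm_bottest

# Install Ollama using the official Linux install script
# (see https://ollama.com/download/linux)
curl -fsSL https://ollama.com/install.sh | sh

# Ollama runs as a systemd service; follow its logs to verify it started
journalctl -u ollama -f
```

If `journalctl -u ollama` shows no unit, check that the install script registered the systemd service, or start the server manually with `ollama serve`.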