Project:LLMQueryService POC

Revision as of 20:49, 10 July 2024

This page describes how to install the proof-of-concept LLM-based query service.

Using an OpenStack VM

  1. Create a new instance (if you use Debian: version 12 or later)
  2. Install necessary libraries
    1. apt-get update
    2. apt-get install git python3-pip python3-venv
  3. Clone the repository and follow the rest of the manual ( https://git.zib.de/bzfconra/mardi_llm_bottest )
  4. Install Ollama ( https://ollama.com/download/linux )
    1. Check the logs: journalctl -u ollama -f
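
The steps above can be collected into a single shell session. This is a sketch, assuming a fresh Debian 12 instance and root access (otherwise prefix the commands with sudo); the repository URL and the Ollama install script come from the links above:

```shell
# Step 2: install the necessary libraries
apt-get update
apt-get install -y git python3-pip python3-venv

# Step 3: clone the repository, then follow the rest of its manual
git clone https://git.zib.de/bzfconra/mardi_llm_bottest
cd mardi_llm_bottest

# Step 4: install Ollama via its official Linux install script
curl -fsSL https://ollama.com/install.sh | sh

# Step 4.1: follow the Ollama service logs (Ctrl-C to stop)
journalctl -u ollama -f
```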