- all my data is gone after the semester break: every(!) user's persistent storage is recycled at each semester break. Please back up your data locally if needed.

- package conflicts: a common cause is installing an unpinned library version; please pin exact versions or upgrade all dependencies manually.

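When debugging such conflicts, it helps to check which version of a package actually ended up installed. A minimal Python sketch (the helper name `check_pin` is ours, not part of any tool mentioned here):

```python
from importlib.metadata import version, PackageNotFoundError

def check_pin(package: str, required: str) -> bool:
    """Return True only if `package` is installed at exactly `required`."""
    try:
        return version(package) == required
    except PackageNotFoundError:
        # A package that is not installed can never satisfy a pin.
        return False

print(check_pin("surely-not-installed-pkg", "1.0.0"))  # False
```

Run this inside the same environment (e.g. a notebook cell) where the conflict occurs, since each environment has its own set of installed packages.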
## Large-Language-Models

An advanced PyTorch development environment comes preinstalled with [Ollama](https://ollama.com/), which makes it easy to download and run different LLMs.

If you want to run Ollama together with the [OpenWebUI](https://openwebui.com/), the following commands need to be executed in separate terminal windows:

![terminal](res/sandbox-terminal.png "Terminal")
1. execute `ollama serve` to start the Ollama backend.

2. execute `ollama run mistral:7b` in a second terminal to download and run a specific model.

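Besides the interactive terminal, a running `ollama serve` also exposes a REST API on its default local port 11434, which is handy from notebooks. A minimal sketch of calling the `/api/generate` endpoint (the helper functions are ours):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str) -> bytes:
    # Payload shape for Ollama's /api/generate endpoint;
    # stream=False asks for one complete JSON response instead of chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    req = request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# ask("mistral:7b", "Why is the sky blue?")  # requires a running `ollama serve`
```

The call is commented out because it only works while the backend from step 1 is running and the model from step 2 has been pulled.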
The following steps are only needed if you want to access the WebUI. They also showcase how other HTTP services can temporarily be tunneled and exposed to the public.

![tunnel](res/sandbox-tunnel.png "Tunnel")
3. execute `open-webui serve` in another terminal to serve the WebUI locally on port 8080.

4. visit [tun.iuk.hdm-stuttgart.de](https://tun.iuk.hdm-stuttgart.de) in your browser to obtain a token.

5. execute `pgrok init --remote-addr tun.iuk.hdm-stuttgart.de:80 --forward-addr https://{user}.tun.iuk.hdm-stuttgart.de --token {token}` in the terminal. Replace `{user}` and `{token}` with your username and the previously obtained token.

6. execute `pgrok http 8080` to run the tunnel and expose the WebUI. You can now access it at `https://{user}.tun.iuk.hdm-stuttgart.de`.

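The public address from step 6 always follows the same pattern, derived only from your username. A tiny sketch for building it programmatically (the function name and the username `jdoe` are placeholders of ours):

```python
def public_url(user: str) -> str:
    # URL pattern produced by the pgrok tunnel in the steps above.
    return f"https://{user}.tun.iuk.hdm-stuttgart.de"

print(public_url("jdoe"))  # https://jdoe.tun.iuk.hdm-stuttgart.de
```

This can be useful when scripting health checks against your exposed service while the tunnel is up.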
### Notes
Please use the tunnel only temporarily and carefully. Only http(s) tunnels are supported. Tunneled services are publicly available and accessible by anyone! If you want to train or finetune any LLMs, please use the [Training Environment](sandbox/training.md) instead.

## Useful Links

- [Jupyter Documentation](https://docs.jupyter.org/en/latest/)
- [pip](https://pip.pypa.io/en/stable/user_guide/)
- [python](https://docs.python.org/3.11/)
- [Ollama](https://ollama.com/)