From Newsgroup: alt.2600
Personal opinion: the video gives a good idea of why not to use
proprietary/closed LLM models, but the author is not a hacker/programmer,
so he uses Python. I have done this in Rust, and you can also do it
in Go, C, etc...
---
Using Ollama to run local LLMs, in this video I show you how to code a
local AI web search assistant. Having an AI that can use the web to
respond with up-to-date context is not something that requires ChatGPT.
Whether you have no programming experience or write code in your sleep,
this video will give you everything you need to know to build version
1.0 of your local Ollama AI web search agent. The program we build in
this video will be 100% in the Python programming language.
-
https://inv.nadeko.net/watch?v=9KKnNh89AGU
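For anyone who wants the general shape of the thing before watching: here
is a minimal Python sketch of the same idea. It is not the code from the
video; it assumes Ollama is running locally with the ollama Python package
installed, uses the duckduckgo_search library for the web lookup, and the
model name "llama3.2" is just an example of something you have pulled.

# Minimal local web-search assistant sketch (assumptions noted above,
# not the exact program built in the video).
import ollama
from duckduckgo_search import DDGS

MODEL = "llama3.2"  # any model you have pulled with `ollama pull`

def search_web(query: str, max_results: int = 5) -> str:
    """Fetch a few search results and flatten them into plain-text context."""
    results = DDGS().text(query, max_results=max_results)
    return "\n".join(f"{r['title']}: {r['body']} ({r['href']})" for r in results)

def answer(question: str) -> str:
    """Ask the local model, feeding it fresh web results as context."""
    context = search_web(question)
    prompt = (
        "Answer the question using the web results below.\n\n"
        f"Web results:\n{context}\n\nQuestion: {question}"
    )
    response = ollama.chat(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    print(answer("What is the latest stable Linux kernel release?"))

The same loop (search, stuff results into the prompt, ask the local model)
ports directly to Rust or Go by talking to Ollama's local HTTP API instead
of the Python package.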
Happy Hacking
ReK2
--
- {gemini,https}://{,rek2.}hispagatos.org - mastodon: @rek2@hispagatos.space
- [https|gemini]://2600.Madrid - https://hispagatos.space/@rek2
- https://keyoxide.org/A31C7CE19D9C58084EA42BA26C0B0D11E9303EC5
--- Synchronet 3.20a-Win32 NewsLink 1.114