
Farfalle vs Perplexica
Comparing two self-hosted AI search engines
Awesome food is a pleasure for the eyes too. But in this post we compare two AI-based search systems, Farfalle and Perplexica.
Running a copilot-style service locally? Easy!
That’s very exciting! Instead of calling copilot or perplexity.ai and telling the whole world what you are after, you can now host a similar service on your own PC or laptop!
Testing logical fallacy detection
Several new LLMs have been released recently. Exciting times. Let’s test how they perform at detecting logical fallacies.
Not so many to choose from, but still...
When I started experimenting with LLMs, the UIs for them were still in active development; by now some of them are really good.
Requires some experimenting, but...
Still, there are some common approaches to writing good prompts so the LLM doesn’t get confused trying to understand what you want from it.
8 llama3 (Meta+) and 5 phi3 (Microsoft) LLM versions
Testing how models with different numbers of parameters and quantization levels behave.
Ollama LLM model files take up a lot of space
After installing Ollama, it’s better to reconfigure it right away to store model files in a new location, so that when we pull a new model it doesn’t get downloaded to the old one.