dnsmichi.at

ollama

A collection of 2 posts

Raspberry Pi 5 with 52Pi NVMe hat, Samsung 990 EVO SSD - faster for Ollama and LLMs?

This blog post explains how to set up a Raspberry Pi 5 with an NVMe SSD hat for better disk performance. The goal is to benchmark it against an SD-card-based Pi and to test the performance impact on running LLMs with Ollama.
05 May 2024 12 min read
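
As a rough sketch of the kind of measurement the first post is after: timing responses from a local Ollama server over its HTTP API, assuming Ollama runs on the default port 11434. The model name and prompt below are placeholders, not the post's actual test setup; the cold first call includes loading the model weights from disk, which is where NVMe vs. SD card storage should matter most.

```python
import time
import requests

def time_generation(model: str, prompt: str) -> float:
    """Return wall-clock seconds for one non-streaming /api/generate call."""
    start = time.perf_counter()
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,
    )
    response.raise_for_status()
    return time.perf_counter() - start

MODEL = "llama2"  # placeholder; pick a model that fits the Pi's RAM

# First call loads the model from storage (SSD vs. SD card shows up here),
# the second call runs with the model already resident in memory.
print(f"cold call (loads model from disk): {time_generation(MODEL, 'Say hello.'):.1f}s")
print(f"warm call (model in memory):       {time_generation(MODEL, 'Say hello.'):.1f}s")
```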

Local Ollama running Mixtral LLM with LLamaIndex, loaded with personal tweet context

Ollama allows you to run large language models locally. I was excited to learn about Mixtral, an open model now available through LLamaIndex with Ollama, explained in this blog tutorial. The example of loading my own tweets to ask questions in this unique context is fascinating. After running Ollama, and using …
11 Jan 2024 5 min read
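
A minimal sketch of the LLamaIndex-plus-Ollama setup the second post describes, assuming a recent LlamaIndex package layout (llama-index-core, llama-index-llms-ollama, llama-index-embeddings-ollama); the module paths, the embedding model, and the tweets/ directory are assumptions, not the post's exact code.

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Point LlamaIndex at the local Ollama server (default port 11434).
Settings.llm = Ollama(model="mixtral", request_timeout=300.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# "tweets/" is a placeholder directory holding an exported tweet archive as text files.
documents = SimpleDirectoryReader("tweets").load_data()
index = VectorStoreIndex.from_documents(documents)

# Ask questions against the personal tweet context; Mixtral generates the answer.
query_engine = index.as_query_engine()
print(query_engine.query("Which topics do I tweet about most often?"))
```

With the tweet archive exported as plain-text files, the vector index retrieves matching tweets at query time and Mixtral answers in that personal context, all running locally through Ollama.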