dnsmichi.at

llm

A collection of 3 posts
Remote office upgrades 2024, part 2: Windows Gaming PC, KVM switch, 192 GB RAM for AI/LLMs
all-remote · Featured

2024 brought a new 49" monitor for efficient work, but also for gaming. Buying a gaming PC, connecting it to the existing environment with a KVM switch, and then reconsidering the Windows setup for containers, VMs, and AI/LLMs was a fun challenge.
06 Jan 2025 12 min read
Raspberry Pi 5 with 52Pi NVMe hat, Samsung 990 EVO SSD - faster for Ollama and LLMs?
raspberry-pi · Featured

This blog post explains setting up a Raspberry Pi 5 with an NVMe SSD hat for better disk performance. The goal is to compare benchmarks against the SD-card-based Pi and to measure the performance impact when running LLMs with Ollama (a minimal timing sketch follows below).
05 May 2024 12 min read
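
As a rough idea of how the SD card vs. NVMe comparison can be quantified for LLM workloads, here is a minimal Python sketch that times a single request against a local Ollama server via its HTTP API. This is not the exact benchmark from the post; the model name and prompt are illustrative.

```python
# Time one generation against a local Ollama server (default port 11434).
# Durations in the API response are reported in nanoseconds.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
    timeout=600,
)
data = resp.json()

print(f"model load : {data['load_duration'] / 1e9:.2f} s")   # dominated by disk speed
print(f"total time : {data['total_duration'] / 1e9:.2f} s")
print(f"eval speed : {data['eval_count'] / data['eval_duration'] * 1e9:.1f} tokens/s")
```

Running the same request on both setups should mostly show a difference in model load time, since token generation itself is bound by CPU and memory rather than disk.
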
Local Ollama running Mixtral LLM with LLamaIndex, loaded with personal tweet context
ollama

Ollama lets you run large language models locally. I was excited to learn about Mixtral, an open model, now available through LLamaIndex with Ollama, as explained in this blog tutorial. Loading my own tweets as context and asking questions against them is a fascinating example (a minimal sketch of the approach follows below).
11 Jan 2024 5 min read
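
For context, a minimal sketch of what such a setup can look like with a recent LlamaIndex release and a local Ollama server. This is not the exact code from the post; the directory, embedding model, and question are illustrative.

```python
# Build a small local index over exported tweets and query it with Mixtral
# served by a local Ollama instance. Requires llama-index >= 0.10 plus the
# llama-index-llms-ollama and llama-index-embeddings-huggingface packages.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

Settings.llm = Ollama(model="mixtral", request_timeout=300.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load the exported tweets (e.g. plain-text or JSON files) from a local folder.
documents = SimpleDirectoryReader("tweets/").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
print(query_engine.query("Which topics did I tweet about most in 2023?"))
```

Everything stays on the local machine: the embeddings index the tweets, and each question is answered by Mixtral running in Ollama with the retrieved tweets as context.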