dnsmichi.at

container

A collection of 2 posts
Local Ollama running Mixtral LLM with LlamaIndex, loaded with personal tweet context
ollama

Local Ollama running Mixtral LLM with LLamaIndex, loaded with personal tweet context

Ollama lets you run large language models locally. I was excited to learn about Mixtral, an open model now available through LlamaIndex with Ollama, as explained in this blog tutorial. Loading my own tweets and asking questions in this unique context is fascinating. After running Ollama, and using…
11 Jan 2024 5 min read
Upgrade to Ghost v4 with Docker Compose
ghost

When Ghost 4 [https://ghost.org/changelog/4/] was released earlier this year, I looked into upgrading from the initial v3-based docker-compose setup [https://dnsmichi.at/2020/02/09/new-blog/]. Unfortunately, the container-based setup is not officially supported, and there were no upgrade guides…
30 Nov 2021 3 min read
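The full upgrade steps are in the post itself; as context, a minimal docker-compose sketch for a Ghost v4 service might look like the following. Service names, passwords, and volume paths are illustrative assumptions, not taken from the post; the `database__*` environment variables follow the official Ghost Docker image conventions.

```yaml
# Hypothetical docker-compose.yml sketch for Ghost v4 with MySQL.
# All names and credentials here are placeholders -- adjust for your setup.
version: "3"
services:
  ghost:
    image: ghost:4-alpine
    restart: always
    ports:
      - "2368:2368"
    environment:
      url: https://dnsmichi.at
      database__client: mysql
      database__connection__host: db
      database__connection__user: ghost
      database__connection__password: example
      database__connection__database: ghost
    volumes:
      - ghost-content:/var/lib/ghost/content
    depends_on:
      - db
  db:
    image: mysql:8
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: ghost
      MYSQL_USER: ghost
      MYSQL_PASSWORD: example
    volumes:
      - db-data:/var/lib/mysql
volumes:
  ghost-content:
  db-data:
```

Named volumes keep the content and database outside the containers, so pulling a newer `ghost:4-alpine` image and recreating the stack preserves the blog data.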
dnsmichi.at © 2025
Powered by Ghost