dnsmichi.at

mixtral

A collection of 1 post

Local Ollama running Mixtral LLM with LlamaIndex, loaded with personal tweet context

Ollama allows you to run large language models locally. I was excited to learn about Mixtral, an open model that is now available through LlamaIndex with Ollama, as explained in this blog tutorial. The example of loading my own tweets and asking questions in this unique context is fascinating. After running Ollama, and using …
11 Jan 2024 · 5 min read
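
As a rough illustration of the setup the post teaser describes, here is a minimal sketch in Python. It assumes LlamaIndex 0.9.x, a local Ollama server with the Mixtral model already pulled (`ollama pull mixtral`), and a tweet archive exported as plain-text files; the `tweets` directory name and the sample question are hypothetical, not taken from the post.

```python
# Minimal sketch: query a local Mixtral model about a personal tweet archive.
# Assumes LlamaIndex 0.9.x; the "local" embed model additionally requires
# the sentence-transformers package so embeddings stay on the machine too.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import Ollama

# Point LlamaIndex at the local Ollama instance serving Mixtral.
llm = Ollama(model="mixtral", request_timeout=120.0)

# Use a local embedding model so no data leaves the machine.
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# Load the exported tweets (hypothetical ./tweets directory) and index them.
documents = SimpleDirectoryReader("tweets").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# Ask a question in the context of the indexed tweets.
query_engine = index.as_query_engine()
print(query_engine.query("Which topics do I tweet about most often?"))
```

Note that in LlamaIndex 0.10 and later the `ServiceContext` has been replaced by the global `Settings` object and the Ollama integration moved to `llama_index.llms.ollama`, so the imports differ on current releases.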