Ollama Integration Guide

Ollama allows you to run large language models locally on your own hardware. Integrating it with Mixpost powers AI Assist, a tool that generates and improves text, making your content creation process more efficient and effective.

1. Setting Up

To use Ollama, you need to have Ollama installed and running on your server. If you haven't set it up yet, visit the Ollama website for installation instructions.

  1. Navigate to Admin Console -> Services.
  2. Click on Ollama.
  3. Insert the API URL of your Ollama instance (e.g., http://localhost:11434).
  4. Make sure the Active toggle is turned on.
  5. Save the changes.
Screenshot: Ollama Service Form
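Before saving, you may want to confirm that the API URL points at a running Ollama instance. A minimal sketch in Python (assuming the default URL http://localhost:11434; Ollama's GET /api/tags endpoint lists the models pulled locally):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # assumption: default Ollama API URL


def parse_models(payload: dict) -> list[str]:
    # Extract model names from an /api/tags response body.
    return [m["name"] for m in payload.get("models", [])]


def list_models(base_url: str) -> list[str]:
    # GET /api/tags returns the models available on the Ollama instance;
    # a successful response confirms the URL you entered is reachable.
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_models(json.load(resp))
```

Calling `list_models(OLLAMA_URL)` should return a non-empty list (e.g., the models you have pulled with `ollama pull`); a connection error means the URL or the Ollama service needs attention.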

2. Configuring AI Assist

  1. Navigate to Admin Console -> Settings -> AI.
  2. In the AI settings area, select Ollama as your AI provider.
  3. Select the Model you want to use.
  4. Optionally, enter global system instructions that you want the AI assistant to follow.
Screenshot: AI Settings
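The model and system instructions you configure here shape every request sent to Ollama. As an illustration only (not Mixpost's actual implementation), a request to Ollama's POST /api/generate endpoint carries the model name, the prompt, and optional system instructions:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str, system: str = "") -> dict:
    # Mirrors the fields accepted by Ollama's POST /api/generate endpoint;
    # "system" plays the same role as the global system instructions above.
    body = {"model": model, "prompt": prompt, "stream": False}
    if system:
        body["system"] = system
    return body


def generate(base_url: str, model: str, prompt: str, system: str = "") -> str:
    # POST the request and return the generated text from the response.
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_generate_request(model, prompt, system)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

For example, `generate("http://localhost:11434", "llama3", "Draft a post about our launch", system="Keep replies under 280 characters")` would return the model's completion, with the system instructions applied to the generation.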

3. Done

With these steps complete, Ollama is set up and the AI Assist feature is ready to use on your Mixpost instance.