Maxim now supports over 6500 models through Ollama, enabling you to test prompts and workflows on your local machine. Key benefits:
- Local testing: Run evaluations locally through Ollama, keeping your data private — nothing is uploaded to the cloud.
- Easy setup: Enable the Ollama provider and pick your model under Settings ➡️ Models ➡️ Ollama.
- Open WebUI support: Run and interact with LLMs entirely offline or within your privately hosted environment.
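As a rough sketch of what local testing looks like under the hood (not Maxim-specific code): a locally running Ollama server exposes an HTTP API on `http://localhost:11434`, so prompts can be sent to a pulled model without any data leaving your machine. The model name `llama3` and the helper functions below are illustrative.

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a locally served Ollama model."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

To try it, start the server with `ollama serve`, pull a model (e.g. `ollama pull llama3`), then call `generate("llama3", "...")` — the request and response never leave localhost.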