Ollama Tool Support and Call Interception Using MITM Proxy

Enhance Your AI Models with Ollama Tool Support and API Call Interception Using MITM Proxy — A Step-by-Step Guide

Philippe Bogaerts
Aug 9, 2024

Introduction

Ollama (https://ollama.com) has recently added support for tool calling with popular models like Llama 3.1. This feature allows models to use external tools to answer prompts, enabling more complex tasks and interactions with the outside world. (https://ollama.com/blog/tool-support)

Setting up Ollama

Setting up and running Ollama is straightforward. Simply follow the instructions on their website. Once configured, you can interact with the Ollama server using the command-line interface (CLI).

To pull a model from the hub:

ollama pull llama3.1

This command downloads the model if it’s not already installed. Depending on the model’s size, this process may take some time. For a list of available models, visit https://ollama.com/models.
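Besides the CLI, Ollama exposes a REST API (by default on `localhost:11434`), so the same pull can be done programmatically. A minimal sketch, assuming a locally running Ollama server and its documented `/api/pull` route; the actual network call is left commented out so the snippet stands on its own:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def pull_request(name: str) -> urllib.request.Request:
    """Build the HTTP request that asks the local Ollama server to pull a model."""
    body = json.dumps({"name": name}).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = pull_request("llama3.1")
# urllib.request.urlopen(req) would stream pull-progress JSON lines;
# it is commented out here so the sketch does not require a live server.
```

Sending this request while `ollama serve` is running streams progress updates as JSON lines, equivalent to what the CLI shows.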

Note: Ensure that the model supports tool calling.

Once downloaded, you can interact with the model:

ollama run llama3.1
>>> What is AI?
Artificial Intelligence (AI)…
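The tool calling mentioned in the introduction goes through Ollama's `/api/chat` endpoint: the request carries a list of function schemas, and a tool-capable model may answer with tool calls instead of plain text. A minimal sketch of building such a request; `get_current_weather` is a hypothetical example tool, and a running local server is assumed, so the call itself is commented out:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

# Hypothetical example tool, described with an OpenAI-style function schema.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def chat_request(prompt: str) -> urllib.request.Request:
    """Build a /api/chat request that offers the model one tool."""
    body = json.dumps({
        "model": "llama3.1",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [weather_tool],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("What is the weather in Brussels?")
# With a live server, the response message may contain "tool_calls"
# naming get_current_weather with its arguments:
# reply = json.load(urllib.request.urlopen(req))
```

When the model decides a tool is needed, your code executes the named function, appends the result as a `tool` message, and sends a follow-up chat request so the model can produce the final answer.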
