
Self-Hosted Proxy Configuration

The Okareo Proxy is available as a hosted cloud service, or it can be self-managed, either locally or in your own managed cloud. The proxy itself is lightweight and natively uses OpenTelemetry to communicate with Okareo, which means you can use the self-hosted proxy with the Okareo cloud or with a self-hosted Okareo environment. See Okareo Proxy to read more about the proxy's capabilities and how to use it.

Self-Hosted Proxy via CLI

The self-hosted Okareo proxy is part of the Okareo CLI and is functionally equivalent to the cloud-hosted version.

Starting the Proxy

To start the self-hosted proxy, use the following command:

% okareo proxy [flags]
tip

The Okareo CLI is available for Linux, macOS, and Windows. See the Okareo CLI installation instructions to learn more.

Available Options

The proxy command in the Okareo CLI provides a number of options for configuring model access and the proxy environment:

Flag                 Description                         Example / Default
-c, --config string  Path to the config file             default: ./cmd/proxy_config.yaml
-d, --debug          Enable debug mode
-h, --help           Display help for the proxy command
-H, --host string    Host to run the proxy server on     default: 0.0.0.0
-m, --model string   Specify the default model to use    e.g. gpt-3.5-turbo, gemini/gemini-1.5-pro-002
-p, --port string    Port to run the proxy server on     default: 4000

Example Command

To start the proxy on port 4000 with a specific configuration file, you would run:

okareo proxy -c ./my_config.yaml -p 4000
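The remaining flags combine in the same way. For example, to also set a default model and enable debug logging (both documented in the table above), you could run:

okareo proxy -c ./my_config.yaml -p 4000 -m gpt-3.5-turbo -d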

Using the Proxy

When using the self-hosted proxy, ensure that your model's base_url points to the proxy endpoint. For example, if the proxy is reachable at my.hosted.service on port 4000, set it to http://my.hosted.service:4000.

from openai import OpenAI

# Point the OpenAI client at the self-hosted proxy instead of api.openai.com.
# The Okareo API key is passed as a default header; the client's api_key is
# your LLM provider key, which the proxy forwards to the provider.
openai = OpenAI(
    base_url="http://my.hosted.service:4000",
    default_headers={"api-key": "<OKAREO_API_KEY>"},
    api_key="<YOUR_LLM_PROVIDER_KEY>",
)
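Once the client is configured, requests flow through the proxy transparently. As a minimal sketch of a follow-up call (assuming the proxy forwards standard OpenAI-compatible chat completion requests, and that gpt-3.5-turbo is a model your provider key can access):

response = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the Okareo proxy!"}],
)
print(response.choices[0].message.content)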