Self-Hosted Proxy Configuration
The Okareo Proxy is available as a hosted cloud service, or it can be self-managed locally or in your own managed cloud. The proxy itself is lightweight and natively uses OpenTelemetry to communicate with Okareo, which means you can use the self-hosted proxy with either the Okareo cloud or a self-hosted Okareo environment. See Okareo Proxy to read more about the proxy's capabilities and how to use it.
Self-Hosted Proxy via CLI
The self-hosted Okareo proxy is part of the Okareo CLI and is functionally equivalent to the cloud-hosted version.
Starting the Proxy
To start the self-hosted proxy, use the following command:
% okareo proxy [flags]
The Okareo CLI is available for Linux, macOS, and Windows. See the Okareo CLI installation instructions to learn more.
Available Options
The proxy command in the Okareo CLI provides a number of options for configuring model access and the proxy environment.
| Flag | Description | Example |
| --- | --- | --- |
| `-c, --config string` | Path to the config file | default: `./cmd/proxy_config.yaml` |
| `-d, --debug` | Enable debug mode | |
| `-h, --help` | Display help for the proxy command | |
| `-H, --host string` | Host to run the proxy server on | default: `0.0.0.0` |
| `-m, --model string` | Specify the default model to use | `gpt-3.5-turbo`, `gemini/gemini-1.5-pro-002` |
| `-p, --port string` | Port to run the proxy server on | default: `4000` |
Example Command
To start the proxy on port 4000 with a specific configuration file, you would run:
okareo proxy -c ./my_config.yaml -p 4000
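Flags can also be combined. As a sketch (the port and model values here are placeholders drawn from the table above), the following enables debug mode, binds to all interfaces, and sets a default model:
okareo proxy -d -H 0.0.0.0 -p 8080 -m gpt-3.5-turbo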
Using the Proxy
When using the self-hosted proxy, ensure that your model's base_url points to the proxy endpoint. For example, if the proxy is reachable at my.hosted.service on port 4000, set it to http://my.hosted.service:4000.
- Python
from openai import OpenAI

# Point the client at the proxy endpoint; the Okareo API key goes in the
# api-key header, while your LLM provider key is passed as the client api_key.
openai = OpenAI(
    base_url="http://my.hosted.service:4000",
    default_headers={"api-key": "<OKAREO_API_KEY>"},
    api_key="<YOUR_LLM_PROVIDER_KEY>",
)
- TypeScript
import OpenAI from "openai";

// Point the client at the proxy endpoint; the Okareo API key goes in the
// api-key header, while your LLM provider key is passed as the client apiKey.
const openai = new OpenAI({
  baseURL: "http://my.hosted.service:4000",
  defaultHeaders: { "api-key": "<OKAREO_API_KEY>" },
  apiKey: "<YOUR_LLM_PROVIDER_KEY>",
});
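Once the client is configured, requests go through the proxy like any other OpenAI-compatible call. The following is a minimal sketch (the model name and prompt are placeholders, and the endpoint and keys mirror the Python setup above):

from openai import OpenAI

# Client configured against the self-hosted proxy (assumed endpoint and keys)
openai = OpenAI(
    base_url="http://my.hosted.service:4000",
    default_headers={"api-key": "<OKAREO_API_KEY>"},
    api_key="<YOUR_LLM_PROVIDER_KEY>",
)

# Any model the proxy can reach works here; gpt-3.5-turbo is just an example value
response = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello through the Okareo proxy."}],
)
print(response.choices[0].message.content)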