
Models

Models in Okareo are named endpoints with associated information such as a prompt, name, and provider. Bespoke models, such as an intent classifier for a RAG pipeline, can also be registered in Okareo as a model name and a custom endpoint.

Given the popularity of LLMs, models in Okareo are often the combination of model name, provider, and prompt. If you are not experimenting with different foundational models and are focused solely on prompt development and tuning, then the Okareo model facility becomes largely a prompt versioning and management facility.

Each model version that Okareo maintains can be used to drive simulations and experiments. This gives you a way to compare specific behaviors using synthetically or statically defined scenarios to improve your prompt or make a better model selection.

Model Examples

Okareo works with a broad range of models through a dedicated provider class or through a generic model class. This includes generation, classification, embedding, and more.

Generation

Okareo supports all the major foundational model providers. To learn more about a specific provider, refer to the Python and TypeScript SDKs.

Here is an example using OpenAI in Python.

from okareo import Okareo
from okareo.model_under_test import OpenAIModel

okareo = Okareo(api_key="your-api-key")

model = okareo.register_model(
    name="OpenAI Generation Model",
    model=OpenAIModel(
        model_id="gpt-3.5-turbo",
        temperature=0,
        system_prompt_template="You are an editor for AXIOS. Rephrase the requested text into a headline",
        user_prompt_template="Here is the text to rephrase: {scenario_input}",
    ),
)

Embedding

Okareo can work with any embedding model through our custom model class. We additionally have unique built-in support for the specialized embedding models from Cohere.

To find a specific Cohere model ID, see the list of embedding models in Cohere's documentation. For example, the model ID could be embed-english-v3.0.

from okareo import Okareo
from okareo.model_under_test import CohereModel

okareo = Okareo(api_key="your-api-key")

okareo.register_model(
    name="Cohere embedding Model",
    model=[
        CohereModel(
            model_type="embed",

            # Add your model id here
            model_id="your-model-here"
        ),

        # Placeholder: your vector database model
        # (e.g. a QdrantDB or CustomModel instance)
        YourVectorDB()
    ]
)

Vector Databases

Okareo can work with any vector database through our custom model class. We additionally have built-in support for Qdrant and Pinecone.

We have built in support to connect to a hosted Qdrant instance.

from okareo import Okareo
from okareo.model_under_test import QdrantDB

okareo = Okareo(api_key="your-api-key")

okareo.register_model(
    name="Your retrieval model",
    model=[
        # Placeholder: your embedding model
        # (e.g. a CohereModel or CustomModel instance)
        YourEmbeddingModel(),
        QdrantDB(
            # Your Qdrant instance url
            url="...qdrant.io:port",

            # Name of the collection within your Qdrant instance
            collection_name="your collection name",

            # How many top results should be returned from the vector search
            top_k=10,
        )
    ]
)
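The top_k parameter bounds how many nearest neighbors the vector search returns. As a rough illustration of what that means (a standalone sketch using cosine similarity, not Okareo or Qdrant internals):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector magnitudes
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k_search(query, vectors, top_k=10):
    # Score every stored vector against the query, keep the top_k best
    scored = sorted(vectors.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

docs = {"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [0.0, 1.0]}
print(top_k_search([1.0, 0.0], docs, top_k=2))  # ['a', 'b']
```

A hosted vector database does the same ranking over an indexed collection; top_k=10 simply caps the result list at the ten closest matches.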

Classification Models

Okareo can work with any classification model through our custom model class. We additionally have built-in support for any of OpenAI's GPT models and Cohere's classification models.

To use an OpenAI model, find your model ID in OpenAI's model documentation. To understand the temperature parameter, see OpenAI's documentation on model temperature.

from okareo import Okareo
from okareo.model_under_test import OpenAIModel

okareo = Okareo(api_key="your-api-key")

okareo.register_model(
    name="OpenAI Classifier Model",
    model=OpenAIModel(
        # add the model id from OpenAI's docs linked above
        model_id="your-model-here",

        # add the temperature you want for the model
        temperature=0,

        # This prompt template is sent as a system message to the language model
        system_prompt_template="Classify whether or not the email is spam. Output True for spam and False for not spam ...",

        # User prompt is sent as the user request to the language model.
        # To use the scenario input in the user prompt,
        # the user prompt should have "{scenario_input}" as part of the string
        user_prompt_template="The user email: {scenario_input}",
    ),
)
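The {scenario_input} placeholder behaves like a standard Python format-string field: each scenario input is substituted into the template before the prompt is sent. A minimal sketch of that substitution (illustrative only, not Okareo internals):

```python
user_prompt_template = "The user email: {scenario_input}"

# Each scenario row supplies a scenario_input that fills the placeholder
prompt = user_prompt_template.format(scenario_input="Click here to download now!")
print(prompt)  # The user email: Click here to download now!
```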

Custom Models

Okareo can work with any model through the CustomModel class. To use this, create a class that inherits from the abstract CustomModel class and implements the invoke method. The example below shows a classification implementation; the same pattern applies to generation, embedding, and retrieval.

from okareo import Okareo
from okareo.model_under_test import CustomModel

okareo = Okareo(api_key="your-api-key")

class CustomClassificationModel(CustomModel):
    # Define the invoke method to be called on each input of a scenario
    def invoke(self, input: str) -> tuple:

        # call your model being tested using <input> from the scenario set
        if "download now!" in input:
            result = "spam"
        else:
            result = "not spam"

        # return a tuple of (model result, overall model response context)
        return result, {"response": "Classification successful"}

model_under_test = okareo.register_model(
    name="intent_classifier",
    model=CustomClassificationModel(name="Custom Classification model")
)
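The class above covers classification; an embedding-style invoke follows the same (result, context) contract, returning a vector instead of a label. A standalone sketch of such an invoke body (using a toy letter-count "embedding" in place of a real model call):

```python
def invoke(input: str) -> tuple:
    # Toy deterministic "embedding": letter-frequency vector over a-z.
    # A real custom model would call an embedding endpoint here instead.
    vector = [input.lower().count(chr(c)) for c in range(ord("a"), ord("z") + 1)]

    # Same contract as the classifier: (model result, response context)
    return vector, {"dimensions": len(vector)}

embedding, context = invoke("Hello")
print(context)  # {'dimensions': 26}
```

In practice this method would live on a CustomModel subclass, just like the classification example, and be registered with okareo.register_model.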