Models

Register the model that you want to evaluate, test, or collect datapoints from. Model names must be unique within a project namespace.

warning

The first time a model is defined, the attributes of the model are persisted. Subsequent calls to register_model will return the persisted model. They will not update the definition.

For custom models, each subsequent call to register_model needs a freshly initialized instance of your CustomModel class.

Custom Models

Okareo can work with any model through the CustomModel class. To use this, create a class that inherits the abstract CustomModel class and implements the invoke method. See the various ways it can be implemented for classification, generation, embedding, and retrieval below.

from okareo import Okareo
from okareo.model_under_test import CustomModel

okareo = Okareo("your-api-key")

class CustomClassificationModel(CustomModel):
    # Define the invoke method to be called on each input of a scenario
    def invoke(self, input: str) -> tuple:
        # call your model being tested using <input> from the scenario set
        if "download now!" in input:
            result = "spam"
        else:
            result = "not spam"

        # return a tuple of (model result, overall model response context)
        return result, {"response": "Classification successful"}

model_under_test = okareo.register_model(
    name="intent_classifier",
    model=CustomClassificationModel(name="Custom Classification model"),
)

Classification Models

Okareo can work with any classification model through our custom model class. We additionally have built-in support for any of OpenAI's GPT models and Cohere's classification models.

To use an OpenAI model, you will need a model ID from OpenAI's model documentation. The temperature parameter controls how random the model's output is; lower values make responses more deterministic.

from okareo import Okareo
from okareo.model_under_test import OpenAIModel

okareo.register_model(
    name="OpenAI Classifier Model",
    model=OpenAIModel(
        # add the model id from OpenAI's model documentation
        model_id="your-model-here",

        # add the temperature you want for the model
        temperature=0,

        # This prompt template is sent as a system message to the language model
        system_prompt_template="Classify whether or not the email is spam. Output True for spam and False for not spam ...",

        # User prompt is sent as the user request to the language model.
        # To use the scenario input in the user prompt,
        # the user prompt should have "{scenario_input}" as part of the string
        user_prompt_template="The user email: {scenario_input}",
    ),
)
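Conceptually, the "{scenario_input}" placeholder in the user prompt template is filled with each scenario's input before the request goes to the language model, much like Python string formatting (the email text below is just an illustrative value):

```python
# The template from the registration call above
user_prompt_template = "The user email: {scenario_input}"

# A hypothetical scenario input
scenario_input = "Click here to download now!"

# Substitute the scenario input into the template
user_prompt = user_prompt_template.format(scenario_input=scenario_input)

print(user_prompt)  # -> The user email: Click here to download now!
```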

Generation Models

Okareo can work with any generative model through our custom model class. We additionally have built-in support for any of OpenAI's GPT models.

To use an OpenAI model, you will need a model ID from OpenAI's model documentation. The temperature parameter controls how random the model's output is; lower values make responses more deterministic.

from okareo import Okareo
from okareo.model_under_test import OpenAIModel

okareo.register_model(
    name="OpenAI Generation Model",
    model=OpenAIModel(
        # add the model id from OpenAI's model documentation
        model_id="your-model-here",

        # add the temperature you want for the model
        temperature=0,

        # This prompt template is sent as a system message to the language model
        system_prompt_template="Rephrase this piece of text ...",

        # User prompt is sent as the user request to the language model.
        # To use the scenario input in the user prompt,
        # the user prompt should have "{scenario_input}" as part of the string
        user_prompt_template="Here is the text to rephrase: {scenario_input}",
    ),
)

Embedding Models

Okareo can work with any embedding model through our custom model class. We additionally have built-in support for Cohere embedding models.

The model ID comes from Cohere's list of embedding models; for example, embed-english-v3.0.

from okareo import Okareo
from okareo.model_under_test import CohereModel

okareo.register_model(
    name="Cohere embedding Model",
    # For retrieval, register the embedding model together with a vector database
    model=[
        CohereModel(
            model_type="embed",

            # Add your model id here
            model_id="your-model-here",
        ),
        # Placeholder for your vector database (e.g., the QdrantDB shown below)
        YourVectorDB(),
    ],
)

Vector Databases

Okareo can work with any vector database through our custom model class. We additionally have built-in support for Qdrant and Pinecone.

The example below connects to a hosted Qdrant instance.

from okareo import Okareo
from okareo.model_under_test import QdrantDB

okareo.register_model(
    name="Your retrieval model",
    model=[
        # Placeholder for the embedding model registered alongside the vector DB
        YourEmbeddingModel(),
        QdrantDB(
            # Your Qdrant instance url
            url="...qdrant.io:port",

            # Name of the collection within your Qdrant instance
            collection_name="your collection name",

            # How many top results should be returned from the vector search
            top_k=10,
        ),
    ],
)
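The top_k setting caps how many nearest neighbors the vector search returns for each scenario input. Conceptually (the document IDs and similarity scores below are hypothetical):

```python
# Hypothetical (document id, similarity score) pairs from a vector search
results = [("doc-3", 0.91), ("doc-1", 0.42), ("doc-7", 0.85), ("doc-9", 0.10)]

def take_top_k(scored, k):
    """Keep only the k highest-scoring hits, best first."""
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

print(take_top_k(results, 2))  # -> [('doc-3', 0.91), ('doc-7', 0.85)]
```

A larger top_k gives evaluation metrics more candidates to judge, at the cost of returning less relevant matches further down the list.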