set_num_ctx defines the context length, in tokens, that the model should support

Usage

set_num_ctx(workflow_obj, num_ctx)

Arguments

workflow_obj

A workflow object containing all the parameters that describe the workflow

num_ctx

A numerical value defining the length of the context to be used, in tokens.

Value

A workflow object with the num_ctx parameter added

Details

Depending on the server settings, the context length handled by the model may be shorter than you expect. For example, Ollama seems to default to a context size of 1024 tokens even if the model actually supports more. To make full use of the model's capabilities, specify the context length you expect it to support with num_ctx.
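For context, Ollama exposes this setting as the num_ctx option in its REST API request body; a request raising the window to 4096 tokens looks roughly like the sketch below. Whether set_num_ctx maps directly onto this option is an assumption about the package internals.

```json
{
  "model": "llama3:8b-instruct-q5_0",
  "prompt": "Summarise this document ...",
  "options": {
    "num_ctx": 4096
  }
}
```

Note that requesting a larger context increases memory use on the server, so very large values may fail or fall back on machines with limited RAM/VRAM.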

Examples

my_workflow <- ai_workflow() |> 
set_model(model_name="llama3:8b-instruct-q5_0") |> 
set_num_ctx(num_ctx=2048)