request_json_answer requests that the current workflow answer using a JSON format.

Usage

request_json_answer(workflow_obj, json_object_format = list())

Arguments

workflow_obj

The current workflow object that you want to build on.

json_object_format

The JSON format required for the answer.

Details

This function requests that the LLM answer using a JSON format. By default it uses a simple JSON format with a single answer field. You can supply a different JSON format via the json_object_format argument.
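
As a sketch of a custom format, assuming json_object_format accepts a named list mapping field names to expected types (the exact structure it accepts is not documented on this page), a workflow might be configured like this:

```r
# Hypothetical format: field names mapped to type hints.
# The structure accepted by json_object_format is an assumption here.
my_format <- list(
  answer = "string",
  confidence = "number"
)

myflow <- ai_workflow() |>
  set_connector("ollama") |>
  set_model(model_name = "llama3.1:8b-instruct-q5_K_M") |>
  request_json_answer(json_object_format = my_format)
```

With such a format, the LLM would be asked to return a JSON object containing both an answer string and a numeric confidence value, rather than the default single-field object.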

Examples

myflow_template <- ai_workflow() |>
  set_connector("ollama") |>
  set_model(model_name = "llama3.1:8b-instruct-q5_K_M") |>
  set_n_predict(1000) |>
  set_temperature(0.8) |>
  request_json_answer()
#> → Default IP address has been set to 127.0.0.1.
#> → Default port has been set to 11434.