parse_json_result attempts to parse the JSON result from the LLM

Usage

parse_json_result(json_string)

Arguments

json_string

the JSON string you want to parse

Details

This function assumes that the result from the LLM is provided in JSON format. If the format is correct, it parses the result as an R object. You would typically expect such a format if you used the request_json_answer() function. This function is typically used in a pipe, after pull_final_answer().
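At its core, this step converts a JSON string into an R object. A minimal sketch of that conversion, using the jsonlite package (an assumption for illustration; parse_json_result() may use a different parser internally):

```r
library(jsonlite)

# A JSON answer of the kind an LLM might return after request_json_answer()
json_string <- '{"answer": "blue"}'

# Convert the JSON string into an R list
result <- fromJSON(json_string)

result$answer
#> [1] "blue"
```

If the string is not valid JSON, fromJSON() throws an error, so a robust pipeline would wrap the call in tryCatch().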

Examples

myflow_template <- ai_workflow() |>
  set_connector("ollama") |>
  set_model(model_name = "llama3.1:8b-instruct-q5_K_M") |>
  set_n_predict(1000) |>
  set_temperature(0.8) |>
  request_json_answer()
#> → Default IP address has been set to 127.0.0.1.
#> → Default port has been set to 11434.
  
myflow_template |>
  process_prompts("what is the usual color of the sky on Earth?") |>
  pull_final_answer() |>
  parse_json_result()
#> → Frequency Penalty was not specified and given a default value of 1.
#> → Presence Penalty was not specified and given a default value of 1.5.
#> → Repeat Penalty was not specified and given a default value of 1.2.
#> → Mode was not specified and 'chat' was selected by default.
#> → System Prompt was not specified and given a default value of 'You are a helpful AI assistant.'.
#> → Chat mode
#> $answer
#> [1] "blue"
#>