CohereLlmInferenceResponse

class oci.generative_ai_inference.models.CohereLlmInferenceResponse(**kwargs)

Bases: oci.generative_ai_inference.models.llm_inference_response.LlmInferenceResponse

The generated text result to return.

Attributes

RUNTIME_TYPE_COHERE str(object='') -> str
RUNTIME_TYPE_LLAMA str(object='') -> str
generated_texts [Required] Gets the generated_texts of this CohereLlmInferenceResponse.
prompt Gets the prompt of this CohereLlmInferenceResponse.
runtime_type [Required] Gets the runtime_type of this LlmInferenceResponse.
time_created [Required] Gets the time_created of this CohereLlmInferenceResponse.

Methods

__init__(**kwargs) Initializes a new CohereLlmInferenceResponse object with values from keyword arguments.
get_subtype(object_dictionary) Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
RUNTIME_TYPE_COHERE = 'COHERE'

RUNTIME_TYPE_LLAMA = 'LLAMA'

__init__(**kwargs)

Initializes a new CohereLlmInferenceResponse object with values from keyword arguments. The default value of the runtime_type attribute of this class is COHERE and it should not be changed. The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
  • runtime_type (str) – The value to assign to the runtime_type property of this CohereLlmInferenceResponse. Allowed values for this property are: "COHERE", "LLAMA"
  • generated_texts (list[oci.generative_ai_inference.models.GeneratedText]) – The value to assign to the generated_texts property of this CohereLlmInferenceResponse.
  • time_created (datetime) – The value to assign to the time_created property of this CohereLlmInferenceResponse.
  • prompt (str) – The value to assign to the prompt property of this CohereLlmInferenceResponse.
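
For illustration, a minimal sketch of constructing this model directly from keyword arguments; in practice the SDK builds this object for you when deserializing a generate_text response, and the sample field values below are hypothetical:

    from datetime import datetime, timezone

    from oci.generative_ai_inference.models import (
        CohereLlmInferenceResponse,
        GeneratedText,
    )

    # Hypothetical generated-text entry; the id and text values are made up.
    text = GeneratedText(id="example-id", text="Hello from the model.")

    response = CohereLlmInferenceResponse(
        generated_texts=[text],
        time_created=datetime.now(timezone.utc),
        prompt="Say hello.",
    )

    print(response.runtime_type)  # defaults to "COHERE" for this subtype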
generated_texts

[Required] Gets the generated_texts of this CohereLlmInferenceResponse. Each prompt in the input array has an array of GeneratedText, whose length is controlled by the numGenerations parameter in the request.

Returns: The generated_texts of this CohereLlmInferenceResponse.
Return type: list[oci.generative_ai_inference.models.GeneratedText]
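
Assuming a response already returned by the service (for example via the inference_response attribute on a generate_text result, an assumed access path), iterating the generated texts looks like this:

    # `response` is a CohereLlmInferenceResponse obtained from the service,
    # e.g. generate_text_response.data.inference_response (assumed access path).
    for generated in response.generated_texts:
        # Each entry is an oci.generative_ai_inference.models.GeneratedText.
        print(generated.text)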
static get_subtype(object_dictionary)

Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
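
A hedged sketch of the discriminator lookup; the runtimeType dictionary key and the string return value are assumptions based on how the SDK's polymorphic models are generated:

    from oci.generative_ai_inference.models import CohereLlmInferenceResponse

    # A service payload carries the discriminator as "runtimeType" (assumed key).
    payload = {"runtimeType": "COHERE"}

    # The SDK's generated models return the subtype's class name as a string;
    # here the expected result would be "CohereLlmInferenceResponse".
    print(CohereLlmInferenceResponse.get_subtype(payload))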

prompt

Gets the prompt of this CohereLlmInferenceResponse. Represents the original prompt. Applies only to non-stream responses.

Returns: The prompt of this CohereLlmInferenceResponse.
Return type: str
runtime_type

[Required] Gets the runtime_type of this LlmInferenceResponse. The runtime of the provided model.

Allowed values for this property are: "COHERE", "LLAMA", 'UNKNOWN_ENUM_VALUE'. Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

Returns: The runtime_type of this LlmInferenceResponse.
Return type: str
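
A short sketch of branching on the runtime type, including the sentinel the SDK substitutes for values it does not recognize:

    def describe_runtime(response):
        # `response` is an LlmInferenceResponse (or subtype) returned by the SDK.
        if response.runtime_type == "COHERE":
            return "Cohere runtime response"
        if response.runtime_type == "LLAMA":
            return "Llama runtime response"
        # Unrecognized service values are mapped to this sentinel by the SDK.
        return "Unknown runtime (UNKNOWN_ENUM_VALUE)"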
time_created

[Required] Gets the time_created of this CohereLlmInferenceResponse. The date and time that the model was created in an RFC3339 formatted datetime string.

Returns: The time_created of this CohereLlmInferenceResponse.
Return type: datetime
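
For completeness, a minimal sketch of reading time_created; the SDK exposes it as a Python datetime, so isoformat() on a timezone-aware value yields an RFC3339-style string (the response variable is assumed to be a CohereLlmInferenceResponse obtained from the service):

    # `response` is a CohereLlmInferenceResponse obtained from the service.
    created = response.time_created  # a datetime object
    print(created.isoformat())       # e.g. "2024-01-01T00:00:00+00:00"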