LLM Integration

Oracle Digital Assistant's integration of large language models (LLMs) enables you to enhance your skills with generative AI capabilities.

These capabilities include:

  • Handling small talk with a user.
  • Generating written summaries of data.
  • Automating challenging or repetitive business tasks, such as those required for talent acquisition.
  • Providing sentiment analysis of a given piece of text to determine whether it reflects a positive, negative, or neutral opinion.

Using the Invoke Large Language Model component (the LLM component), you can plug these capabilities into your dialog flow wherever they're needed. This dialog flow component is the primary integration piece for generative AI: it contacts the LLM through a REST call, sends the LLM a prompt (the natural-language instructions to the LLM) along with related parameters, returns the results generated by the model (also known as completions), and manages the state of the LLM-user interactions so that the model's responses remain in context over successive rounds of user queries and feedback. The LLM component can call any LLM. You can add one or more LLM component states (or LLM blocks) to your flows, and you can chain LLM calls so that the output of one LLM request is passed to a subsequent LLM request.
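The chaining described above can be sketched in plain JavaScript. In this hypothetical example, `invokeLLM` stands in for the REST call that the LLM component makes to the model, and the completion from the first call is interpolated into the prompt of the second, mirroring how one LLM block's result can feed a later LLM block's prompt. All names here are illustrative, not part of the Digital Assistant API.

```javascript
// Stand-in for the REST call the LLM component makes; a real call would
// POST the prompt and parameters to the provider's endpoint.
async function invokeLLM(prompt) {
  return `completion for: ${prompt}`;
}

// Chain two calls: the first completion becomes context for the second prompt.
async function chainedCalls(userText) {
  const summary = await invokeLLM(`Summarize: ${userText}`);
  const sentiment = await invokeLLM(`Classify the sentiment of: ${summary}`);
  return { summary, sentiment };
}
```

In a real flow, you would store the first block's completion in a flow-scoped variable and reference that variable in the second block's prompt.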

Besides the LLM component, the other major pieces of the LLM integration are the endpoints for the LLM service provider and the transformation handlers that convert the request and response payloads to and from Digital Assistant's format, the Common LLM Interface (CLMI). Here are the high-level steps for adding these and other components to create the LLM integration for your skill:

  • Register an API service in your Digital Assistant instance for the LLM's REST endpoint.
  • For your skill, create an LLM Transformation Event Handler to convert the LLM request and response payloads to and from CLMI.
    Note:

    We provide prebuilt handlers if you're integrating your skill with the Cohere model or with Oracle Generative AI Service. If you're accessing other models, such as Azure OpenAI, you can update the starter transformation code that we provide.
  • Define an LLM service for your skill that pairs the REST service you registered to the instance with an LLM Transformation Handler.
  • In the flow where you want to use the LLM, insert and configure an LLM component by adding your prompt text and setting other parameters.

    Tip:

    As a best practice, we recommend that you use the Prompt Builder (accessed through the LLM component) to perfect your prompt.

The following pages guide you through the concepts behind LLM integration and the steps for integrating an LLM service into your skill so that you can create LLM blocks in your dialog flow.